Can we trust the Bible?


  • Question from JB, Australia

    I took part in an intriguing discussion when someone asked what was meant by the concept of the “inerrancy of the Scriptures”. I was the first to offer an answer, but later ended up modifying my view. I had previously thought it reasonable to subscribe to a belief in the inerrancy of Scripture. Even though my view of the Bible has not essentially changed, it was a big step for me to decide that I can no longer accept the term “inerrant” as a description of the Bible.

    The reason it was a big step is that so many people believe in biblical inerrancy – and defend the principle as necessary to honour God – that anyone who disagrees is likely to be seen as a liberal heretic. And yet I reject the term only on factual grounds: “inerrant” is an inaccurate way to describe the Bible. I hope others will share my view that such a stance does nothing to dishonour the Word of God.

    Would you like to share any thoughts on that?

    The term ‘inerrancy of Scripture’ is a surprisingly late concept that seems to have developed in the nineteenth century. Given the standard evangelical position that the Bible is the ultimate source of authority for everything from moral conduct through to church structure, it is only natural that a doctrine like this should become fully articulated.

    The problem originates with the belief in the ‘divine inspiration of Scripture’. What exactly is meant by that phrase has been a cause for argument down through the centuries, with extremes ranging from the liberal ‘this is what the community of faith produced’ (i.e. entirely human authorship with everything that implies), through to the ultra-evangelical concept that reduces the human element in writing Scripture to that of just holding the pen. Most theologians would occupy a mid-point ‘in tension’ between the two extremes.

    What the earliest theologians wanted to assert was that Scripture was without deception (a radically different concept to the idea that it is without error, yet covering much of the same ground). So, for example, if God is described as ‘good’, then God must be good. The truth of Scripture lies in the ‘broad strokes’. This was all very well in the era of Neo-Classical Philosophy, but when the Enlightenment dawned and Scripture began to be subjected to scientific criticism, serious questions began to be asked.

    It was in the realm of historical criticism that the most obvious attacks were made. A famous example is the ‘census’ in Luke chapter 2: Caesar Augustus and Quirinius were real historical people, but no records of this census survive. The obvious question was ‘did it happen?’ As doubts began to form about the historical accuracy of some of the Biblical material (much of which has no external authentication available), similar doubts began to surface about the larger themes – resurrection, salvation, even the existence of Christ.

    The backlash against criticism resulted in the doctrine of Scriptural Inerrancy – the statement that all of Scripture was true because it was the Word of God. This circular piece of reasoning was gradually dropped as evidence of inaccuracies mounted up, and it was replaced by one of the most redundant doctrines in the history of Christian theology: the idea that the Bible was without error in the original texts, which have since been lost and so are no help to us anyway. Why anyone would bother to subscribe to a doctrine with no practical use remains a mystery, yet as you say it has become a key part of many evangelical ‘statements of faith’.

    One of the key themes in recent Biblical study has been the attempt to understand the context of the original audience for the material. Gordon Fee (writing with Douglas Stuart) makes a distinction between ‘exegesis’ – determining what the Scripture meant to the people who originally heard it – and ‘hermeneutics’ – applying Scripture to a ‘here-and-now’ situation (How to Read the Bible for All Its Worth, Zondervan, 3rd edition 2003, chapter 1). Fee writes: “In speaking through real persons, in a variety of circumstances, over a 1,500 year period, God’s Word was expressed in the vocabulary and thought patterns of those persons and conditioned by the culture of those times and circumstances” [Fee, op cit p 23]. The challenge at our point in history is to apply the broad strokes of those specific expressions of divine revelation.

    As part of the search for the causes of Scripture (what prompted these people to write these things?), there is a growing appreciation that the Bible is both the testimony to God’s revelation to certain people and a potential revelatory vehicle to every person. The Holy Spirit is both the inspirer of Scripture and the interpreter of Scripture, bearing in mind the caveat employed by Fee: “A text cannot mean what it never meant… when it was first spoken” [op cit p 30, although Fee does allow an exception in the case of prophecy].

    The Bible was produced within the believing community, and that community later acknowledged the authority it contained by establishing a set canon. At no point in the canon-forming process was absolute inerrancy demanded of the books included. The only questions asked concerned the inspiration of the books and their use within the community – with their use often being the ‘clincher’ in any argument.

    The key question to be faced is ‘Does the Bible have to be correct in every detail to be true?’ An alternative way of asking that is ‘Do minor (and they generally are minor) inaccuracies really cast any doubt on the central theme of the Bible, namely God’s desire to see human beings saved?’ It is the opinion of this theologian that the answer is ‘No’ on both counts.

    However, given the polemical nature of many who subscribe to the doctrine of Scriptural Inerrancy, the doctrine is highly likely to continue to feature in the many Protestant creeds that exist.

    Thanks for contributing to freelance theology, JB.
