The Center for the Study of New Testament Manuscripts (CSNTM) is proud to announce the completion of our digitization project at the National Library of Greece (NLG)! Beginning in 2015 and continuing into 2016, we have spent months working at the National Library digitizing their entire collection of Greek New Testament manuscripts. This collection is one of the largest in the world and has a multitude of priceless treasures, which are now digitally preserved for generations to come… Read More at CSNTM
I have written a new blog post over at CSNTM regarding our most recent trip to Athens and a rediscovery of a lectionary at the National Library of Greece. Make sure to check it out!
Editor’s note: Robert D. Marcello, CSNTM’s Research Manager, has written a blog for my site. I approve this message.
Daniel B. Wallace
By: Robert D. Marcello
I recently came across Greg Gilbert’s article, “Debunking Silly Statements About the Bible: An Exercise in Biblical Transmission,” at the Gospel Coalition, which is an excerpt from his book Why Trust the Bible? (Crossway). Since I work in the field of textual criticism every day, I am keenly interested in how people present this oftentimes difficult material to a lay audience. I began reading his article and found myself agreeing with his points, even nodding along with his claims. There is much that is helpful in this book, and I’m always encouraged when Christians are thinking seriously about the text of the New Testament. However, some minor mistakes turned into major ones as I kept reading. Below is a list of some of the more glaring errors. This brief treatment is meant to highlight some of the significant issues that continue to surface in apologetic works.
- In an obvious attempt to write on a popular level, Gilbert refers to the “pieces of paper that Luke, John, or Paul used to write Luke, John, or Romans.” Tischendorf even does this when speaking about Sinaiticus being thrown in the fire in his When Were Our Gospels Written? An Argument, with a Narrative of the Discovery of the Sinaitic Manuscript. But Tischendorf was writing a popular work, in a language not his own, before the nature of the material of the manuscripts was widely known. None of this is true of the present book. As I am sure Gilbert is well aware, paper manuscripts were not used until very late in the transmission of the text. The “originals” were most likely written on papyrus; parchment did not become popular until the fourth century, and paper was not used regularly until the fourteenth and fifteenth centuries. In fact, the author uses this mistaken illustration eight different times. When writing an apologetic piece, as opposed to a casual narrative, I find it helpful to make sure it is both readable and accurate even in the finest details. One never wants to give a questioning or doubting reader any reason not to trust a book or article.
- Moving on from that minor issue, Gilbert’s sub-section “Mind the Gap” is where the major errors begin. First, Gilbert attempts to mitigate the “gap” by pointing out that Vaticanus was used for a long period of time, showing that the current manuscripts do not have a multi-generational distance from the original. These claims alone are debatable; even so, this is a false analogy. Vaticanus is a codex on parchment—and very fine parchment at that—a much more durable substance than papyrus and not at all representative of the average manuscript. Papyrus, the material of the earliest manuscripts, is fragile and deteriorates easily. Assuming these manuscripts were used frequently in early churches, viewed by the congregants and clergy, and copied to spread to other churches and cities, they were probably handled so often that they could not have lasted very long. Thus, it is likely they had to be copied rather quickly, and those copies would suffer from the same use and deterioration as the originals. Yes, we have many copies, and the gap is important but not detrimental. However, to assert confidently that it is “well within the realm of possibility that we have in our museums today copies of the originals, full stop [emphasis his],” is misleading. Less than 1% of all manuscripts could even be considered within a few generations of the original, and most of the very early manuscripts are largely fragmentary. As such, his claim, even if remotely possible, is highly improbable and misleading.
- When Gilbert discusses the number of variants, he states, “One scholar has asserted there are, astonishingly, up to 400,000 variants in the New Testament! There are several things to say about this charge. First, the manuscripts are not in fact riddled with variants, and that 400,000 number isn’t nearly as scary as it seems, even if it’s accurate.” Multiple scholars have used that number, not just one, even though he seems to be setting up this unnamed “scholar” as a foil or antagonist. In fact, even conservatives hold to this same number or an even higher one. Recently, Peter Gurry, a PhD student at Cambridge and a popular contributor to the Evangelical Textual Criticism Blog, presented a paper at the Evangelical Theological Society, later published in New Testament Studies, arguing that 500,000 is probably a better estimate—and that number does not include spelling differences! Furthermore, Gilbert goes on to claim that the number is composed not only of Greek NT manuscripts but also of versional support (other languages) and quotations. This is simply not true; the most recent and well-researched variant estimates include only Greek manuscripts. It is true that most variants are inconsequential, and that the reason we have so many differences is that we have so many manuscripts to work with. As this blog’s author is noted for saying, we have an embarrassment of riches. Nevertheless, it does no good to minimize the number—to the point of falsely claiming, as Gilbert did, that there are only 16 variants per manuscript.
- Gilbert also gets his number of manuscripts wrong by stating that we have only 5,400 manuscripts, which is low by almost any standard. The current total of known extant MSS is 5,839 (according to INTF). Even allowing for lost, stolen, misnumbered, and destroyed manuscripts, the total would be around 5,600 at minimum.
- The most egregious error is this: Gilbert perpetuates a common mistake in apologetic circles. He states, “Second, keep in mind that ‘400,000 variants’ here doesn’t mean 400,000 unique readings.” In fact, that is exactly what it means. He illustrates his point by claiming, “What it means is that if one manuscript says, ‘I am innocent of this man’s blood’ and ten others say, ‘I am innocent of this righteous blood,’ then you get to count all eleven as ‘variants.’” In reality, that would count as merely one variant. This is a common mistake, beginning with the 1963 publication of Neil Lightfoot’s How We Got the Bible and perpetuated by other apologists. For a lengthy and very helpful discussion of the history of this mistake, see the blog post by Daniel B. Wallace, “The Number of Textual Variants: An Evangelical Miscalculation.”
There is ample evidence to support the claim that the text of the New Testament is both reliable and stable. At the same time, we don’t need to appeal to bad counting or incorrect numbers to muster that evidence. These “silly statements” need to end if we are ever going to provide solid evidence for the reliability of the text.
On February 25–27, 2016, Houston Baptist University will be hosting a conference with the clever title, “Ad fontes, ad futura: Erasmus’ Bible and the Impact of Scripture.” This is HBU’s annual theology conference. The theme is related to the quincentennial of the publication of Desiderius Erasmus’s Novum Instrumentum Omne, which made its appearance on March 1, 1516. The timing of this conference couldn’t be better.
Herman Selderhuis, Craig A. Evans, Timothy George, and I will be delivering keynote addresses. Robert D. Marcello and Stratton Ladewig will be representing the Center for the Study of New Testament Manuscripts (csntm.org) at the conference, each giving a lecture. Rob’s paper is entitled “Significant Contributions to the Text of the New Testament and Early Church from the National Library of Greece,” while Stratton’s is “New Images Bring Greater Clarity: Examples of Improved Textual Identity in CSNTM’s 𝔓45 images.” John Soden and Greg Barnhill, two former students of mine, will also be giving lectures. Dan Pfeiffer, a current PhD student at Dallas Seminary, will be giving a lecture based on his work in Advanced New Testament Textual Criticism, a course he took from me last semester. Others delivering papers include Stanley Helton, Jeff Cate, Jeffrey Riddle, and David Ritsema. It looks like it will be a most stimulating conference! See the webpage on this event here.
Just a quick note that I will be speaking at Purdue University on Friday, Feb 5, at 7 pm on the topic, “How Badly Has the Bible Been Corrupted?” Here’s the link: http://www.symposiachristi.com
If you’re in the West Lafayette area this weekend, you might want to come to the conference. I’m speaking Friday night, then giving two lectures on Saturday as well, followed by a message at Covenant Church.