Biblical Studies and Theology have generally been at the forefront of Digital Humanities, vying with departments of English Literature and Modern History to lead digital research in new directions. The ETCBC is a prime example of this, with its flagship project of encoding the Hebrew Bible, together with all sorts of annotations, in a database, enabling researchers to let computers perform deep searches on its syntactic and semantic structure. Being at the forefront can have an odd effect, however, in that it can instill a desire to keep innovating to the technology’s maximum potential. This means that as computer technology progresses (and it has progressed at an absurd pace), the distance between ‘regular’ Humanities research and ‘top-notch digital’ Humanities research has become huge.
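
To give a concrete sense of what such a structural search looks like, here is a minimal sketch using Text-Fabric, the Python package through which the ETCBC’s BHSA dataset is distributed. The feature names (sp for part of speech, vs for verbal stem) follow the BHSA documentation; the exact dataset identifier may differ across Text-Fabric versions.

```python
# A minimal sketch of a structural query on the ETCBC's BHSA dataset,
# via the Text-Fabric package (pip install text-fabric).
# Assumed feature names per the BHSA docs: sp = part of speech,
# vs = verbal stem; the dataset identifier may vary by version.
from tf.app import use

# Download (on first run) and load the BHSA text plus its annotations.
A = use("ETCBC/bhsa")

# Find every clause containing a verb in the piel stem: a query that
# combines syntactic structure (clause embedding) with morphological
# annotation (verbal stem).
results = A.search("""
clause
  word sp=verb vs=piel
""")

print(len(results), "matching clause/word pairs")
A.show(results, end=3)  # render the first three results in context
```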

Manuscripts, the actual evidence underlying any premodern historical research, have played an odd role in this. As soon as ultra-high-resolution photography, multispectral imaging, and carbon dating (to name the most important technologies) were available, they were applied in Biblical Studies, notably on the Dead Sea Scrolls. Further developments have been applied up until today, as witnessed by the research of Mladen Popovic’s team. However, there has not been that much innovation in these areas of engineering. Meanwhile, there was remarkable growth in the ways computers could interact with plain text, and so text analysis became a favorite topic among those scholars who wanted to keep innovating in their digital methods. With the focus on plain text, manuscripts became ipso facto a lesser priority.

Other reasons factor into this as well, for which I would refer you to my forthcoming handbook on using digitized manuscripts (leave your e-mail address here to be notified). I wrote this handbook because I see a tidal wave coming that will once again make manuscripts a focal point of digital research. This time, however, it will not widen the gap between ‘Digital Humanists’ and others; it will actually close it, and in fact it already is. The digitization of manuscripts has scratched an itch of many scholars and students who simply want to access the manuscripts themselves. Perhaps their main motivation is that they have grown up in a digital world in which everything is on demand: if they cannot get a PDF of a book, article, or indeed manuscript, they are hardly motivated to walk over to the library to see and read it for themselves. This engagement, in turn, has spurred libraries and museums to digitize even more of their holdings.

Meanwhile, it is already time to figure out how these two trends (advanced tools for textual analysis and the digitization of manuscripts) can be brought together. This is no simple task, as the trends move in opposite directions: the more advanced text analysis technology has become, the more it has focused on a small corpus or even just one text, while the more advanced digitization has become, the larger the available corpus has grown. Yet while the challenge presented here is formidable, its payoff is equally great, and we would do well not to despair too soon. If we can bring the advanced text analysis already developed to bear on the large corpus that digitization provides, entirely new avenues of research will open. It is mostly for this reason that I have found it very important to engage in conversations and collaborations with people from the ETCBC, to work together to come up with the next big thing.

Dr. Cornelis van Lit works on medieval Islamic philosophy and the application of computer technology to manuscripts. He is editor-in-chief of The Digital Orientalist.