This short reflection was extracted from the Apple iBook, Digital Arts & Humanities: Scholarly Reflections, freely available at: http://itunes.apple.com/us/book/digital-arts-humanities-scholarly/id529097990?ls=1
Offering a short reflection on a topic that you are studying in great depth is a challenging task. Without a tangible research question, and the promise of an answer to that question – both of which act as something of a compass – you find yourself in what can only be described as a wilderness. Emerging from that wilderness isn’t the purpose of such a process, perhaps, but rather, the purpose is to document what went on during your travels.
The original draft of this essay sought to reflect on textual analysis, and from there, address its (oddly) estranged electronic cousin. This particular approach was abandoned when it became clear that the intended task was sufficiently difficult on its own without establishing an equally complex foundation. Instead, let’s just take a readymade definition, one that is widely accepted and has been offered by a respected scholar, as our pathway into this wilderness. There are many such definitions of course, but in this instance, I’m going to select one that I feel is sufficiently forthright in its construction: ‘When we perform textual analysis on a text, we make an educated guess at some of the most likely interpretations that might be made of that text’ (McKee 1). This is precisely what electronic textual analysis offers us: the potential to make an ‘educated guess’ as to what the ‘most likely interpretation’ of a text might be – via electronic means.
Socrates was once dismissed as nothing more than an immoral corrupter of youth, a description that might not be so readily applied today. Ideas that are perceived as radical can cause cultural upheavals that will draw both positive and negative reactions. The printing press was one such upheaval, as was the computer. In relation to the current topic, the latter is perhaps more interesting than the former, as it would appear that we have failed to, as McGann so aptly puts it in his 2004 essay on the state of the digital humanities, ‘emulate the humanists of the fifteenth century who were confronted with a similar upheaval of their materials, means, and modes of knowledge production’ (411). The dominance of print – culturally, socially, economically and academically – is evidence enough of how effectively our predecessors responded to previous cultural shifts in relation to how the text is perceived. Having reached the next juncture in the ‘long lineage of first contact narratives in media history’ (Liu), we again find radical ideas being received with mixed emotions. In spite of how his teachings were first received, Socrates is now held in high esteem, and in his place, as the corrupter of youth – or so it would seem – stands the digital humanist.
The processes used in textual interpretation have not remained as static as some would suggest. First there was the printing press, and now there is the electronic edition. This is not the history of textual studies. Both the printing press and the electronic edition are modes of production – production, and analysis of the cultural artefact produced, are very different matters. Textual analysis is the application of a methodology to a text, extracting from that text, as already noted, a meaning that can be sufficiently justified. Approaches to interpretation are numerous and, more importantly, remarkably protean. Where one scholar might apply a feminist reading to Frankenstein, another might dismiss such an approach. We are faced with such a multitude of ‘-isms’ that the task of literary interpretation has become a methodological minefield. But traversing this minefield remains fruitful, and so when an additional approach does present itself, it is strange to see some step back in revulsion. Fish wrote in his (Web-based) column with The New York Times that the approach of the digital humanist is the ‘reverse’ of that practised by the traditional critic: ‘first you run the numbers, and then you see if they prompt an interpretive hypothesis,’ he claims. He goes on to state that this ‘method, if it can be called that, is dictated by the capability of the tool’, and that once a pattern does emerge from the chosen tool, you do not know how to ‘proceed’ because you never knew ‘what you’re looking for’. I disagree with this argument on two points. Firstly, if we all knew what we were looking for before we sat down to interpret a text, then so many findings would not take years of close reading to emerge. Oftentimes, the particulars and deeper meanings of a literary text do not emerge as and when expected.
Secondly, to say that electronic textual analysis relies on the extraction of data first, and the application of some ad-hoc interpretation second, is incorrect. There are countless examples of electronic textual scholarship where critics have approached a corpus with an idea of what it is that they were seeking to identify, and have simply used this latest approach as a method for confirming their beliefs.
Electronic textual analysis, like any critical method, has its drawbacks, but one of these is not the fact that it is a method (I will shed further light on this point as we progress). ‘Digital humanities is only a method’ is a common argument, and one that I reject. Digital humanities is an emerging discipline, not simply a methodology within existing academic fields (an argument that is beyond the scope of this particular discourse, I’m afraid), but it does contain, as any discipline does, a series of methods that are utilised throughout the research process. Dismissing electronic textual analysis as simply a methodology, therefore, would be akin to criticising someone for using any strand of critical theory. Still, however, we have not yet reflected upon electronic textual analysis, but rather examined its surprisingly controversial position within the academic world. Putting this aside, the question remains: why conduct textual analysis electronically? And there you have it, the question that will point the way out of the wilderness.
Electronic textual analysis does not provide interpretation; it provides trends upon which the ‘most likely’ interpretations may be justified. I was fortunate enough to recently attend a workshop on R scripting with Stanford’s Dr Matt Jockers, from which an example of the benefits of such analysis can be drawn. Dr Jockers presented to us an electronic edition of Moby-Dick, and proceeded to inform us that his aim was to prove to us that there was a distinct pattern in the chronology and frequency of appearances made by both the whale and Captain Ahab throughout the narrative. It is beyond the scope of this reflection to delve into that pattern – nor is it mine to delve into – but rather, what is important is that to confirm this pattern, we had two choices. We could each have taken up our writing materials and scanned through the text line by line, keeping some record of both characters’ mention, after which we could have correlated our findings in some inexact way to present what would appear as an interpretation based on a highly subjective reading. Instead, we used R scripting to construct a frequency diagram from which Dr Jockers’ suggestion could be confirmed (and further expanded upon in the event of any other unforeseen trends). This is the potential of electronic textual analysis. It is not an approach designed to replace traditional literary criticism, but rather, supplement it.
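Dr Jockers’ demonstration used R, and the particulars of his script are his own; but the general shape of such an analysis can be sketched in a few lines. The sketch below – in Python rather than R, and run on an invented stand-in passage rather than the actual novel – splits a plain-text edition on its chapter headings and tallies case-insensitive mentions of each term per chapter, producing the raw series from which a frequency diagram would be drawn. The function name and the `CHAPTER`-heading pattern are illustrative assumptions, not a reconstruction of Jockers’ method.

```python
import re
from collections import Counter

def chapter_frequencies(text, terms):
    """Split a plain-text edition on 'CHAPTER n' headings and count
    case-insensitive occurrences of each term within every chapter."""
    # Everything before the first heading is front matter; discard it.
    chapters = re.split(r"\bCHAPTER\s+\d+", text)[1:]
    freqs = []
    for chapter in chapters:
        words = Counter(re.findall(r"[a-z']+", chapter.lower()))
        freqs.append({term: words[term.lower()] for term in terms})
    return freqs

# A tiny invented stand-in corpus, used only to illustrate the output's shape.
sample = (
    "CHAPTER 1 Call me Ishmael. The whale waited. Ahab brooded. The whale dived. "
    "CHAPTER 2 Ahab paced the deck. Ahab spoke of the whale."
)
print(chapter_frequencies(sample, ["whale", "Ahab"]))
# → [{'whale': 2, 'Ahab': 1}, {'whale': 1, 'Ahab': 2}]
```

Applied to a full plain-text edition of Moby-Dick, the resulting per-chapter series for the whale and Ahab could then be plotted to reveal – or fail to reveal – the kind of chronological pattern described above.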
The trouble with electronic textual analysis – like all interpretive practices, it is not without its flaws – is that it requires specialist expertise. In addition, it requires reliable sources from which literary and textual critics can extract data – data that can be used to form meaning and both shape and justify interpretations. Many digital humanists believe that digital humanities will one day be ‘just humanities’, but this will never come to pass unless both groups of scholars and practitioners agree to give something up (which they shouldn’t). One cannot expect all humanists to understand logic and programming, and by the same token, digital humanists should not be expected to halt their exploration of technology’s new avenues in an effort to re-think how we approach and answer age-old questions. The disciplines will remain separate because the people and processes of discovery will remain separate. That is not to say that the disciplines are unrelated, but they are not, nor will they ever be, the same. There will be a few who possess considerable expertise in both fields, but these will be the exception. Herein lies the first issue with electronic textual analysis: generally, those who are interested in the study of literature are not familiar with the construction of scripts suited to textual analysis. There will always be ‘out-of-the-box’ solutions, but these are limited in that they cannot be adapted to meet a specific purpose without a familiarity with the language in which they have been developed. The flip side is that those who are familiar with programming languages are oftentimes too analytically minded for interpretative assessment, or rather, are more concerned with more objective pursuits. Mastering one discipline is difficult enough – mastering two is, for many, unachievable, if not undesirable.
The first issue can be overcome through collaboration. However, the second issue – the provision of reliable sources – is perhaps more pressing in terms of literary analysis, and it is a weakness that is, for some reason, seen as something of a strength. Many scholars who have dipped their toes in the field of electronic textual analysis will tell you that it is liberating, in the sense that it frees you from many of the typical restrictions presented in any traditional scholarly pursuit. Accessibility, they say, is one such liberating factor – studying the great texts is no longer reserved for those with access to the libraries in which they are housed, for digitisation and internetworking have made the study of texts geographically independent. The reality is anything but: it was in fact easier for me to acquire a physical copy of Ulysses – the first-edition facsimile offered by Martino Publishing – than it was to prepare a digital edition of the text suited to electronic textual analysis. Accessibility, it would seem, has ironically remained with the print edition, and will remain so until some group with appropriate funding and expertise decides to provide a scholarly (perhaps TEI-compliant) equivalent of Project Gutenberg. It is unlikely that any such project will emerge.
Fish, Stanley. “Mind Your P’s and B’s: The Digital Humanities and Interpretation.” Opinionator. The New York Times. Web. 9 May 2012.
Liu, Alan. “Imagining the New Media Encounter.” A Companion to Digital Literary Studies. Ed. Ray Siemens and Susan Schreibman. Oxford: Blackwell, 2008. Web.
McGann, Jerome. “A Note on the Current State of Humanities Scholarship.” Critical Inquiry 30.2 (2004): 409-413. Print.
McKee, Alan. Textual Analysis: A Beginner’s Guide. London: SAGE Publications, 2003. Print.