Digital scholarship and cultural ideology


Daniel Allington, Sarah Brouillette and David Golumbia, "Neoliberal Tools (and Archives): A Political History of Digital Humanities", Los Angeles Review of Books 5/1/2016:

Advocates position Digital Humanities as a corrective to the “traditional” and outmoded approaches to literary study that supposedly plague English departments. Like much of the rhetoric surrounding Silicon Valley today, this discourse sees technological innovation as an end in itself and equates the development of disruptive business models with political progress. Yet despite the aggressive promotion of Digital Humanities as a radical insurgency, its institutional success has for the most part involved the displacement of politically progressive humanities scholarship and activism in favor of the manufacture of digital tools and archives. Advocates characterize the development of such tools as revolutionary and claim that other literary scholars fail to see their political import due to fear or ignorance of technology. But the unparalleled level of material support that Digital Humanities has received suggests that its most significant contribution to academic politics may lie in its (perhaps unintentional) facilitation of the neoliberal takeover of the university.

Allington et al. give a plausible account of the history of computational text analysis in the humanities. Their narrative is oriented towards literary studies, without much discussion of fields like history, archeology and musicology; and there's room to argue about their choice of people and works to feature. But from my perspective outside the field, they have cause and effect reversed. Digital Humanities is not a top-down neo-liberal conspiracy aimed at a corporatist restructuring of literary studies. Rather, it's the natural and inevitable response of students and younger scholars to the opportunities afforded by new technologies, entirely comparable to the consequences of the invention of printing.

They write:

Neoliberal policies and institutions value academic work that produces findings immediately usable by industry and that produces graduates trained for the current requirements of the commercial workplace. In pursuit of these goals, the 21st-century university has restructured itself on the model of the corporate world, paying consultants lavish fees, employing miserably paid casual laborers, and constructing a vast new apparatus of bureaucratic control. The humanities are, in their traditional form, less amenable to such restructuring than other disciplines, relying on painstaking individual scholarship and producing forms of knowledge with less immediate economic application. By providing a model for humanities teaching and research that appears to overcome these perceived limitations, Digital Humanities has played a leading role in the corporatist restructuring of the humanities.

And they add:

What Digital Humanities is not about, despite its explicit claims, is the use of digital or quantitative methodologies to answer research questions in the humanities. It is, instead, about the promotion of project-based learning and lab-based research over reading and writing, the rebranding of insecure campus employment as an empowering “alt-ac” career choice, and the redefinition of technical expertise as a form (indeed, the superior form) of humanist knowledge. 

From a historical point of view, at least, this is simply false. People began using computers in humanities research pretty much as soon as computers existed, and they did this because they wanted to get their work done more easily. See Michael Preston's "A Brief History of Computer Concordances" for one set of traces:

Computer-assisted study of folklore and literature was initiated shortly after World War II by Roberto Busa, S.J., who began preparing a concordance to the works of Thomas Aquinas in 1948, and Bertrand Bronson, who made use of the technology to study the traditional tunes of the Child ballads. In the pre-computer era, Bronson had worked with punched cards which he manipulated with a mechanical sorter and a card-printer. Many early efforts at producing concordances by computer were modeled on the punched card/sorter/reader-printer process, itself an attempt at mechanizing the writing of slips by hand which were then manually sorted.
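
For readers who have never handled one, a concordance of this kind is essentially a keyword-in-context index, and the slip-writing and sorting that Preston describes maps fairly directly onto a few lines of modern code. Here's a minimal sketch in Python (my illustration, not anything from Preston or Busa):

```python
import re
from collections import defaultdict

def concordance(text, width=30):
    """Build a simple keyword-in-context (KWIC) index: for every word,
    record each occurrence along with `width` characters of context on
    either side -- the digital descendant of hand-written slips."""
    index = defaultdict(list)
    for m in re.finditer(r"[A-Za-z']+", text):
        left = text[max(0, m.start() - width):m.start()]
        right = text[m.end():m.end() + width]
        index[m.group().lower()].append(f"{left:>{width}}[{m.group()}]{right}")
    return index

sample = ("In the pre-computer era, Bronson had worked with punched cards "
          "which he manipulated with a mechanical sorter and a card-printer.")
for line in concordance(sample)["cards"]:
    print(line)
```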

Or see the page "Computer Music (So Far)" at UCSC for a description of the early years of computer-generated music, from 1957 on, or the Wikipedia page on Computational Musicology, for similar stories. A 1991 article "Computers in Archeology at the University Museum" notes that

Computers have been used in archaeology for about 30 years [(myl) i.e. since the early 1960s]. For the first 20 of those they were used mainly for traditional database management and statistical analysis. In the 1980s, with the advent of microcomputers, their application expanded. Today computers are used in every phase of archaeological projects to perform three major functions: data acquisition, analysis, and presentation.

None of this was in any way about a top-down "promotion of project-based learning and lab-based research over reading and writing, the rebranding of insecure campus employment […], and the redefinition of technical expertise as a form […] of humanist knowledge."

On the contrary,  it's clear that the successive waves of DH work over the past 70 years emerged bottom-up, as scholars took advantage of advances in technology to facilitate research they wanted to do anyhow. These technological steps have included the broad availability of batch processing in the 1950s; the development of minicomputers in the 1960s; early time-sharing in the 1970s; the availability of personal computers in the 1980s; the democratization of the internet in the 1990s; and so on.

There's nothing particularly humanistic about all this — researchers in nearly all fields have adopted successive waves of digital technology to let them do old things more efficiently, more accurately, faster and on a larger scale, and to do new things that were previously not feasible in practical terms.

My own first steps in computation came in 1966, when I had a part-time job that required using an old-fashioned sound spectrograph to measure durations in speech recordings. I hated the process — it was slow and the high-voltage spark used to burn the images into paper generated ozone that hurt to breathe. So when an early minicomputer showed up down the hall, with hardware for A-to-D conversion and a digitally-controlled oscillograph display, I jumped at the opportunity to learn to write an interactive program to let me make similar measurements a little bit more efficiently, and without the ozone. Nobody put pressure on me to do this — in fact my access to the machine was tolerated but not really encouraged, and generally took place between midnight and 8 a.m.

So what's all the fuss about?

One difference is that hostility to computational methods has always been more widespread among literary scholars than in other disciplines. (Though in general, even the most hostile critics don't generally insist on sticking with typewriters and snail mail…)  For an early ironic expression of this cultural divide, see David Lodge's exploration of stylometry in Small World.

Perhaps as a result, computer-based research in the humanities was professionally ghettoized from the beginning, especially in literary studies. The Association for Computers in the Humanities (ACH) was founded in 1978;  the journal Literary & Linguistic Computing began publication in 1986; the Text Encoding Initiative "was established in 1987 to develop, maintain, and promulgate hardware- and software-independent methods for encoding humanities data in electronic form". In contrast, we don't see organizations like the *Association for Computers in Physics, or the *Geology Encoding Initiative. (It's true that there's a cultural divide between biologists who work entirely in silico and those who work in vitro, much less in vivo, but my impression is that the inter-group hostility is relatively low, and I don't anticipate seeing an article about how Digital Biology is a conspiracy by neoliberal corporatist overlords.)

Whatever the history, the question of whether computational methods are at all relevant in literary studies, beyond the use of tools like word processing and networked communication, apparently remains controversial. And Allington et al. assert that

Digital Humanities is pushed far more strongly by university administrators than it is by scholars and students, who increasingly find themselves pressured to redirect their work toward Digital Humanities.

This might be true in recent years, though I'd like to see some evidence.

But my experience has long been exactly the opposite. Hundreds of times a year, I get inquiries from students — high school students, undergrads, graduate and professional students, from institutions all around the world — looking for advice and help in using digital methods to explore and analyze problems in various areas, many of them traditionally part of the humanities. Similar requests come from postdocs, faculty, and other older people.

I'm a computational linguist and phonetician at an Ivy League university. But I see similar testimony from people in very different circumstances. Thus Roopika Risam, "Digital Humanities in Other Contexts", 5/3/2016, writes

[O]ur undergraduate and graduate students are our only rationale for doing digital humanities. This is true for many of us who work outside the world of elite private or flagship state universities and small liberal arts colleges. As part of a regional comprehensive public university that grants master’s degrees and does not have a history of courting foundation support, we aren’t well-positioned for multi-million dollar grants to develop our digital humanities programs.  

We couldn’t be farther from the cartoonish fantasy of digital humanities that circulates in the clickbait du jour. Neither are most of our colleagues in higher education in the United States or around the world.

Again, there's nothing surprising here. The latest waves of technological development — bigger faster cheaper processing and storage, faster ubiquitous networking, growing archives of relevant text, audio, and video, and improved software — have increased both the opportunity and the demand for the application of digital methods in all areas of rational investigation, text analysis among them.

So the most important push towards "digital humanities" is bottom up from students and younger scholars, not top down from administrators and funders. This creates pressure for academic programs to offer relevant courses and models for research, and it puts some established  faculty in the position of commanding the tide to retreat.

It's certainly true that digital humanities students appreciate the job opportunities afforded by programming skills. And it's also true that the "digital humanities" label has become one of the shiny objects that university administrators and foundation executives are always looking for. But in my opinion, the forces behind increased use of digital techniques in the humanities are natural, inevitable, and generally benign. And to the extent that bad things are being done to academic departments in the humanities, the blame belongs mostly to developments internal to non-digital literary studies.

[h/t Bill Benzon, whose article "Golumbia Fails to Understand Chomsky, Computation, and Computational Linguistics", 7/16/2016, raises another set of issues that I plan to discuss in a later post.]

Update — As J.W. Brewer observes in the comments, there's a very helpful post by Ted Underwood, "Versions of disciplinary history", The Stone and the Shell 5/4/2016.

Update #2 — A key issue in the discussion below is whether "Digital Humanities" is just the use of computers in humanistic research, or rather a specific intellectual movement, originating at UVa, with controversial ideological claims. I participated in a couple of committees involved with setting up a "Digital Humanities Center" here at Penn, based on the first idea, and never encountered the second idea at all. So I continue to be curious: are Allington et al. wrong about what the term "Digital Humanities" is currently being used to mean, or was I just oblivious to the intellectual politics involved? Reading the blurbs for the first 30 books advertised on amazon.com in response to a search for "digital humanities" pretty much echoes the experience I've had, which is that DH is just a way of referring to the use of computational methods in humanistic research. I don't see any of the totalizing ideological rhetoric that is ubiquitous (for example) in the work of Stanley Fish.

And there's an interesting series of four articles by Patrik Svensson from Umeå University in Sweden: "Humanities Computing as Digital Humanities", Digital Humanities Quarterly 2009; "The Landscape of Digital Humanities", DHQ 2010; "From Optical Fiber to Conceptual Cyberinfrastructure", DHQ 2011; "Envisioning the Digital Humanities", DHQ 2012. Svensson seems to be describing the world that I see (though I remain very much an outsider), and not the parallel (and rather different) universe that Allington et al. confront.



33 Comments

  1. leoboiko said,

    July 16, 2016 @ 9:18 am

    So what's all the fuss about?

    I think it's just a confusion about the direction of causality. Traditional humanities are losing prestige and social standing, not to mention funding; therefore students and institutions flock to technological methods and its supporting discourses. It's not the other way around. It's easy to be bitter about this, but that's just blaming the survivors.

    [(myl) There's some truth in this — but "students and institutions flock to technological methods" in biology and geology and astronomy and psychology and etc., where the fields in question are not "losing prestige and social standing".]

    My university is the largest in the country: the Humanities building is a decaying fire hazard, pieces of the ceiling literally falling, while the Economics dept across the street has air-conditioning and automatic glass doors (and let's not get started on the kind of funding Engineering gets). The nascent PhD program of my field (the first in the country) was axed. Certain lit/lang departments were reduced to a single professor; others are expected to be within a couple of years; the hiring of new teachers is halted, possibly forever. (The last remaining Classical Japanese teacher has retired; I asked my adviser whether it would be possible for me to get a job in her place; "I find it much more likely that our entire course will be closed.") Deans talk ominously about getting rid of degrees with "little market interest" (echoing American notions of college courses as career investments, even though ours are tuition-free). Out of 400+ research candidates for a certain scholarship last year, a single one was chosen. I took a one-week palaeography crash-course in the History dept; not only was there no institutional support for a full course of study, but the professor kept lamenting that palaeography departments were being closed all over, even renowned European ones. Enthusiastic science professors managed to erect an interdisciplinary degree, encompassing comp-sci, math, chemistry, physics, and biology; a similar proposal for a pan-humanities course was refused before it even had a chance; even a timid proposal to get, say, students of Spanish literature into Iberian history and geography and philosophy was nuked. Et cetera.

    [(myl) This picture is unfortunate and in the end probably counter-productive. But I don't think that the overall state of the humanities is nearly this dire everywhere. And taking the long view, there was a time when philosophy and classics had the prestige, and engineering existed only in special second-class institutions, if anywhere. Then as now, the key determining factor in the end was the perceived role of various disciplines in forming national leadership and institutional prestige.]

    I came from a computer science background, and the difference in prestige is enormous and sad ("I'm a computer scientist" / "I study classical literature"). Whenever you talk of computer analysis and statistics, you get starry-eyed admiration. Whenever you talk of methods of literary analysis, you get the same dumb jokes about "postmodernism", Sokal, and flipping burgers. Geisteswissenschaften are out, Naturwissenschaften are in; polemicists who haven't even bothered to read what they're criticizing wave a couple of fMRI scans and proclaim the entire field of philosophy dead; if you feel "the responsibility of intellectuals" and try to even mention the disenfranchised, you'll be metaphorically stoned as an "SJW" and a "regressive leftie" amidst denunciations of "campus outrage culture"; and if you say you aren't interested in objective facts but in what used to be known as "the Human Spirit", you risk being laughed straight out of a job.

    [(myl) Some (arguably rather old-fashioned) approaches to literary analysis retain considerable prestige and influence. The fMRI-wavers have been taking their lumps recently, and it's not the first time. And Sokal and other post-Modernism jokes are often inappropriate, but it has to be confessed that at the level of the field of literary "theory" as a whole, these are self-inflicted wounds.]

    I disagree that "restructuring", a euphemism if there ever was one, is to be blamed on "developments internal to non-digital literary studies". Rather, it's the outcome of trends external not only to literary studies, but to the University as such: changing notions of what a university even is, of whether pure scholars should be supported at all, and if so, of what kind. I agree there's no top-down pressure to embrace digital humanities; the whole thing is much more organic, systemic, and faceless. It's just that the prophecies about Castalia (The Glass Bead Game) are coming to fruition.

    [(myl) I agree with you that there are important external forces at work, to a large extent driven by demographics. The last half of the 20th century saw an enormous expansion of higher education, partly in response to the post-WWII baby boom, and partly due to a major increase in the proportion of the population going beyond high school. That expansion made it easy to add new fields while at the same time expanding traditional fields. The expansion is over, and some contraction appears to be setting in, which creates new stresses and strains. Meanwhile, there have been major changes in how various humanities disciplines see themselves and their relationship to society, and simultaneously in how society sees those disciplines and their role in preparing students for various social roles.

    It would be a big mistake to see the university of 150 years ago, or 100 years ago, or 50 years ago, as a sort of utopian spiritual paradise devoted to the pure pursuit of eternal intellectual values.]

  2. Bill Benzon said,

    July 16, 2016 @ 9:18 am

    Thanks for this, Mark. It is an important and useful contribution to the discussion. FWIW, as far as I can tell, the response to Allington et al. has been mostly skeptical to negative.

  3. peterv said,

    July 16, 2016 @ 9:18 am

    "(Though in general, even the most hostile critics don't generally insist on sticking with typewriters and snail mail…)"

    As recently as this century, I sent an email to philosopher Susan Haack, and received a typewritten letter in reply that began, "I don't do email."

    Yet, manifestly she did do typewriting, and manufactured envelopes, and postage stamps, and airmail stickers, and moisture-based adhesives, and postal messaging systems, and international agreements on postal deliveries between nations, and aircraft carrying international post, and even written language. I wondered at what age technologies were old enough to become acceptable.

  4. leoboiko said,

    July 16, 2016 @ 9:37 am

    @peterv: For certain people, refusing to use certain tools is just a way of focusing on what matters to them – of avoiding distractions. Free software pioneer Richard Stallman doesn't use a web browser; he has a daemon (a background service) which fetches webpages he's interested in, and sends them as text-based email messages to himself. Famous computer scientist Dijkstra preferred to write by hand (he left us 1318 such notes).

  5. David Golumbia said,

    July 16, 2016 @ 9:42 am

    I appreciate the civil engagement, Mark, unlike the snippets I've read of Benzon's piece and the hostile responses of many others, which I won't engage with.

    But I think you make the same mistake that the LARB responders made, one that is in part caused by editing the piece from 10,000 words to 6000 at the editors' request, but which I nevertheless take responsibility for: we should have been clearer, although I've published on this topic a lot and have made my position very clear.

    We absolutely do NOT oppose digital methods in scholarship. We endorse them. We like them. I have been in print many times praising them, particularly in linguistics. I have engaged in many projects and hope I will again (my current school has very few resources for computational projects, among other things). Daniel A currently engages in them. I am particularly a fan of WALS and of the BYU corpus material, but I have gained a huge amount from all kinds of corpus analyses and much else.

    Our view is that the transition from "Humanities Computing" to "Digital Humanities" was ideological in nature: it was about changing what a field *meant*, not what it *does.* That is why I have continually suggested a contrast between Computational Linguistics and DH: DH (and one has to read the texts to see this) is full of–arguably is characterized by–a thorough and studied rejection of everything literary scholars do, from the models and theories we use, to the object of study, to the way we are employed. I've spent enough time among linguists of various sorts to see that this divide simply doesn't exist there (if anything, it more closely parallels the divide between Chomskyans and non-Chomskyans, but I'll try to avoid that hornet's nest). Computational linguists don't–at least in the main–go around saying that linguists shouldn't closely study languages, that they shouldn't be professors, that knowing languages is "outmoded and traditional," and so on. But this is exactly what DH routinely says about English scholars and our objects of study.

    Again, to reiterate something I've said many times in print: I *support* digital methods in the humanities. In fact I like them in linguistics more than in literature, because their utility there is far clearer, but that's beside the point. My view–and the view of Daniel A and Sarah B–is that despite surface appearances, DH is far less "about" digital methods than it is about a political rejection of much of what English scholars do.

    I'll add one more note of personal experience: I'm not only making this diagnosis based on reading the texts and public discourse associated with DH: I'm also basing it on many conversations I had when I was employed at UVa, where the DH movement (as opposed to humanities computing) was born, and on many reports from non-DH English scholars, administrators, and even some dissident DH scholars, among whom the impression of DH entailing severe hostility to the "rest of English" is widespread. It is this hostility, and its uptake by administrators, that I oppose, not digital methods.

  6. David Golumbia said,

    July 16, 2016 @ 10:16 am

    I'll add one more comment: "the most important push towards 'digital humanities' is bottom up from students and younger scholars, not top down from administrators and funders." I think this statement requires exactly the kind of unavailable empirical evidence we will never get.

    [(myl) I meant this not as a claim about survey statistics, but rather as an inevitable historical projection. It didn't take a survey in 1500 to predict that the future of the humanities would involve printed books more than hand-written copies; and we don't need a survey today to predict that the future of the humanities will involve the computational methods that you yourself (in the earlier comment) support.]

    Further, it presumes that "students and younger scholars" understand what they are signing up for when they sign up for DH, and I've seen plenty of (anecdotal) evidence that they don't, and later on have a rude awakening when they get to the hostile part, which is pronounced and extensive. In fact, the DH scholar you cite there has herself engaged in remarkable public hostility toward other scholars and former friends who dare to note that DH plays a destructive institutional role.

    For reference, read the LARB interview (https://lareviewofbooks.org/article/digital-humanities-interview-laura-mandell/) with long-time DHer Laura Mandell where she says, without even appearing to realize what she's saying, and quoting another long-time DHer, Julia Flanders: "We don't want to save the traditional humanities," where the only way to parse "traditional humanities" is as "anything that's not DH." I don't think she–or many other people in DH–understand how that sounds outside the DH club, including how we can have gotten to the point where humanities scholars can openly call in public for the elimination of entire disciplines.

    [(myl) This kind of sectarianism is more common in academia than you suggest. In psychology, calls for the end of Freudian analysis as unscientific nonsense have been around for decades. In the mid-1960s, "generative" linguists argued that their structuralist predecessors should shut up shop and leave the field (see an anecdote here); various sects of economists variously argue that their enemies should similarly pass from the intellectual scene; the fights in physics for and against "string theory" have been going on for a long time, with very similar calls for subfields to be terminated; and so on.

    I'm not convinced that the situation in the humanities (digital or otherwise) is as bad, though I admit that my experience is limited. I've played a minor role on several committees here at Penn associated with setting up the new Price DH Lab, and I've (spottily) attended a series of meetings on digital text analysis organized by some people at the university library — and in these dozens of meetings of these various groups, I've never encountered even a hint of the hostility to traditional humanistic scholarship that you describe.]

  7. elessorn said,

    July 16, 2016 @ 10:53 am

    Computational linguists don't–at least in the main–go around saying that linguists shouldn't closely study languages, that they shouldn't be professors, that knowing languages is "outmoded and traditional," and so on. But this is exactly what DH routinely says about English scholars and our objects of study.

    I like this example. You can digitize spectrography, you can digitize texts and even automate the generation of, say, concordances. You can harness digital technology to achieve significant efficiency gains in tasks that once involved a lot of rote, repetitive labor (searching through reference works, vocabulary look-up in second-language education, etc.). But there are some forms of research that can't be automated, scaled, or easily quantified. Mastering a foreign language is one, close reading of literary texts is another. I don't want to put words into David Golumbia's mouth, but I sense that the proper analogue in Computational Linguistics of DH research methods as currently promoted might not resemble the spectrography example, or the archaeology example at all.

    I imagine a comparative computational study of, say, pronoun-dropping in modern Romance languages by researchers who hadn't mastered even one of them, but had produced a massive database of words tagged for parts of speech, etc., allowing some kind of result to be produced. Or an archaeological survey run by people with a very cursory knowledge of the history of the area under study, but which does return reams of data. Absurd as these examples may sound, they seem comparable to text-processing-based studies of, say, Thomas Hardy that make claims about his oeuvre without reading any or many of his works. Such studies might not produce much worthwhile insight, but they do necessarily produce results. And who will review such results, but someone who has read the collected works of Thomas Hardy, a very analog task that might produce very few results over long years? And if DH projects are the way to get hired, how long will the field have people like that left?

    If the question is "what's all the fuss about?", does this make sense as an answer? The privileging of result-production in literary studies, at a pace that only computational tools can provide, has the potential not only to disincentivize the core competencies of literary study, but to disincentivize, more generally, research questions that cannot be digitized. When you have a hammer…

    [(myl) My impression is that specialists in close reading of Thomas Hardy are just about as endangered as Leo Boiko's palaeographers are — and not because of the depredations of Digital Humanists, but rather because the attentions of English-department members have turned elsewhere. In general, the practice of close reading seems more threatened by the residue of literary "theory" than by the possibility of statistical text analysis.

    Compare biological fieldwork, and animal research, and test-tube research, and research in genomic and proteomic databases or with population-genetic models — these are culturally distinct, sometimes in competition for resources, and often just a bit hostile to one another. And things have changed a lot since Darwin's day, when "botanizing" was a fashionable occupation for curious gentlemen — but the introduction of new methods hasn't obliterated the old ones, despite periodic complaints and predictions to the contrary.

    In most fields and subfields — "scientific" as well as "humanistic" — there's a tension between insight and persuasion. "Big Data" approaches lend themselves well to evaluating hypotheses. And it's also possible to use large datasets in an exploratory mode to generate hypotheses to test, but there's a crucial step of informed judgment about which hypotheses are interesting, and in what way.

    And there are certainly domains where there are no facts, or where facts are not relevant. But pace Stanley Fish, the humanities were never traditionally such an area. ]

  8. leoboiko said,

    July 16, 2016 @ 11:07 am

    @elessorn I've argued before that formalist, objective analysis of what's demonstrably in the text (which includes the quantitative/statistical methods of computer analysis) can and should be a useful basis on which to build subjective, humanities-style criticism. It's just that it can't replace it. See comments here, apropos of another Liberman post about Hemingway's sentences not actually being as short as they're claimed to be.

  9. J.W. Brewer said,

    July 16, 2016 @ 2:21 pm

    Two and a half points:

    1. I agree that early forms of computer-aided "Big Data" were embraced from very early on by traditional humanistic scholars. The tweedy old fellow (long since deceased) who taught me Homeric Greek 30 years ago was very excited (either that year or the following one) when the university acquired a searchable CD-ROM of pretty much the entire surviving corpus of classical Greek literature (I think maybe coverage was more selective once you got into the "Hellenistic" era) because it enabled him to look for meaningful patterns and similarities in the usage of particular turns of phrase which he might never have otherwise come across. And of course the production of concordances and similar old-economy analogues to searchable text databases is an old standby of humanities scholarship. Indeed an assignment everyone in my freshman English section had (this would have been spring 84) was to consult a concordance of every-single-word in Joyce's Ulysses, pick a word that appeared at least X but no more than Y times, and write about what Joyce did with it in the various contexts in which it appeared. (And that was the grad student teacher who is now a tenured English professor; not the one from the other semester who tired of literature and then got a second Ph.D. in computer science!) The concordance didn't help us with bigrams or other collocations, which could have made for a more interesting project.

    2. One parallel the rise of Big Data has had within the linguistics profession is some increasing hostility to the old style of basing claims about syntax etc. (whether in a Chomskyan framework or otherwise) on contrasts between well-formed and ill-formed sentences as determined solely by the academic writer's subjective introspection and personal grammaticality judgments. This is because no one has to take the tenured author's own subjective claims at face value any more – it is much easier to verify them than it used to be.

    2b. To the extent (pace myl's pace-ing of Fish) literary scholarship has over the last several generations had more than its fair share of non-falsifiable BS, Big-Data driven methods actually are a threat to that, because some such BS becomes falsifiable in a way that was simply not possible in earlier years. Clearing out the underbrush of obviously false claims (and/or undermining the reputation and authority of frauds and bullshitters) will not, by itself, cause new scholarly work making claims that are both true and interesting to spring up, but it may be a useful first step.

    More positively, it can help serious scholars with traditional interests who don't WANT to be bullshitters but have hitherto lacked good means to fact-check their own unreliable subjective impressions. Semi-random example — let's say you're re-reading a particular poem by Keats or Dickinson or whoever and you are struck by a particular image or turn of phrase that, after you meditate upon it a bit more, seems totally revolutionary — a wonderful synecdoche for how a whole revolution in perception and sensibility was underway in the 19th century that prefigured something that became a commonplace in modernity. You could get a really good article out of this! Maybe you can even get published in NYRB or someplace like that that doesn't have only 400 subscribers, all of which are research libraries. Well, how novel really was it? One of the things corpus linguistics has established is that even people who are unusually interested in language can fall prey to recency illusions and the like, and maybe it turns out that the particular turn of phrase was not innovated by Keats, even if he used it well, but had been floating around for a century previous in not-very-high-quality texts (hymns not good enough to be anthologized in later generations, chapbooks agitating about tax reform, broadside ballads about highwaymen being hanged at Tyburn) that the particular scholar had never had occasion to be immersed in. Having the technical ability to check that possibility before you write an entire paper based on an empirical mistake about how innovative the phrase actually was seems like it could be very valuable.

  10. Bill Benzon said,

    July 16, 2016 @ 2:27 pm

    The Los Angeles Review of Books has been giving the digital humanities quite a lot of attention recently. In addition to the piece by Allington, Brouillette and Golumbia, which Mark is responding to, there is a response by Juliana Spahr, Richard So, and Andrew Piper, Beyond Resistance: Towards a Future History of Digital Humanities, and an excellent series of interviews conducted by Melissa Dinsman, The Digital in the Humanities. At the moment there are nine interviews in the series, and I recommend them all. One of them is with David Golumbia. That interview contains a link to an online conversation that took place three years ago, Open Thread: The Digital Humanities as a Historical “Refuge” From Race/Class/Gender/Sexuality/Disability?

    As the title suggests, it is "pitched" at the issue that concerns Golumbia. The conversation runs for 166 comments over three days and is excellent. A lot of voices are heard from, including Golumbia, many of them young, but not me, as I wasn't aware of the discussion at the time. It is worth reading in full if you want to get a feel for this particular discussion.

    The passage that most resonated with me was by a poet, Chuck Rybak, who quoted Marjorie Perloff (a senior critic of modernism) and then observed:

    Since I’m a creative writer and lit prof who teaches a lot of poetry, Perloff means a lot to me as a critic. My sense is that Perloff would reject the word “refuge” and replace it with something like “return.” But a return to what? Simply, a focus on poetics/form/rhetoric. When I first started dabbling in DH work, I was immediately struck by how text-centered the enterprise is, and that has proven very useful pedagogically, especially when working with an undergraduate population who often prefer to flee the text as quickly as possible and get right to ideas in the abstract. In short, I’m sympathetic to Perloff here because I think it approaches this question in terms of embracing an interest rather than primarily rejecting something else. Perloff, in that essay, gives respect to cultural readings of works like Ulysses and Heart of Darkness, especially as they relate to empire, etc. Still, what Martha Smith might describe as a refuge (or seemingly so), I hear someone like Perloff saying what’s needed is a return to poetics.

    What I find particularly interesting is that this comment got no response in that discussion.

    I posted an oblique response to Allington, Brouillette and Golumbia at New Savanna: What’s in a Name? – “Digital Humanities” [#DH] and “Computational Linguistics”. First I quote from a series of essays by Matthew Kirschenbaum in which he talks about the origins of the term, "Digital Humanities," and the uses to which it has been put. Then, following an account by Martin Kay (a first generation computational linguist), I tell the story of how "computational linguistics" was coined. Briefly, it was coined as a pre-emptive measure to separate the computational investigation of language from the term "mechanical translation," as it was clear to leaders in the field that machine translation was about to lose its government funding. It was the practical objective of machine translation that gave birth to the computational study of language in the early 1950s. The enterprise was defunded in the late 1960s because practical results were not forthcoming.

  11. Daniel Allington said,

    July 16, 2016 @ 4:02 pm

    You write:

    [Allington et al] write:

    What Digital Humanities is not about, despite its explicit claims, is the use of digital or quantitative methodologies to answer research questions in the humanities. It is, instead, about the promotion of project-based learning and lab-based research over reading and writing, the rebranding of insecure campus employment as an empowering “alt-ac” career choice, and the redefinition of technical expertise as a form (indeed, the superior form) of humanist knowledge.

    From a historical point of view, at least, this is simply false. People began using computers in humanities research pretty much as soon as computers existed, and they did this because they wanted to get their work done more easily. See Michael Preston's "A Brief History of Computer Concordances" for one set of traces:

    Computer-assisted study of folklore and literature was initiated shortly after World War II by Roberto Busa, S.J., who began preparing a concordance to the works of Thomas Aquinas in 1948…

    However, the claim that you identify as 'simply false' arises from a demonstrable misunderstanding of David's, Sarah's, and my article, which makes a distinction between computer use in the humanities (which has been going on for a long time) and Digital Humanities (which we identify as a movement that took shape throughout the 1990s and that adopted the 'Digital Humanities' name at some point between 1999 and 2001). Had you read more of our article than just the introduction, you would have come across the paragraph that begins as follows:

    Computer use in the humanities of course predates the formal movement that calls itself Digital Humanities. The trailblazer is usually identified as a Jesuit priest, Roberto Busa, whose 56-volume concordance to the works of St. Thomas Aquinas was produced over a period of three decades from 1949…

    Sounds familiar? I suppose your mistake is understandable, given that you are (by your own admission) an outsider to the field and (evidently) could only be bothered to read the first few paragraphs of the article to which you respond.

    [(myl) In fact I read the whole article, including the reference to Fr. Busa. But your article never states what you apparently believe, which is that when taking up the new name "Digital Humanities", an innocent and even laudable use of computers in the humanities became a tool of neoliberal corporatists. My own contacts with such researchers started in the mid-1970s with Roberto Busa's student Antonio Zampolli, and continued in the 1980s with corpus-based lexicographers at OUP and Harper-Collins, and as far as I can see, that culture has continued seamlessly through to the work of the Digital Humanists that I encounter today. So it frankly never occurred to me that you meant to praise, or at least to tolerate, the computer-using humanists of 20 to 50 years ago, while strongly condemning their more recent counterparts.]

  12. AntC said,

    July 16, 2016 @ 10:37 pm

    From a historical point of view, … People began using computers in humanities research pretty much as soon as computers existed,

    Quite. Actually from just before computers existed ;-) Bletchley Park's Colossus — arguably the first programmable electronic computer — was used specifically for natural language processing (cracking German signals).

  13. Gwen Katz said,

    July 17, 2016 @ 2:42 am

    The Small World example is so funny to me as an author because Dempsey is giving Frobisher a valid and useful critique of his work. Frobisher overuses descriptors until they become cliché; he genders his language in sleazy, sexist ways. Yet simply pointing out to him what he wrote is enough to make him fall onto his fainting couch and refuse to write again.

    One wonders if he would have overreacted the same way if someone without a computer had said "You sure use the word 'grease' a lot." The only difference I see is that the latter claim has deniability, which suggests that the real (perceived) problem here is that hegemonic voices like the successful mid-20th century white male author can actually be evaluated instead of simply accepted as geniuses.

    Nowadays it's common practice for an author to use computational tools to discover that, say, her protagonist smirks at least once per page, or that multiple things get bisected when that word should probably be limited to one use per book. Then, of course, she revises, rather than going into a funk over the fact that her writing has room for improvement.

  14. Daniel Allington said,

    July 17, 2016 @ 4:27 am

    My own contacts with such researchers… continued in the 1980s with corpus-based lexicographers at OUP and Harper-Collins, and as far as I can see, that culture has continued seamlessly through to the work of the Digital Humanists that I encounter today.

    I presume you mean the same corpus-based lexicographers whom Sarah, David, and I refer to here:

    In the early 1980s, computing became centrally important to the discipline of lexicography, thanks to work done by academic linguists at the University of Birmingham and professional lexicographers at Oxford University Press. The former employed statistical analysis of large volumes of text to study contemporary word usage, contributing to dictionaries and English language teaching materials published by Collins, the financial backer for much of their work.

    (Note that we say simply 'Collins' because the merger post-dated the beginning of the project by a few years.) As to whether 'that culture has continued seamlessly through to the work of the Digital Humanists', it would seem more intuitive to say that the work of corpus linguists in the 1980s has continued seamlessly through to the work of corpus linguists in the 21st century. Corpus linguists do not typically call themselves 'Digital Humanists' but 'linguists'. As our article makes clear, the origins of the Digital Humanities movement are not in corpus linguistics. Many Digital Humanists teach students to use tools originally developed for corpus linguistics (especially AntConc, as it's the most user-friendly), but typically without teaching anything that would resemble corpus linguistics as a corpus linguist would recognise it. (Note that I used to teach corpus linguistics in a linguistics department.) There are exceptions.

    But your article never states what you apparently believe, which is that when taking up the new name "Digital Humanities", an innocent and even laudable use of computers in the humanities became a tool of neoliberal corporatists.

    That's a simplification of what we argue, because computer use in the humanities and the Digital Humanities movement are two different things. In the humanities today, computer use is virtually universal – at least in the developed world. Some humanities scholars use computers more intensively than others. Some of the more intensively computer-using humanities scholars call themselves Digital Humanists. Some self-proclaimed Digital Humanists are centrally part of the movement. Our article is about the movement.

    If you read our article on the assumption that when we say 'Digital Humanities', we refer to 'all computer use in the humanities', then you won't be able to make much sense of it. However, we repeatedly distinguish the two, and frequently use the phrase 'the Digital Humanities social movement' in order to emphasise that our target is not computer use but a social movement within the humanities.

    So it frankly never occurred to me that you meant to praise, or at least to tolerate, the computer-using humanists of 20 to 50 years ago, while strongly condemning their more recent counterparts.

    The issue is not the computer-using humanists of 20 to 50 years ago versus the computer-using humanists of today. It's the characteristics of the Digital Humanities movement (which, as noted above, only encompasses a small minority of computer-using humanists). We argue that the movement is set up in such a way as to forgive or even to encourage poorly theorised tool use. In our article, we make that argument several times, for example in the following:

    While some scholars affiliated with the Digital Humanities movement became conversant with the intellectual background of the procedures they employed, producing valuable work that could have seen publication in the venues within which the procedures themselves had been developed, the fetishizing of code and data and the relative neglect of critical discourse within Digital Humanities have led to the emergence of an environment in which one is more likely to encounter oddities such as Michael Dalvean’s recent claim that the probability scores yielded by a machine learning algorithm are an “objective” measure of literary value.

    (I feel a bit sorry for Dalvean, actually. A decent editor would have said to him: 'The procedure you have developed is interesting, but you can't claim that it provides an objective measure of literary value; that makes no sense. Cut that out and I'll publish the rest.' The fact that this didn't happen reflects much more negatively on the journal than on the author. But it's the oldest and most respected Digital Humanities journal.)

    The political part of our argument relates to the consequences of promoting uncritical tool use in this way. We don't say that all computer use is bad. We don't even say that all Digital Humanists do bad work – as the above quote makes clear (and it's not the only thing I could quote from the article to make this point), we admit that there are exceptions.

  15. James Wimberley said,

    July 17, 2016 @ 5:40 am

    Mary Renault's historical novel "The Praise Singer", whose protagonist is the real Ancient Greek poet Simonides, makes a point of the effect of another disruptive innovation, writing, on the previously entirely oral bardic trade. IIRC Renault's Simonides adopts it, unlike his traditionalist rivals, but is still disturbed when a young acolyte uses writing to compose poems.

  16. Ted Underwood said,

    July 17, 2016 @ 8:30 am

    The defense that Allington and Golumbia offer above is that they're not talking about "computer use" or "computational methods" in the humanities — but about a "Digital Humanities movement" that post-dates those things, begins in the 1990s, and has a very distinct social logic.

    Is it true that "DH" is different from computers-in-the-humanities? I think it's a hard question; there may not be a single true answer out there. It's a question about how people understand their own practice, and you might get different answers from different people. I don't use "digital humanities" to describe my own practice — precisely because people use the phrase in so many broad, slippery, and incompatible ways.

    But the difficulty of answering the question is itself telling. If people don't know whether "digital humanities" means computers in the humanities, or a distinct social movement — then it can't be a *very* distinct social movement. Historians who look at the last twenty years will want to isolate the phrase, and ask what it meant, but a candid historian will have to admit that it meant something with very blurry edges.

    I also think it's telling, for instance, that Mark Liberman is involved in conversations around the Price Lab for Digital Humanities. That suggests the people involved in building a DH lab at Penn understand corpus linguistics as an important part of the spectrum of approaches covered by "digital humanities." They don't seem to be using the phrase in order to affiliate with a narrowly-defined intellectual tradition. In other words, whatever "digital humanities" may have meant in 2004 or 2009, it might be a mistake to assume that it must always and forever mean the same thing.

  17. JS said,

    July 17, 2016 @ 10:47 am

    Um… so before came the cultural conservatives, clinging zealously to "favored texts and artifacts of privileged groups"; after came the neoliberal corporatists, cheering lustily for "findings immediately useable by industry." In between, the postmodern literary theorists—political progressives, champions of the downtrodden, of diversity, of inclusivity. Srsly?

    And the thing is… I always despised "Digital Humanities." Meaningless adminese. Later I realized I did it. Not a flag I'd ever fly, "DH". But come on guys! You were Hegemons in your day. I know, you don't notice when you're IN the castle. But remember how you'd say Those Really Aren't Current Methodologies, and No I Can't Write That Letter, and Maybe You Should Talk To The Linguists?

    And really, looking at the latest "DH" literature, this is not over by a long shot—could well be like China, where they storm your castle but marry your women and wind up speaking your language. Do not despair! Trust your Methodologies! Problematize! Every article like this one is a step in the right direction!

    Jonathan Smith

  18. Bill Benzon said,

    July 17, 2016 @ 11:03 am

    1) Picking up from Ted Underwood, though I'd known about the importance of UVa for some time, I wasn't aware of the specific association of "digital humanities" and UVa until the LARB article. I'd known about humanities computing back in the mid-1970s when David Hays and I reviewed the computational linguistics literature for Computers and the Humanities. But I didn't pay much attention to that literature because it wasn't germane to my own work, which was centered on natural language semantics of the kind developed in the cognitive sciences in the 1970s and 80s.

    It wasn't until 2006 that I reconnected. At that time I was posting at a group blog, The Valve, and we held a symposium on Moretti's Graphs, Maps, Trees. I don't recall the term being used in those discussions, though, for example, Matt Kirschenbaum was one of the contributors: Poetry, Patterns, and Provocation. Looking back through my blog I see that the oldest post tagged "digital humanities" dates to July 7, 2011: Digital Humanities Sandbox Goes to the Congo. But it wasn't until the middle of 2013 that I began giving sustained attention to DH, and it was topic modeling that drew me in. As far as I know topic modeling has no specific association with UVa. In March and April of 2014 I did a series of posts on Alan Liu (looking at some interviews and his PMLA piece on meaning in DH). At that time, I believe, I had some private correspondence with him in which I suggested that the term "digital humanities" was so diffuse that it had little value. & it turns out that many in the field are of that opinion.

    2) As for Laura Mandell recalling Julia Flanders as saying "We don't want to save the traditional humanities" at a conference, I have no way of knowing the intended scope of "traditional humanities", but comments in response to conference presentations can be pretty informal and, as Mark has noted, sectarian strife is common in academia. Post structuralist ideas, after all, were not welcomed with open arms. I'd be very surprised if there weren't some digital humanists, neoliberal or not, who want to reject some (significant) portion of existing practice, whether in literary criticism, history, art history, musicology, archaeology, or whatever else. That's just how the academy is.

    In any event I'd be surprised if Mandell/Flanders are as thoroughgoing in their rejection of "traditional humanities" as Joseph Carroll, the literary Darwinist, has been in various interviews and formal publications. Back in 2010 in New Literary History, for example, he sketched out three possible futures for literary Darwinism. In one scenario it continues in its current status, as a relatively isolated enclave. In the second it joins the party alongside other critical approaches and has its work included in the standard casebooks, anthologies, reference works, and its articles accepted in the standard journals. The third scenario is one of thoroughgoing revolution:

    The Darwinian literary study that, in this scenario, will ultimately absorb and supplant every other form of literary study will assimilate all the existing concepts in literary study—traditional concepts of style, genre, tone, point of view, and formal organization, substantive concepts of depth psychology, social conflict, gender roles, family organization, and interaction with the natural world.

    Whatever it is that's going on in the humanities these days, it's not just (neoliberal) DH or the rest. There's a whole lotta' shaking going on across the board and among a wide range of disciplines in the human sciences.

  19. J.W. Brewer said,

    July 17, 2016 @ 2:59 pm

    Professor Underwood modestly failed to link to his own lengthier prior response to the Allington et al. piece, which may be worth reading for those interested in this thread: https://tedunderwood.com/2016/05/04/versions-of-disciplinary-history/

  20. Daniel Allington said,

    July 17, 2016 @ 5:48 pm

    I only spoke up here in order to point out the original posting's misreading of the article that it responded to. But Ted Underwood makes a point that is worth a reply:

    I don't use "digital humanities" to describe my own practice — precisely because people use the phrase in so many broad, slippery, and incompatible ways.

    If people don't know whether "digital humanities" means computers in the humanities, or a distinct social movement — then it can't be a *very* distinct social movement. Historians who look at the last twenty years will want to isolate the phrase, and ask what it meant, but a candid historian will have to admit that it meant something with very blurry edges.

    Actually, it's because the term 'Digital Humanities' has no useful definition that I insist on calling Digital Humanities a social movement. All you can point to are social phenomena: a group of people who refer to themselves as 'Digital Humanists', plus the various institutions with which they are associated (the NEH office of DH, the annual DH conference, the various DH journals and DH centres, etc). This makes Digital Humanities very different from, say, literary Darwinism, which has been mentioned above: yes, one can point to a group of literary Darwinists, but they are defined by being the people who do literary studies in a way that itself has a fairly clear definition (sure, there are some important differences between them, but you know what I mean). With Digital Humanities, it's the other way around: there's a group of Digital Humanists, and Digital Humanities is whatever those people happen to be doing.

    So yes, of course it has blurry edges. But the institutions are real.

    About the Underwood piece that JW Brewer links to: I hadn't read that. There are points at which I disagree with it, and points at which it taught me something. At the end of the day: if the institutional rise of the Digital Humanities meant lots of people doing what he does, and doing it as well as he does, then I wouldn't have had any interest in writing against it. Unfortunately, a large part of what it means is people teaching undergrads AntConc without teaching them any linguistics and teaching them Gephi without teaching them any sociology and teaching them JavaScript without teaching them any computer science.

  21. Computer concordancing history | Corpus Linguistics 4 EFL said,

    July 17, 2016 @ 8:52 pm

    […] was reading a very interesting Language Log post on Digital Humanities, and came across Michael Preston's "A brief history of computer concordances". […]

  22. Mark said,

    July 17, 2016 @ 10:46 pm

    Really excellent, informative post. I had never thought of the history of computer technology in that decade-wise way, but it seems pretty accurate to me. I had read and enjoyed "Small World" when it came out, but hadn't retained the "Centre for Computational Stylistics" section. Didn't have a schema to assimilate it into! Thanks for an enlightening post.

  23. Benjamin Massot said,

    July 18, 2016 @ 4:27 am

    I appreciate this discussion very much (and I tend to adopt Allington, Brouillette and Golumbia's views). Thanks to all participants!

    Quoting Myl: "research they wanted to do anyhow" […] "do old things more efficiently, … do new things that were previously not feasible"

    This is another key point for making up my mind about the question: how much did computers "rewrite" the research agenda? I have nothing more than impressionistic stuff, so I'll let others give concrete examples to begin exploring the question. Have some linguistic problematics (linguistics is my own field, but I'd be interested in other fields too) been abandoned over the course of history because there is no digital way to explore them? And if so, why? Is it because it's hard to raise funding without promising computer use? Is it because fewer and fewer scholars want to "work by hand" anymore? Is it because we lose the capacity to come up with problematics that can't be treated digitally?

    [I say "work by hand" even though it's clear to me that most of that work is done on a computer; I mean work done "on" a computer, not "by" a computer. I hope I'm clear…]

  24. Ted Underwood said,

    July 18, 2016 @ 6:45 am

    Just a quick note to agree with much of what Daniel Allington says in the most recent comment above. "DH" is definitely more like a social movement or a set of institutions than it is like a well-defined idea or research project. He's right that those institutions are real things. (I might add that they seem to be very divided, internally — which could be a good thing or a bad thing, I don't know.)

    I also, sadly, recognize the pedagogical problem he describes: "people teaching undergrads AntConc without teaching them any linguistics and teaching them Gephi without teaching them any sociology and teaching them JavaScript without teaching them any computer science." Yep.

    Good work in this domain has to be interdisciplinary. It's hard to fit that breadth into an undergraduate major, and absolutely impossible to fit it into a single "Intro to DH" course — which, of course, is what literature departments are currently dreaming they can do. The result is often an unsatisfactory pedagogy centered on "tools" (even or especially when it teaches skepticism about tools) rather than centered on the social phenomena we're all trying to understand. I understand why that has happened — as a stopgap response to a rapidly-changing scene — but I don't think it can continue. A viable curriculum will have to be explicitly interdisciplinary instead of trying to squeeze other disciplines into an English dept in the form of tools and "DH." About that, I really agree.

  25. Alan Farahani said,

    July 18, 2016 @ 9:47 am

    I’d like to add a bit of a follow-up on one particular thread that emerged in the course of this excellent discussion, and which I hope isn’t too tangential. That thread concerns the “role of computers in arch[a]eology”, which has been invoked at various points in this discussion. As a linguist turned archaeologist, I find the differences between archaeology and literary studies, as discussed here, pertinent.

    My own reading of our disciplinary history, and I invite my colleagues to disagree with me here, is that there has never been a concern about a “DH”-like infiltration or agenda realignment working against what are seen as the core interests of the discipline. Computers and computational methods have been whole-heartedly embraced. This is all the more surprising given that archaeology has experienced several periods of pronounced theoretical upheaval in the last 40 years, upheavals that yielded little consensus on major philosophical issues (and have since simmered down).

    Indeed, even those with poststructuralist or postmodern theoretical commitments often see something DH-like as empowering or otherwise phenomenologically illuminating (see inter alia Ruth Tringham’s work on the archaeological site of Catalhoyuk being rendered in Second Life). As Mark predicted, there isn’t really a separate “Computers and Archaeology”, because the assumption is that everyone is using these methods in the course of their work, whether for visualization, database management, object rendering, quantitative analysis, or any of myriad other possible applications.

    Therefore one doesn't see as much of the anxiety noted above that practitioners aren't being grounded in the essentials before being introduced to the "flashy tools" that may be attractive to the market. There is certainly occasional pushback in that direction, but most see the technology as a positive development intellectually: adding metadata and other interactive, queryable data to maps, for example, to move beyond flat representations while enabling users to focus on elements of their choosing rather than those of the archaeological cartographer. So archaeology publications routinely make use of many different computational technologies, regardless of theoretical orientation. The differences, again, appear rather marked on the surface when compared with other disciplines such as literary studies, although I won't speculate as to the reasons why in this space.

  26. Lane said,

    July 18, 2016 @ 10:02 am

    Am I the only one confused by what is meant in this discussion by "corporatists"? I came up in political science, and we use the word in a very different way, basically this one:

    https://en.wikipedia.org/wiki/Corporatism

    Corporatism is a kind of social organisation in which the state sponsors and keeps close all interest groups, such as businesses and trade unions. It is not only not synonymous with "(neo)liberalism", in which interest groups form, join, divide and compete freely, but actually contrasted with it.

    In "neo-coporatism", the state and thee groups are cosy: labour, business and the state sit down together to hammer out deals. This is something we were taught to associate with highly regulated southern European economies like Italy.

    So it seems like — as with liberalism itself — this is a word that has come to have multiple meanings that are almost opposites.

  27. J.W. Brewer said,

    July 18, 2016 @ 11:53 am

    I think "corporatist" here is primarily in-group jargon. It serves the valuable sociolinguistic function of signalling membership in a certain sort of subculture. Members of the subculture use it to signal to each other, and the rest of us can treat it as a convenient heuristic for understanding their basic presumed social/political/etc point of view and crediting or discounting what they have to say accordingly.

  28. a George said,

    July 18, 2016 @ 6:50 pm

    I think that "analog" vs "digital" as an opposition mostly makes very little sense, almost irrespective of the academic field. "Digital" refers to numbers, and today mostly to binary numbers, whose digits are the true opposition: each digit can only assume one of two values when the condition is stable. There is a clean switch between one state and the other, at least in an ideal world. However, once you are in the field where switches control how you represent information, you can use this representation very easily for counting, and binary is the simplest representation for machines to use. This representation is also very resistant to noise: it takes an effort to switch one digit from one state to the other, and that rarely happens by chance. Furthermore, it is not difficult to reconstruct a given intended state, because the difference between being switched on and switched off is so clear.

    However, once we begin to use the switches very frequently — technically called "using a high clock frequency" — we rediscover that the states are actually analog in nature. The voltage that represents the state is just as analog as the voltages that represent the instantaneous value of something. And with the speed comes the risk that the voltages meant to represent our digital number are not quite reached, and so chance may influence the purported value after all.

    The idea of a continuum is necessary in a number of mathematical approaches. We reasonably think of an "analog" representation as a continuum; however, its opposite is not "digital" but "sampled". Instead of keeping track of the continuous values we only measure them at specific times — we take the liberty of throwing away any information that lies between our sampling instants. If done wisely, such as by obeying the rules of the sampling theorem, we shall have sufficient information and need not worry about anything missing.

    The mention in a neighbouring thread of the series of still images that we know as film is not really such a case, because physiological reasons, not mathematical ones, make our eyes believe that we see continuous movement. That is very obvious from the way that cartwheels have a tendency to rotate backwards despite the forward movement of the carriage. This is a demonstration of "aliasing", which occurs because the sampling frequency is too low.
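    (A minimal numerical sketch, purely for illustration and assuming nothing beyond Python with numpy: sample a 9 Hz sinusoid at only 10 Hz, well below the 18 Hz the sampling theorem demands, and the strongest frequency recoverable from the samples comes out at about 1 Hz rather than 9 Hz; the backwards-turning cartwheel, in numbers.)

        import numpy as np

        true_freq = 9.0      # Hz: the actual frequency of the signal
        sample_rate = 10.0   # Hz: far below the 18 Hz minimum required here
        n = 80               # number of samples taken

        t = np.arange(n) / sample_rate
        samples = np.sin(2 * np.pi * true_freq * t)

        # Find the apparent frequency of the sampled signal from its spectrum.
        spectrum = np.abs(np.fft.rfft(samples))
        freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
        apparent = freqs[np.argmax(spectrum)]

        print("true frequency:    ", true_freq, "Hz")
        print("apparent frequency:", apparent, "Hz")  # about 1 Hz: the alias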

    The eye is much easier to fool than the ear, and for this reason we had to wait the better part of a century for a sampled representation of sound that could meet the general criteria of audio fidelity.

    I am generally frustrated with the attempts to designate "digital" humanities as something distinct, with its own properties. In reality it is merely a democratization of access to a lot of data and the replacement of manual processing and sorting with automatic means. I was somewhat taken aback when the concept "digital philology" was introduced, because I could not see why normal philological rules should not still apply. However, in the audio world, "born digital" is a technical term for a sound recording that was first made directly to a digital medium (such as a memory card). In that group of recordings an increasing number are now made via an encoder directly to a data-reduced file (which we frequently know as mp3), and due to the artefacts that some original sounds, combined with some reproducing equipment, will present, there is indeed a need to know precisely what happens when you throw away data for physiological reasons (masking, etc.) by means of digital processing. Versions will have to be compared, and all the other tools of the trade still apply. Similarly with cheap digital cameras: you cannot get a raw format, only JPEGs.

    Every time a new technology has been developed, there has been a rush of dedicated scientists taking advantage of it. It happened when the linear time line of a recording instrument, the kymograph, was first exploited, which opened the way for physiological studies as well as studies in phonetics and musical performance practice [Brock-Nannestad, G. (2014), The mechanization of performance studies, Early Music, Vol. xlii, No. 4, pp. 623-630, doi:10.1093/em/cau124].

  29. Y said,

    July 19, 2016 @ 2:08 pm

    As a case study for this divide, I suggest the recent back-and-forth regarding computational phylogeny, specifically the dating of Proto-Indo-European. Gray and Atkinson's 2003 study, based on novel computational methods, gave a date for PIE that was at odds with the one obtained by more traditional methods, without attempting to reconcile the two. That paper and subsequent ones implicitly attempt to supplant traditional methods with computational ones.
    I would contrast, on the one hand, historical linguists who use traditional methods but are open to supplementing them with computational ones, and on the other hand, those who would treat computationally derived results as the primary ones. In that view, Gray and Atkinson's 2003 paper would be a "digital humanities" paper, whereas the recent paper of Chang et al. on the same subject would fall under "traditional humanities", assisted by computational methods.

  30. Daniel Allington said,

    July 20, 2016 @ 4:36 am

    Ted wrote:

    I also, sadly, recognize the pedagogical problem [Daniel] describes…

    A viable curriculum will have to be explicitly interdisciplinary instead of trying to squeeze other disciplines into an English dept in the form of tools and "DH." About that, I really agree.

    I should point out that this is something David brought up in his LA Review of Books interview (which took place before the article to which the above post supposedly responds was published, but which itself appeared afterwards):

    Programming is great. The university has great resources for people to learn programming. Why English should become a place where one learns how to do that is beyond me. You can’t learn coding in a couple of workshops or classes. If you want to learn how to code, the university has huge resources available to help you do this. I don’t think Computer Science departments should teach how to read novels, and I don’t think that English should teach how to program.

    Some institutions are better set up for interdisciplinary teaching than others (mine, for example, audits workload in such a way that by letting 'your' students take classes taught by people from another subject area, you're potentially putting yourself or a subject area colleague in danger of being laid off). But I think there's another reason for what Ted calls 'an unsatisfactory pedagogy centered on "tools"… rather than centered on the social phenomena we're all trying to understand', which is that the narrative justifying the introduction of the tools isn't focused on the phenomena that we might use them to study but on the need to get with it and start using or teaching new tools because DH is the thing to do and doing DH means using tools. I don't want to point fingers because the people in question are junior, but I've seen blog posts that essentially say, 'I'm getting into DH. Maybe I should learn to code.' It should be more like: 'In my research, there's this repetitive task that I'm doing one text at a time – I wonder whether I could automate it and batch process a thousand texts?'

    That's really how I got into programming. There was something that I was doing in order to answer a question that I'd set myself, and I knew that a computer could do that thing not only faster but more reliably than I could do it 'by hand'. I assumed that I'd have to find an existing application that did the job, but after looking around for one, I realised that it didn't exist. Then, after a conference, I had a conversation that made me see that I could create the application myself.
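    To make that concrete with a deliberately mundane sketch (Python; the folder name and the phrase searched for are invented purely for illustration), the kind of automation I mean is nothing grander than counting occurrences of a phrase across a whole directory of text files and writing the counts out, instead of checking one text at a time:

        from pathlib import Path
        import csv
        import re

        corpus_dir = Path("texts")  # a folder of .txt files (hypothetical)
        pattern = re.compile(r"\bdigital humanities\b", re.IGNORECASE)

        rows = []
        for path in sorted(corpus_dir.glob("*.txt")):
            text = path.read_text(encoding="utf-8", errors="ignore")
            rows.append({"file": path.name, "hits": len(pattern.findall(text))})

        # Write the counts to a spreadsheet-readable file for further analysis.
        with open("phrase_counts.csv", "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=["file", "hits"])
            writer.writeheader()
            writer.writerows(rows)

    Trivial, but it does in seconds, and in the same way every time, what a person working one text at a time could only do slowly and inconsistently.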

    As Ted points out in the blogpost mentioned above, my training is in social science – and really, that's still where I am, even though I've been employed in an arts faculty for the last year. So that's the lens through which I tend to see things, and, from my point of view, research is supposed to work like this: you start with a theoretical problem, formulate a research question that engages with it, then choose (or design) a methodology to answer that research question, then choose (or create) a tool to assist with that methodology. In reality, it doesn't happen like that a lot of the time – but it feels important as an ideal that people at least try to stick to. And I think this is why archaeology (mentioned above) and linguistics (mentioned throughout) have assimilated digital technology without any particular anxiety. It's just a means to an end. I did a talk at Edinburgh a couple of years ago where I had a slide called 'Why Digital Social Science isn't a Thing', and then went on to argue that Digital Humanities shouldn't be a thing either – that what matters is formulating research questions and then answering them using appropriate methodologies, whatever those happen to be.

  31. Bill Benzon said,

    July 20, 2016 @ 7:09 am

    Mark Liberman observes (OP): "One difference is that hostility to computational methods has always been more widespread among literary scholars than in other disciplines."

    Why? I don't know, but I have an observation or two. Certainly by the early 1970s literary criticism was attempting to come to terms with generative grammar. You see it, for example, in Stanley Fish's 1972 essay, "Literature in the Reader: Affective Stylistics", which established him as an important theorist, and you see it in Jonathan Culler's 1975 Structuralist Poetics: Structuralism, Linguistics, and the Study of Literature. Culler talks of deep structure and develops a concept of literary competence, which is clearly modeled on Chomsky's grammatical competence. He also makes a methodological distinction between poetics, which is the study of how texts are made, and interpretation, which seeks the meaning of texts. Poetics is NOT about the meaning of texts.

    Culler did not, of course, invent the distinction between poetics and interpretation. Both had been around for some time, though the study of poetics is older than the activity of interpreting texts. Poetics, after all, goes back to Aristotle. Interpretation, well, the interpretation of sacred texts (the Bible) is old. But the academic practice of interpreting the canonical secular texts, that's relatively new. Though there are earlier roots of course, it didn't become routine in the academy until after WWII. Thus in an interview in the minnesota review (nos. 71-72, 2009), J. Hillis Miller (a first generation deconstructive critic) could observe: "The courses in literature at Harvard when I was there [graduate school, late 40s-early 50s], I would have to say, were very thin. None of these people, including Douglas Bush, really had any idea about how to talk about a poem, in my opinion." That is, they couldn't do a decent interpretation. He had to learn and figure it out on the job, at Johns Hopkins, in the 1950s and 60s. I don't know what kind of disciplinary fuss this caused, just how much the philologists, literary historians, and editors resisted interpretation, but there must have been some.

    But interpretation was well-established practice by the 1960s. It was also problematic, however, in that critics were not agreeing in their interpretations. What's up with that? Is there a method we can follow that will bring us into agreement, or is that impossible because the meaning of literary texts is inherently manifold and indeterminate? That was the big question from the mid-1960s well into the 1970s and beyond.

    In 1975 Geoffrey Hartman published The Fate of Reading. In the title essay Hartman asks (p. 272): “To what can we turn now to restore reading, or that conscious and scrupulous form of it we call literary criticism?” And by "reading" he meant interpretation. He goes on to say: “modern ‘rithmatics’—semiotics, linguistics, and technical structuralism—are not the solution. They widen, if anything, the rift between reading and writing.” And that's pretty much the direction the profession took. Culler's structuralist poetics was stillborn, and linguistics played no role in the study of literature. Humanistic computing was safely shuffled off to its own disciplinary ghetto.

  32. Ted Underwood said,

    July 21, 2016 @ 3:20 am

    I largely agree with Daniel Allington here:

    I did a talk at Edinburgh a couple of years ago where I had a slide called 'Why Digital Social Science isn't a Thing', and then went on to argue that Digital Humanities shouldn't be a thing either – that what matters is formulating research questions and then answering them using appropriate methodologies, whatever those happen to be.

    That's exactly why I say I do "distant reading" rather than "DH" — I want to focus attention on research questions, not on the digitalness of the digital methods I might (or might not) use to address them. This is something I'll say at more length in a LARB interview probably to come out next month, where I'm quite stubborn about refusing to debate "DH."

    But from my perspective, writing a LARB essay about how "digital humanities" makes you a "neoliberal tool" is not an effective way to foreground research questions. On the contrary, it tends to ensure that people persist in organizing themselves socially around this question of technophilia / technophobia.

    Look at the reception of the article. Did anyone — other than you, me, and Andrew Goldstone — take it as a call for more rigorous engagement with quantitative methods? No, they took it as an argument that humanists who use computers (and/or a specific social group of them called "DHers") are bad and neoliberal and probably wear short-sleeved button-down shirts. If we want people to do this stuff better, we have to tell them to do it better, and actively make it hard for people to fall back on lazy pro- or anti-digital affiliations that short-circuit substantive thinking.

    [(myl) Yes! But "short-sleeved button-down shirts"? My grasp of sartorial semiotics is clearly out of date.]

  33. Bill Benzon said,

    July 21, 2016 @ 6:14 am

    Hey, Ted! You forgot those plastic pocket protectors that keep a phalanx of pens from ruining your shirt.

    More seriously, though as something of an aside, it seems to me that in its very construction the phrase digital humanities was destined to become a bright shiny object that attracted some and repelled others almost without regard for its extension in the world. There is a substantial anti-science, anti-technology line of thinking in the humanities that goes back at least to the Romantics. Digital humanities proclaims a species of humanities that is conceived on the side of science and technology. It is thus different in its effect from humanities computing, which subordinates computing to the humanities. Computing, yes, but computing in service to the humanities; we can live with that. But humanities that is born digital, is that even possible? Maybe it's a miracle that will save us; or maybe it's an abomination that's a sign of the coming End Times.

    Compare lines 35 and 36 of "Kubla Khan":

    It was a miracle of rare device,
    A sunny pleasure-dome with caves of ice!

    Miracles have a very different kind of causal structure from devices, even rare ones. Something that partakes of both is strange indeed. The digital humanities lab would hardly seem to be a sunny pleasure-dome with caves of ice, but who knows.
