digital culture through file types

This is a fabulous idea by Mark Sample: studying digital culture through file types. He mentions MP3, GIF, HTML, and JSON, but of course there are many others worthy of attention. Let me mention just two:

XML: XML is remarkably pervasive, providing the underlying document structure for everything from RSS and Atom feeds to the file formats of office productivity suites like Microsoft Office and iWork — but secretly so. That is, you could make daily and expert use of a hundred different applications without ever knowing that XML is at work under the hood.
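To see how thin the disguise is, here's a small sketch (the feed below is invented for illustration, not one of Sample's examples): an Atom feed is nothing but XML, and a dozen lines of Python's standard library suffice to pull its structure out.

```python
import xml.etree.ElementTree as ET

# A made-up Atom feed: plain XML of the kind feed readers consume every day.
feed = """\
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Blog</title>
  <entry><title>digital culture through file types</title></entry>
  <entry><title>on documentation</title></entry>
</feed>"""

root = ET.fromstring(feed)
ns = {"atom": "http://www.w3.org/2005/Atom"}
# The "underlying document structure" is just nested, namespaced elements.
titles = [e.find("atom:title", ns).text for e in root.findall("atom:entry", ns)]
print(titles)  # ['digital culture through file types', 'on documentation']
```

The same trick works on a .docx or .pages file, which is essentially a zip archive of XML documents — the user never sees any of it.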

Text: There’s a great story to be told about how plain text files went from being the most basic and boring of all file types to a kind of lifestyle choice — a lifestyle choice I myself have made.

If you have other suggestions, please share them here or with Mark.

Critiquing the Critique of Digital Humanities

Disclosure: IANADH (I am not a digital humanist), but I did get my PhD from the University of Virginia.

There’s a good deal of buzzing in the DH world about this critique of the field by Daniel Allington, Sarah Brouillette, David Golumbia — hereafter ABG — whose argument is that DH’s “most significant contribution to academic politics may lie in its (perhaps unintentional) facilitation of the neoliberal takeover of the university.” Let me do a little buzzing of my own, in three bursts.

Burst the First: In the early stages of the essay, ABG claim that the essential problem with DH, the problem that makes it either vulnerable to co-optation by the neoliberal regime or eagerly complicit in it, is its refusal to see interpretation as the essential activity of literary study. This refusal of interpretation, in ABG’s view, is the key enabler of creeping university-based neoliberalism, in literary studies anyway.

And here’s where the argument takes an odd turn. “It is telling that Digital Humanities … has found an institutional home at the University of Virginia.” In ABG’s account, UVA is the academic version of the headquarters of Hydra, an analogy I wish I had not thought of, because now I’m casting the roles: Jerome McGann as Red Skull — that one’s obvious — Bethany Nowviskie as Viper, … but I digress.

Anyway, ABG say that the strong digital-humanities presence at UVA makes sense because of a long institutional history of refusing the centrality of interpretation, starting with Fredson Bowers, the textual scholar who fifty years ago began building UVA’s English department into a world-class one — textual criticism being one of those modes of humanistic scholarship that de-emphasizes interpretation. (Or outright rejects it, if you’re, say, A. E. Housman.) ABG then add to Bowers another problematic figure, E. D. Hirsch — but wait, didn’t Hirsch make his name by writing about hermeneutics, in Validity in Interpretation and The Aims of Interpretation? Yes, say ABG, but Hirsch had a “tightly constrained” model of interpretation, so he doesn’t count. Similarly, though the work of Rita Felski would seem to be essentially concerned with interpretation, she has suggested that there are limits to a posture of critique, so she too goes into the anti-interpretation camp.

It would appear, then, that for ABG, narrow indeed is the path that leads to hermeneutical salvation, and wide is the way that leads to destruction. But an approach that credits scholars with an interest in interpretation only if they follow an extremely strict — and yet unspecified — model of that practice is just silly. ABG really need to go back to the drawing board and clarify their conceptual framework.

While they’re at it, they might also ask how UVA ended up hiring people like Rita Dove and Richard Rorty and Jahan Ramazani, and allowing New Literary History — perhaps the most prominent journal of literary interpretation and critique of the past fifty years — to be founded and housed there. Quite an oversight by the supervillains at Hydra.

Burst the Second: ABG write,

While the reading lists and position statements with which the events were launched make formal nods toward the importance of historical, sociological, and philosophical approaches to science and technology, the outcome was the establishment, essentially by fiat, of Digital Humanities as an academic and not a support field, with the accompanying assertion that technical and managerial expertise simply was humanist knowledge.

This notion, they say, “runs counter to the culture not only of English departments but also of Computer Science departments,” is tantamount to “the idea that technical support is the cutting edge of the humanities,” and “carried to its logical conclusion, such a declaration would entail that the workers in IT departments,” including IT departments of Big Corporations, “are engaged in humanities scholarship.”

Quelle horreur! That’s about as overt an act of boundary-policing as I have seen in quite some time. Get back in “technical support” where you people belong! And stop telling me to reboot my computer! I’ll just offer one comment followed by a question. There is a long history, and will be a long future, of major scientific research being done at large corporations: Claude Shannon worked for Bell Labs, to take but one crucial example, and every major American university has been deeply entangled with the military-industrial complex at least since World War II. Think for instance of John von Neumann, who spent several years traveling back and forth between Princeton’s Institute for Advanced Study and the Atomic Energy Commission. Do ABG really mean to suggest that in all this long entanglement of universities with big business and government the humanities managed to maintain their virginity until DH came along?

And lest you suspect that these entanglements are the product of 20th-century America, please read Chad Wellmon on Big Humanities in 19th-century Germany. The kindest thing one could say about the notion that the “neoliberal takeover of the university” is just happening now — and that it’s being spearheaded by people in the humanities! — is to call it historically uninformed.

Burst the Third: The core of ABG’s argument: “Digital Humanities as social and institutional movement is a reactionary force in literary studies, pushing the discipline toward post-interpretative, non-suspicious, technocratic, conservative, managerial, lab-based practice.” This argument is based on a series of, to put it charitably, logically loose associations: many vast multinational corporations rely on digital technologies and “lab-based practice,” and DH does too, ergo…. But one could employ a very similar argument to say that many vast multinational corporations rely on scholars trained in the interpretation of texts — legal texts, primarily — and therefore it is “reactionary” to continue to produce expertise in these very practices. (Think of how many English majors trained in the intricacies of postcolonial critique have ended up in corporate law.) ABG have basically produced a guilt-by-association argument, but one which works against the scholarly models they prefer at least as well as it works against DH.

In fact: much better than it works against DH. What do we have more of in the humanities today: digital humanists, or people who fancy themselves critics of the neoliberal social order but who rely all day every day on computer hardware and software made by Big Business (Apple, Microsoft, Google, Blackboard, etc.)? Clearly the latter dramatically outnumber the former. There are many ways one might defend DH, but one of my favorite elements of the movement is its DIY character: people trained in the basic disciplines of DH learn how to get beyond the defaults imposed by the big technology companies and make our computing machines serve our purposes rather than theirs.

I am not sure to what extent I want to see neoliberalism vanquished, because I am not sure what neoliberalism is. But if there is any idea that has been conclusively refuted by experience, it is that the reactionary forces of late capitalism can be defeated by humanistic critique and “radical” interpretative strategies. If I shared ABG’s politics, I think I’d want to seek collaboration with DH rather than sneer at it.

Carr on Piper on Jacobs

Here’s Nick Carr commenting on the recent dialogue at the Infernal Machine between me and Andrew Piper:

It’s possible to sketch out an alternative history of the net in which thoughtful reading and commentary play a bigger role. In its original form, the blog, or web log, was more a reader’s medium than a writer’s medium. And one can, without too much work, find deeply considered comment threads spinning out from online writings. But the blog turned into a writer’s medium, and readerly comments remain the exception, as both Jacobs and Piper agree. One of the dreams for the web, expressed through a computer metaphor, was that it would be a “read-write” medium rather than a “read-only” medium. In reality, the web is more of a write-only medium, with the desire for self-expression largely subsuming the act of reading. So I’m doubtful about Jacobs’s suggestion that the potential of our new textual technologies is being frustrated by our cultural tendencies. The technologies and the culture seem of a piece. We’re not resisting the tools; we’re using them as they were designed to be used.

I’d say that depends on the tools: for instance, this semester I’m having my students write with CommentPress, which I think does a really good job of preserving a read-write environment — maybe even better, in some ways, than material text, though without the powerful force of transcription that Andrew talks about. (That may be irreplaceable — typing the words of others, while in this respect better than copying and pasting them, doesn’t have the same degree of embodiment.)

In my theses I tried to acknowledge both halves of the equation: I talked about the need to choose tools wisely (26, 35), but I also said that without the cultivation of certain key attitudes and virtues (27, 29, 33) choosing the right tools won’t do us much good (36). I don’t think Nick and I — or for that matter Andrew and I — disagree very much on all this.

How Uninformed Critiques of Digital Humanities Are Taking Over Journalism!

This essay by Catherine Tumber is disappointingly empty, but also indicative of a certain and all-too-common mode of thought. It seems that Tumber has read almost nothing in the digital humanities except Adam Kirsch’s recent critique of that multifaceted movement, and — remarkably enough! — she agrees with Kirsch, “whom we can thank for reading these books so we don’t have to,” adding nothing of her own to his arguments, except the evidence of what appears to be half an hour of web browsing.

She assures us that in his treatment “Kirsch does not cherry pick; he plucks work by leading theorists in the field.” But one of the most common modes of intellectual cherry-picking is taking passages or ideas out of their context, and Tumber, who as we have just seen has not read the books in question, is scarcely in a position to judge whether Kirsch has done that or not. Some of the leading figures in DH — in a response that Tumber seems unaware of, though it was published in the same journal as Kirsch’s critique — make it clear that his treatment of their ideas grossly misrepresents them:

Third, the notion that so called “digital humanities” is characterized by an urge “to accelerate the work of thinking by delegating it to a computer” is patently nonsensical. Throughout Digital_Humanities we argue not “to throw off the traditional burden” but, on the contrary, for a critical and transformative engagement that is rooted in the very traditions of humanistic inquiry. If Kirsch did some close-reading of the book, he would find it to be a celebration not of the digital—as some starry-eyed salvific or materialist ideology—but of the vitality and necessity of the humanities.

Having read the book, I think their statement is quite accurate. But don’t take my word for it: read it yourself. You’ll be a big step ahead of Catherine Tumber.

Here’s what we could use more of in this debate: 

1. Reading a lot before critiquing, in the spirit of intellectual responsibility.

2. Remembering that many of the approaches to literary study we’re familiar with were themselves attacked as anti-humanistic just a couple of decades ago. 

Here’s what we could use less of in this debate: 

1. Critiquing without doing much reading. 

2. Presenting your lack of interest in a particular intellectual approach, or set of approaches, as a sign of virtue or humanistic integrity. It’s okay not to be interested in everything that everyone else is doing; we needn’t exalt our own preferences for something else.

3. Stupid clickbaity headlines. “Technology is Taking Over English Departments”? “Bulldozing the Humanities”? Give me a break. 

DH in the Anthropocene

This talk by Bethany Nowviskie is extraordinary. If you have any interest in where the digital humanities — or the humanities more generally — might be headed, I encourage you to read it. 

It’s a very wide-ranging talk that doesn’t articulate a straightforward argument, but that’s intentional, I believe. It’s meant to provoke thought, and does. Nowviskie’s talk originates, it seems to me, in the fact that so much work in the digital humanities revolves around problems of preservation. Can delicate objects in our analog world be properly digitized so as to be protected, at least in some senses, from further deterioration? Can born-digital texts and images and videos be transferred to other formats before we lose the ability to read and view them? So much DH language, therefore, necessarily concerns itself with concepts connecting to and deriving from the master-concept of time: preservation, deterioration, permanence, impermanence, evanescence. 

For Nowviskie, these practical considerations lead to more expansive reflections on how we — not just “we digital humanists” but “we human beings” — understand ourselves to be situated in time. And for her, here, time means geological time, universe-scale time. 

Now, I’m not sure how helpful it is to try to think at that scale. Maybe the Long Now isn’t really “now” at all for us, formed as we are to deal with shorter frames of experience. I think of Richard Wilbur’s great poem “Advice to a Prophet”:

Spare us all word of the weapons, their force and range,
The long numbers that rocket the mind;
Our slow, unreckoning hearts will be left behind,
Unable to fear what is too strange.
Nor shall you scare us with talk of the death of the race.
How should we dream of this place without us? —
The sun mere fire, the leaves untroubled about us,
A stone look on the stone’s face?

Maybe thinking in terms too vast means, for our limited minds, not thinking at all. 

But even as I respond in this somewhat skeptical way to Nowviskie’s framing of the situation, I do so with gratitude, since she has pressed this kind of serious reflection about the biggest questions upon her readers. It’s the kind of thing that the humanities at their best always have done. 

So: more, I hope, at another time on these themes. 

different strokes

Here’s a typically smart and provocative reflection by Andrew Piper. But I also have a question about it. Consider this passage: 

Wieseltier’s campaign is just the more robust clarion call of subtler and ongoing assumptions one comes across all the time, whether in the op-eds of major newspapers, blogs of cultural reviews, or the halls of academe. Nicolas Kristof’s charge that academic writing is irrelevant because it relies on quantification is one of the more high-profile cases. The recent reception of Franco Moretti’s National Book Critics Award for Distant Reading is another good case in point. What’s so valuable about Moretti’s work on quantifying literary history, according to the New Yorker’s books blog, is that we can ignore it. “I feel grateful for Moretti,” writes Joshua Rothman. “As readers, we now find ourselves benefitting from a division of critical labor. We can continue to read the old-fashioned way. Moretti, from afar, will tell us what he learns.”
We can continue doing things the way we’ve always done them. We don’t have to change. The saddest part about this line of thought is this is not just the voice of journalism. You hear this thing inside academia all the time. It (meaning the computer or sometimes just numbers) can’t tell you what I already know. Indeed, the “we already knew that” meme is one of the most powerful ways of dismissing any attempt at trying to bring together quantitative and qualitative approaches to thinking about the history of ideas.
As an inevitable backlash to its seeming ubiquity in everyday life, quantification today is tarnished with a host of evils. It is seen as a source of intellectual isolation (when academics use numbers they are alienating themselves from the public); a moral danger (when academics use numbers to understand things that shouldn’t be quantified they threaten to undo what matters most); and finally, quantification is just irrelevant. We already know all there is to know about culture, so don’t even bother.

Regarding that last sentence: the idea that “we already know all there is to know about culture, so don’t even bother” is a pathetic one — but that’s not what Rothman says. Rather, he writes of a “division of labor,” in which it’s perfectly fine for Moretti to do what he does, but it’s also perfectly fine for Rothman to do what he does. What I hear Rothman saying is not “we know all there is to know” but rather something like “I prefer to keep reading in more traditional and familiar ways and I hope the current excitement over people like Moretti won’t prevent me from doing that.” 

In fact, Rothman, as opposed to the thoroughly contemptuous Wieseltier, has many words of commendation for Moretti. For instance: 

The grandeur of this expanded scale gives Moretti’s work aesthetic power. (It plays a larger role in his appeal, I suspect, than most Morettians would like to admit.) And Moretti’s approach has a certain moral force, too. One of the pleasures of “Distant Reading” is that it assembles many essays, published over a long period of time, into a kind of intellectual biography; this has the effect of emphasizing Moretti’s Marxist roots. Moretti’s impulses are inclusive and utopian. He wants critics to acknowledge all the books that they don’t study; he admires the collaborative practicality of scientific work. Viewed from Moretti’s statistical mountaintop, traditional literary criticism, with its idiosyncratic, personal focus on individual works, can seem self-indulgent, even frivolous. What’s the point, his graphs seem to ask, of continuing to interpret individual books—especially books that have already been interpreted over and over? Interpreters, Moretti writes, “have already said what they had to.” Better to focus on “the laws of literary history”—on explanation, rather than interpretation.
All this sounds austere and self-serious. It isn’t. “Distant Reading” is a pleasure to read. Moretti is a witty and welcoming writer, and, if his ideas sometimes feel rough, they’re rarely smooth from overuse. I have my objections, of course. I’m skeptical, for example, about the idea that there are “laws of literary history”; for all his techno-futurism, Moretti can seem old-fashioned in his eagerness to uncover hidden patterns and structures within culture. But Moretti is no upstart. He is patient, experienced, and open-minded. It’s obvious that he intends to keep gathering data, and, where it’s possible, to replace his speculations with answers. In some ways, the book’s receiving an award reflects the role that Moretti has played in securing a permanent seat at the table for a new critical paradigm—something that happens only rarely.

This all seems eminently fair-minded to me, even generous. But what Moretti does is not Rothman’s thing. And isn’t that okay? Indeed, hasn’t that been the case for a long time in literary study: that we acknowledge the value in what other scholars with different theoretical orientations do, without choosing to imitate them ourselves? It mystifies me that Piper sees this as a Wieseltier-level dismissal. 

modernism, revision, literary scholarship

Hannah Sullivan’s outstanding book The Work of Revision came out last year and got less attention than it deserved — though here’s a nice article from the Boston Globe. My review of the book has just appeared in Books and Culture, but it’s behind a paywall — and why, you may ask? Because B&C needs to make ends meet, that’s why, and if you haven’t subscribed you ought to, posthaste.

Anyway, here’s the link and I’m going to quote my opening paragraphs here, because they relate to themes often explored on this blog. But do find a way to read Sullivan’s book.

Once upon a time, so the village elders tell us, there reigned a gentle though rather dull king called Literary Criticism, who always wore tweed and spoke in a low voice. But then, on either a very dark or very brilliant day, depending on who’s telling the story, this unassuming monarch was toppled by a brash outsider named Theory, who dressed all in black, wore stylish spectacles, and spoke with a French accent. For a time it seemed that Theory would rule forever. But no king rules forever.

One can be neither definitive nor uncontroversial about such matters, given the chaotic condition of the palace records, but if I were in the mood to be sweeping, I would suggest that the Reign of Theory in Anglo-American literary study extended from approximately 1960 (Michel Foucault’s Madness and Civilization) to approximately 1997 (Judith Butler’s Excitable Speech: A Politics of the Performative). Its period of absolute dominance was rather shorter, from 1976 (Gayatri Spivak’s English translation of Jacques Derrida’s Of Grammatology) to 1988 (Stephen Greenblatt’s Shakespearean Negotiations: The Circulation of Social Energy in Renaissance England). Those were heady days.

The ascendance of Theory brought about the occlusion of a set of humanistic disciplines that had for a long time been central to literary study, especially the various forms of textual scholarship, from textual editing proper to analytical bibliography. To take but one institutional example: at one time the English department of the University of Virginia, under the leadership of the great textual scholar Fredson Bowers, had been dominant in these fields, but Bowers retired in 1975, and by the time I arrived at UVA as a new graduate student in 1980, almost no one on the faculty was doing textual scholarship, and I knew no students who were interested in it. This situation would begin to be rectified in 1986 with the hiring of Jerome McGann, who renewed departmental interest in these fields and played a role in bringing Terry Belanger’s Rare Book School from Columbia to Virginia (in 1992). Now Virginia is once more seen as a major player in textual scholarship, bibliography, the history of the book, and what was once called “humanities computing” — a field in which McGann was a pioneer — but is now more likely to be called “digital humanities.”

Theory is still around; but its skeptical, endlessly ramifying speculations can now seem little more than airy fabrications in comparison to the scrupulous study of material texts and the very different kind of scrupulosity required to write computer programs that data-mine texts. The European theorist in black has had to give way to new icons of (scholarly) cool. Literary textual scholarship is back: more epistemologically careful, aware of the lessons of theory, but intimately connected to traditions of humanistic learning that go back at least to Erasmus of Rotterdam in the 16th century — and maybe even Eusebius of Caesarea in the 4th.

my response to Adam Kirsch

In an essay that’s received a lot of critical response from my digital-humanist friends, Adam Kirsch writes,

The best thing that the humanities could do at this moment, then, is not to embrace the momentum of the digital, the tech tsunami, but to resist it and to critique it. This is not Luddism; it is intellectual responsibility. Is it actually true that reading online is an adequate substitute for reading on paper? If not, perhaps we should not be concentrating on digitizing our books but on preserving and circulating them more effectively. Are images able to do the work of a complex discourse? If not, and reasoning is irreducibly linguistic, then it would be a grave mistake to move writing away from the center of a humanities education.

I completely agree — at least, if I’m allowed to add a paragraph of my own.

The other best thing that the humanities could do at this moment, then, is not to embrace the reflexive distrust of the digital, but to resist it and to critique it. This is not technological triumphalism; it is intellectual responsibility. Is it actually true that reading on paper is intellectually superior to reading online? If not, perhaps we should be devoting as much attention to digitizing our books, and exploring them more imaginatively in their digital forms, as to the immensely valuable work of preserving and circulating our paper books, periodicals, and ephemera. Is reasoning irreducibly linguistic? (Moreover, is reasoning the only form of thinking? Also, are humanists concerned only with reasoning and thinking? Aesthetic experience is not, after all, fully and simply rational.) If not, and images are able to do the work of complex discourse — especially when they are created, as so often they are, in conjunction with words — then it would be a grave mistake not to complement our practices of reading and writing with an equally rigorous pursuit of visual modes of understanding and creation.

As Kirsch continues, “These are the kinds of questions that humanists ought to be well equipped to answer.” Damn right.

on documentation

This essay on scholarly documentation practices lays down some very useful principles — for some scholars working in some circumstances. Unfortunately, the author, Patrick Dunleavy, assumes a situation that doesn’t yet exist and may not for some time to come.

Dunleavy presents as normative, indeed nearly universal, a situation in which (a) scholarly publication is natively digital because we live in “the digital age” and (b) scholars are working with open-access or public-domain sources that are readily available online. When those two conditions hold, his recommendations are excellent. But they don’t always hold, and what he calls “legacy” documentation is in fact not a legacy condition for many of us, but rather necessary and normal.

For instance: Dunleavy says of page-number citations, “That is legacy referencing, designed solely to serve the interests of commercial publishers, and 90% irrelevant now to the scholarly enterprise.” I don’t yet have any data about my recent biography of the Book of Common Prayer — see, and use, the links on the right of this page, please — but for my previous book, The Pleasures of Reading in an Age of Distraction, codex sales have exceeded digital sales by a factor of 10. So my 90/10 split is the opposite of what Dunleavy asserts to be the case. It makes no sense for me to think of the overwhelming majority of my readers as inhabiting a “legacy” realm and to focus my attention on documenting for the other ten percent. Page numbers are still eminently relevant to me and my readers. Dunleavy claims that “pagination in the digital age makes no sense at all,” which may be true, if and when we get to “the digital age.”

Moreover, most of my scholarly work is on figures — currently W. H. Auden, C. S. Lewis, Simone Weil, and Jacques Maritain — whose work is still largely or wholly under copyright. So I have few open-access or public-domain options for citing them. And this, too, is a common situation for scholars.

Dunleavy is thus laying down supposedly universal principles that in fact apply only to some scholars in some disciplines. Which is why this tweet from Yoni Appelbaum is so apropos.

on the maker ethos

Reading this lovely and rather moving profile of Douglas Hofstadter I was especially taken by this passage on why artificial intelligence research has largely ignored Hofstadter’s innovative work and thought:

“The features that [these systems] are ultimately looking at are just shadows—they’re not even shadows—of what it is that they represent,” Ferrucci says. “We constantly underestimate—we did in the ’50s about AI, and we’re still doing it—what is really going on in the human brain.” 

The question that Hofstadter wants to ask Ferrucci, and everybody else in mainstream AI, is this: Then why don’t you come study it?

“I have mixed feelings about this,” Ferrucci told me when I put the question to him last year. “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something. And I don’t think the short path to that is theories of cognition.”

Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”

Here I think we see the limitations of what we might call the Maker Ethos in the STEM disciplines — the dominance of the T and the E over the S and the M — the preference, to put it in the starkest terms, for making over thinking.

An analogous development may be occurring in the digital humanities, as exemplified by Stephen Ramsay’s much-debated claim that “Personally, I think Digital Humanities is about building things. […] If you are not making anything, you are not…a digital humanist.” Now, I think Stephen Ramsay is a great model for digital humanities, someone who has powerfully articulated a vision of “building as a way of knowing” and who has worked hard to nuance and complicate that statement — but that frame of mind, when employed by someone less intelligent and generous than Ramsay, could be a recipe for a troubling anti-intellectualism, of the kind that has led to the complete marginalization of a thinker as lively and provocative and imaginative as Hofstadter.

All this to say: making is great. But so is thinking. And thinking is often both more difficult and, in the long run, more rewarding, for the thinker and for the rest of us.