technologies we don’t have an artistic language for (yet)

The super-cool Robin Sloan has a super-cool newsletter – only occasional, alas, but Robin has many irons in the fire these days. He even makes olive oil. But anyway, in the most recent edition of the newsletter, he makes in passing a fascinating point:

There’s something happening in fiction now, and to a degree in film and TV too: the time in which stories are set is scootching back, with writers fleeing to the safety of 1994 or 1987 or much earlier. Why? Because we didn’t have smart phones then. We didn’t have social media. The world didn’t have this shimmering overlay of internet which is, in a very practical way, hard to write about. Writers of novels and teleplays have well-developed tools for the depiction of drama in real space. Drama that plays out through our little pocket-sized screens is just as rich – but how do we show it? We’re now seeing film and TV figure this out in real-time. Novels have been (oddly?) less successful. Because digital action relies on so many Brands™, it feels risky and/or distasteful to send your narrative too deep into that realm. Who wants to be the person who called it wrong and wrote the Great MySpace Novel? (Actually, the Great MySpace Novel would be amazing. But see, that’s not now anymore! MySpace has stabilized into historical artifact. We can look at it; describe it; maybe even understand it. That’s not the case with the systems we’re using right now. We’re lost inside of them.)

Remember the first episode of Sherlock? It came out eight – yes, eight – years ago, and one of the most-discussed elements of the first episode was its use of texting. Sherlock texted and received texts all the time, and the content of those texts was regularly displayed on our TV screens. For a thoughtful take on how the series did this, see this video essay on “Visual Writing in Sherlock” – visual writing that is by no means confined to the display of texting. I believe there’s general agreement that the makers of the series not only got this right but also used it to great dramatic, and sometimes comic, effect.

I don’t want to take Robin’s point too far, but I’m taken by the suggestion that a particular technology only becomes available for artistic representation when artists and audience are not “lost inside of it.” In this context it might be worth noting that Sherlock’s representation of texting happened right after the first widespread availability of smartphones, and therefore right after people began regularly interacting with their phones in non-textual ways (especially through photos and video). Sherlock’s representation of visual writing is, then, what BlackBerry use looks like when you have an iPhone.

You know what else appeared in 2010? The Social Network – a movie about Facebook that showed up just when people were dismissing Facebook as uncool and turning instead to Twitter – and then to Instagram (which was also released in 2010, though it didn’t become huge right away).

One more artifact from that same year: Gary Shteyngart’s novel Super Sad True Love Story, much of which is told through emails.

So: what technologies are going to dominate the books and movies and TV shows of 2020?

Auden and Poggioli

This lovely remembrance by Sylvia Poggioli of her father, the literary scholar Renato Poggioli, features a letter to her father from W. H. Auden, and the handwritten poem he submitted for publication in the journal Professor Poggioli edited, Inventario. Sylvia Poggioli speaks of her discovery as “a true literary find,” but the letter might be better described as a biographical find, since the poem itself was never unknown: it’s duly recorded in Bloomfield and Mendelson’s W. H. Auden: A Bibliography, 1924-1969 (1972).

Also, it’s not quite right to say, as Poggioli does, that “Auden later included these verses in a much longer piece, perhaps one of the most powerful poems of the mid-twentieth century, The Age of Anxiety”: he had already written the stanzas as part of The Age of Anxiety and was simply excerpting them for Poggioli’s journal, something he did with several other chunks of that longest of his poems. Auden truthfully told Renato Poggioli that it was an “unpublished poem,” but it would be published, along with the rest of The Age of Anxiety, just a few months later.

The thought that first comes to my mind when looking at the above image is the devout wish that Auden had always taken so much care to make his handwriting legible. Alas for my eyes, which have spent so many hours poring over his notebooks, he did not.

the Multigraph Collective and new avenues of humanistic scholarship

Allison Miller tells The Story of the Multigraph Collective, an academic group project that eventuated in a book called Interacting with Print: Elements of Reading in the Era of Print Saturation. I very much want to read the book, but for those interested in the economics of labor in the academy and its effects on scholarship, this part of Miller’s account is especially interesting:  

Being edited by so many other scholars, according to Paul Keen (Carleton Univ.), was unnerving but also “weirdly liberating. It gave us all a license to put our authorial sensitivities on hold and put our faith in this larger brainstorming process.”
Indeed, [Andrew] Piper too describes the endeavor as a “leap of faith,” since no one knew how the final work would be received by tenure and promotion committees or by UK Research Excellence Framework evaluators. One Multigraph Collective member, says Piper, was told that since there were 22 collaborators, the member’s work on Interacting with Print would count as 1/22 of a book—by word count, not even the equivalent of a journal article.
In the thick of it all, however, the process was thrilling. Hierarchies of academic rank and disciplinary territoriality dissolved in a shared commitment to the work. “This project fundamentally changed my ideas about what humanities scholarship could look like and what it could achieve,” says Porter. 

The whole situation is a reminder of the absurdity of the current tenure system, with its crude quantitative pseudo-metrics for assessing “productivity” — but also of the power of tenure. Those of us who have it need to be engaged in projects like Interacting with Print — projects that reconfigure and extend the character of humanistic scholarship (sometimes by renewing older scholarly modes). I’m displeased with myself for not doing more along these lines. 

the garden and the stream

I just came across this fascinating 2015 talk by Mike Caulfield and want to call attention to a couple of elements of it. 
1) the garden/stream distinction: 

The Garden is the web as topology. The web as space. It’s the integrative web, the iterative web, the web as an arrangement and rearrangement of things to one another.
Things in the Garden don’t collapse to a single set of relations or canonical sequence, and that’s part of what we mean when we say “the web as topology” or the “web as space”. Every walk through the garden creates new paths, new meanings, and when we add things to the garden we add them in a way that allows many future, unpredicted relationships….
In the stream metaphor you don’t experience the Stream by walking around it and looking at it, or following it to its end. You jump in and let it flow past. You feel the force of it hit you as things float by.
It’s not that you are passive in the Stream. You can be active. But your actions in there — your blog posts, @ mentions, forum comments — exist in a context that is collapsed down to a simple timeline of events that together form a narrative.
In other words, the Stream replaces topology with serialization. Rather than imagine a timeless world of connection and multiple paths, the Stream presents us with a single, time ordered path with our experience (and only our experience) at the center.

2) The difference between the Memex and the World Wide Web: 

So most people say this is the original vision of the web. And certainly it was the inspiration of those pioneers of hypertext.
But in reality it doesn’t predict the web at all. Not at all. The web works very little like this. It’s weird, because in our minds the web still works like this, but it’s a fiction.
Let’s look at some of the attributes of the memex.
Your machine is a library not a publication device. You have copies of documents there that you control directly, that you can annotate, change, add links to, summarize, and this is because the memex is a tool to think with, not a tool to publish with.
And this is crucial to our talk here, because these abilities – to link, annotate, change, summarize, copy, and share — these are the verbs of gardening.
Each memex library contains your original materials and the materials of others. There’s no read-only version of the memex, because that would be silly. Anything you read you can link and annotate. Not reply to, mind you. Change. This will be important later.
Links are associative. This is a huge deal. Links are there not only as a quick way to get to source material. They aren’t a way to say, hey here’s the interesting thing of the day. They remind you of the questions you need to ask, of the connections that aren’t immediately evident.
Links are made by readers as well as writers. A stunning thing that we forget, but the link here is not part of the author’s intent, but of the reader’s analysis. The majority of links in the memex are made by readers, not writers. On the world wide web of course, only an author gets to determine links. And links inside the document say that there can only be one set of associations for the document, at least going forward.

“A tool to think with, not a tool to publish with” — this seems to me essential. I feel that I spend a lot of time trying to think with tools meant for publishing. 

is user interface design getting worse?

Hey everybody, sorry for the radio silence — I’ve been traveling, and will be traveling again soon, so I can’t promise regular posting for a while. But I’m hoping to get a few thoughts up here, starting with this: 
I’ve read two recent posts about computer interface design that really have me thinking. The first is this reflection by Riccardo Mori about using a first-generation iPad. Mori discovers that that original Apple tablet, despite its significant limitations in processing power in comparison to today’s machines, still works remarkably well. But he also, and this is the really interesting part, decides that some of the design choices made eight years ago (the first iPad came out in 2010) are actually superior to the ones being made today. This is true to a minor degree even with regard to the hardware — Mori finds the iPad 1 more pleasurable to hold than some later models, despite its greater weight and thickness — but he thinks that the design language of iOS 5, the last version of iOS that the original iPad can use, is in certain respects simply superior to the new language introduced in iOS 7 and largely persisting, though with some modifications, today.

when it comes to visuals it’s ultimately a matter of personal taste, but one thing iOS’s user interface possessed before iOS 7’s flattening treatment was consistence and more robust, coherent, stricter interface guidelines. Guidelines that were followed by third-party developers more closely, and the result was that under iOS 6 and earlier versions, third-party apps presented a user interface that was cleaner, more predictable, easier to navigate than what came afterwards, update after update. After iOS’s UI got flatter, when it came to designing apps, things got out of hand, in an ‘anything goes’ fashion.
There are apps today with poor discoverability, ambiguous controls, UI elements whose state or function isn’t immediately clear — i.e. you cannot tell whether they’re tappable or not simply by looking at them; whereas before iOS 7, a button looked like a button right away, and you didn’t have to explore an app by tapping blindly here and there. Spotify is the first example coming to mind: its early iOS and Mac clients were more usable and had a better interface.

Concluding this section of his posts, Mori writes:

During my trip down Interface Memory Lane these days with the iPad 1, I’ve stumbled on many other cases, and the result was always more or less the same: I found the old version of an app to have a more usable interface and a clearer interface language than its current counterpart. Despite all the pre-iOS 7 skeuomorphism, for many app interfaces of that time design was truly ‘how it works’. Today, more and more often (and it’s not only with iOS) I see examples where design is simply ‘how it looks’; attractive apps, but with ambiguous interface controls, poorly-designed UI architecture, and sometimes even with little to no accessibility, disregarding users with disabilities.

The second post is by Mark Wilson, who found himself using version 7 of the original Macintosh OS, issued way back in 1991 — and loving it. “Using an old Mac is pure zen.” Now, Wilson doesn’t suggest that that old interface could simply be implemented today; we ask too much of our computing devices today, and too many kinds of “much.”

But I do believe that the old Mac makes for a timely reminder that the digital age hasn’t always felt so frantic, or urgent, or overwhelming. And maybe, even if an old Mac interface isn’t the solution, we can view it as a subtle north star for its sensibilities, and how much it was able to accomplish with so little.

Few interface designers are indifferent to the needs of the user, but I can’t imagine that there are many for whom that is the first consideration. One way designers keep their jobs is by producing new designs, and in a corporate setting (like that of Apple) novelty helps get customers to update their hardware and software alike. And what kind of designer wouldn’t want the challenge of making the best use of increased processing power or display resolution?
So I don’t expect the desires and needs of users to be at the top of any designer’s priority list. But Lordy, how I wish they were a little higher than they are. Then perhaps the best elements of the work of earlier designers, people working under far greater constraints, could be recovered and redeployed. Because, as I never tire of saying, creativity arises from constraint and resistance. And it’s not clear to me that, from the user’s perspective, UI design for computing devices hasn’t been getting worse and worse for the past few years — with Apple leading the way in this sad category. 

reasons for decline

Alex Reid

From a national perspective, the number of people earning communications degrees (which was negligible in the heyday of English majors 50-60 years ago), surpassed the number getting English degrees around 20 years ago. Since then Communications has held a fairly steady share of graduates as the college population grew, while English has lost its share and in recent years even shrank in total number, as this NCES table records. In short, students voted with their feet and, for the most part, they aren’t interested in the curricular experience English has to offer (i.e. read books, talk about books, write essays about books). 

Scott Alexander

Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.
And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.” 

So maybe — just maybe — it’s not “read books, talk about books, write essays about books” that’s the problem. 

“not to waver with the wavering hours”

I’ve just been teaching Horace’s Epistles, and it strikes me that Horace ought to be the man of our social-media moment — the man who shows us another and better way.

In the first of those Epistles, Horace writes to his patron Maecenas — the one who bought him his Sabine farm that allows him to escape the noise and frenetic activity of Rome — to describe what he’s up to:

… my ambition to advance myself
In the sort of project that, if carried out
Successfully, is good for anyone,
Whether rich or poor, and its failure is bound to be
Harmful to anyone, whether he’s young or old. 

This “project” is, he says, to “devote myself entirely to the study / Of what is genuine and right for me, / Storing up what I learn for the sake of the future.” (I am quoting from David Ferry’s wonderful translation.) He needs to be on his farm to pursue this project, because life in the city, with its constant stimulation, creates too much agitation. And as he writes to another friend, Julius Florus (I.3), “if you’re able to learn to do without / Anxiety’s chilling effect, you’ll be able to follow / The lead of wisdom up to the highest reaches.”

Later (I.18) he exhorts Lollius Maximus to “interrogate the writings of the wise,”

Asking them to tell you how you can
Get through your life in a peaceable tranquil way.
Will it be greed, that always feels poverty-stricken,
That harasses and torments you all your days?
Will it be hope and fear about trivial things,
In anxious alternation in your mind?
Where is it virtue comes from, is it from books?
Or is it a gift from Nature that can’t be learned?
What is the way to become a friend to yourself?
What brings tranquility? What makes you care less?
Honor? Or money? Or living your life unnoticed?
Whenever I drink from the cold refreshing waters
Of the little brook Digentia, down below
Our local hill town, what do you think I pray for?
“May I continue to have what I have right now,
Or even less, as long as I’m self-sufficient.
If the gods should grant me life, though just for a while,
May I live my life to myself, with books to read,
And food to sustain me for another year,
And not to waver with the wavering hours.” 

The “wavering hours” waver because they’re charged with the nervous energy that comes from a too-busy life, a life of agitation and anxiety. As a youth Horace studied philosophy in Athens, and there he would have learned about the inestimable value of ataraxia — a peaceable and tranquil spirit. Because if you don’t have that, then you become a victim of your circumstances — and, especially in our time, a victim of propaganda.

Reading old books is a very valuable thing, because it takes you out of the maelstrom of “current events”; and it’s especially valuable to read old books like those by Horace because they will tell you quite directly how vital it is for you to learn this lesson.

propaganda and social media

Reading Ellul on the massive and pervasive consequences of propaganda in the twentieth century, I found myself over and over again thinking: This is how social media work on us. For instance, that passage I quoted in my earlier post — “to the same extent that he lives on the surface of events and makes today’s events his life by obliterating yesterday’s news, he refuses to see the contradictions in his own life and condemns himself to a life of successive moments, discontinuous and fragmented” — seems even more true as a description of the person constantly on Twitter and Facebook. Many other passages gave me the same feeling:

Man, eager for self-justification, throws himself in the direction of a propaganda that justifies him and this eliminates one of the sources of his anxiety. Propaganda dissolves contradictions and restores to man a unitary world in which the demands are in accord with the facts…. For all these reasons contemporary man needs propaganda; he asks for it; in fact, he almost instigates it. (159, 160) 

Or this:

Propaganda is concerned with the most pressing and at the same time the most elementary actuality. It proposes immediate action of the most ordinary kind. It thus plunges the individual into the most immediate present, taking from him all mastery of his life and all sense of the duration or continuity of any action or thought. Thus the propagandee becomes a man without a past and without a future, a man who receives from propaganda his portion of thought and action for the day; his discontinuous personality must be given continuity from the outside, and thus makes the need for propaganda very strong. (187) 

Thus the very common type of Twitter user who expresses himself or herself almost completely in hashtags: pre-established units of affiliation and exclusion.

And yet — Russian bots and political operatives (who have turned themselves into bots) aside — social media lack the planned purposefulness intrinsic to propaganda. So they must be a different kind of thing, yes?

Yes and no. I think what social media produce is emergent propaganda — propaganda that is not directed in any specific and conscious sense by anyone but rather emerges, arises, from vast masses of people who have been catechized within and by the same power-knowledge regime. Think also about the idea I got from an Adam Roberts novel: the hivemind singularity. Conscious, intentional propaganda is so twentieth century. The principalities and powers are far more sophisticated now. I’ll be thinking more about this.

“a relativist blizzard of alternative theories”

Tim Adams on the media in Putin’s Russia:

In this culture war, disinformation was critical. Russian TV and social media would create a climate in which news became entertainment, and nothing would quite seem factual. This surreal shift is well documented, but Snyder’s forensic examination of, for example, the news cycle that followed the shooting down of flight MH17 makes essential reading. On the first day official propaganda suggested that the Russian missile attack on the Malaysian plane had in fact been a bodged attempt by Ukrainian forces to assassinate Putin himself; by day two, Russian TV was promoting the idea that the CIA had sent a ghost plane filled with corpses overhead to provoke Russian forces.

The more outrageous the official lie was, the more it allowed people to demonstrate their faith in the Kremlin. Putin made, Snyder argues, his direct assault on “western” factuality a source of national pride. Snyder calls this policy “implausible deniability”; you hear it in the tone of the current “debate” around the Salisbury attack: Russian power is displayed in a relativist blizzard of alternative theories, delivered in a vaguely absurdist spirit, as if no truth on earth is really provable.

Social-media propaganda directed at Americans works the same way: in contrast to earlier forms of propaganda, which sought to arouse people to action by alerting them to new and previously unseen truths, this kind of propaganda is meant to be soporific: it seeks to make people indifferent to what’s true, incurious, and accepting of whatever addresses the emotions to which they are most fully enslaved.

Long ago William Golding wrote a witty little essay called “Thinking as a Hobby” in which he identifies three levels of thought. Grade-three thinking, “more properly, is feeling, rather than thought”; it is “full of unconscious prejudice, ignorance, and hypocrisy.” Grade-two thinking — which Golding came to practice as an adolescent — “is the detection of contradictions…. Grade-two thinkers do not stampede easily, though often they fall into the other fault and lag behind. Grade-two thinking is a withdrawal, with eyes and ears open.” Grade-two thinking is shouting “FAKE NEWS” and asking people whether they always believe what they’re told by the lamestream media, or pulling out your ink pad and rubber stamp and stamping BIGOT or RACIST on people who don’t line up with you 100%. I would say that such behavior is not “lagging behind” so much as digging in your heels and refusing to move — which herds of animals do far more frequently than they stampede.

When grade-two thinking is challenged its perpetrator will typically fall back to grade-three, as David French discovered:

The desire to think the best of Mr. Trump combined with the deep distaste for Democrats grants extraordinary power to two phrases: “fake news” and “the other side is worse.”

I’m reminded of an encounter at my church. People know that I opposed both Mr. Trump and Mrs. Clinton. They often ask what I think of the president’s performance. My standard response: I like some things, I dislike others, but I really wish he showed better character. I don’t want him to lie. I said this to a sweet older lady not long ago, and she responded — in all sincerity — “You mean Trump lies?” “Yes,” I replied. “All the time.” She didn’t answer with a defense. She didn’t say “fake news.” We’d known each other for years, and she trusted my words.

For a moment, she seemed troubled. I wanted to talk more — to say that we can appreciate and applaud the good things he does, but we can’t ignore his flaws, we can’t defend his sins, and we can’t let him define the future of the Republican Party. But just then, her jaw set. I saw a flare of defiance in her eyes. She took a sip of coffee, looked straight at me, and I knew exactly what was coming next: “Well, the Democrats are worse.”

Jacques Ellul argued half-a-century ago that the purpose of propaganda is to “provide immediate incentives to action.” But propaganda that encourages us to dig in our heels, or just drift with the social-media current, is propaganda all the same. What remains absolutely essential from Ellul’s book is his understanding that the person “embroiled in the conflicts of his time” (49) is most vulnerable to propaganda — and he could not have imagined a society so locked into the current instant as we denizens of Social Media World are. I’m going to close this post with a long quotation from Ellul that was incisive in relation to his own time but is devastatingly accurate about ours. I’ve put some especially important passages in bold; and I’d like you to notice how Ellul anticipates Debord’s Society of the Spectacle. Here goes:

To the extent that propaganda is based on current news, it cannot permit time for thought or reflection. A man caught up in the news must remain on the surface of the event; he is carried along in the current, and can at no time take a respite to judge and appreciate; he can never stop to reflect. There is never any awareness — of himself, of his condition, of his society — for the man who lives by current events. Such a man never stops to investigate any one point, any more than he will tie together a series of news events. We already have mentioned man’s inability to consider several facts or events simultaneously and to make a synthesis of them in order to face or to oppose them. One thought drives away another; old facts are chased by new ones. Under these conditions there can be no thought. And, in fact, modern man does not think about current problems; he feels them. He reacts, but he does not understand them any more than he takes responsibility for them. He is even less capable of spotting any inconsistency between successive facts; man’s capacity to forget is unlimited. This is one of the most important and useful points for the propagandist, who can always be sure that a particular propaganda theme, statement, or event will be forgotten within a few weeks. Moreover, there is a spontaneous defensive reaction in the individual against an excess of information and — to the extent that he clings (unconsciously) to the unity of his own person — against inconsistencies. The best defense here is to forget the preceding event. In so doing, man denies his own continuity; to the same extent that he lives on the surface of events and makes today’s events his life by obliterating yesterday’s news, he refuses to see the contradictions in his own life and condemns himself to a life of successive moments, discontinuous and fragmented.

This situation makes the “current-events man” a ready target for propaganda. Indeed, such a man is highly sensitive to the influence of present-day currents; lacking landmarks, he follows all currents. He is unstable because he runs after what happened today; he relates to the event, and therefore cannot resist any impulse coming from that event. Because he is immersed in current affairs, this man has a psychological weakness that puts him at the mercy of the propagandist. No confrontation ever occurs between the event and the truth; no relationship ever exists between the event and the person. Real information never concerns such a person. What could be more striking, more distressing, more decisive than the splitting of the atom, apart from the bomb itself? And yet this great development is kept in the background, behind the fleeting and spectacular result of some catastrophe or sports event because that is the superficial news the average man wants. Propaganda addresses itself to that man; like him, it can relate only to the most superficial aspect of a spectacular event, which alone can interest man and lead him to make a certain decision or adopt a certain attitude. (46-47)

Maybe I should blog a read-through of Propaganda.

Edmund Wilson on Marxism

I have just re-read, for the first time in decades, Edmund Wilson’s To the Finland Station — which, it appears, NYRB Classics has allowed to go out of print: nearly a tragedy. It is a truly remarkable book — it is difficult to imagine anyone of our own time (least of all a journalist) handling ideas with such assurance and such verve, seeing in them the kind of drama that we typically associate with action heroes. The structure, the pacing, the style — all are superb. Perhaps the best thing about the book is how it centers itself on Karl Marx himself, bookended by predecessors (Proudhon, Robert Owen) and successors (Lenin, Trotsky). As a portrait of Marx it has not, to my knowledge, been equalled.

Wilson’s Freudianism, though essentially wrong, is actually quite helpful to him in understanding the Marxists, because, as he rightly points out, the great deficiency of most Marxist analyses of society is their oversimplified picture of human motivation. There’s even a passage where Wilson seems to be anticipating the rise of modern behavioral psychology and especially the role it plays in understanding economic behavior. “Prices are the results of situations much more complex than any of these formulas, and complicated by psychological factors which economists seldom take into account.… Let us note the crudity of the psychological motivation which underlies the worldview of Marx. It is the shortcoming of economists in general that each one understands as a rule only one or two human motivations; psychology and economics have never yet got together in such a way as really to supplement one another” (294, 295).

On the psychology of Marx himself Wilson is especially acute. After tracing Marx’s lifelong near-poverty, and his struggles to provide for his family, and his embarrassment when one of his daughters had to hire herself out as a governess, and his constant dependence on his friend Engels to keep the Marxes out of the poor house — Engels, who worked as a manager in a factory owned by his arch-capitalist father — Wilson writes:

Such is the trauma of which the anguish and the defiance reverberate through Das Kapital. To point it out is not to detract from the authority of Marx’s work. On the contrary, in history as in other fields of writing, the importance of a book depends, not merely on the breadth of the view and the amount of information that has gone into it, but on the depths from which it has been drawn. The great crucial books of human thought – outside what are called the exact sciences, and perhaps something of the sort is true even here – always render articulate the results of fundamental new experiences to which human beings have had to adjust themselves. Das Kapital is such a book. Marx has found in his personal experience the key to the larger experience of society, and identifies himself with that society. His trauma reflects itself in Das Kapital as the trauma of mankind under industrialism; and only so sore and angry a spirit, so ill at ease in the world, could have recognized and seen into the causes of the wholesale mutilation of humanity, the grand collisions, the uncomprehended convulsions, to which that age of great profits was doomed. (311-312) 

That is an extraordinarily rich and provocative reflection.

One final point, only tangential to Wilson’s narrative: he is also very good on the ways in which a conviction that one is on “the right side of history” compromises one’s ethics:

History, then, is a being with a definite point of view in any given period. It has a morality which admits of no appeal and which decrees that the exterminators of the Commune shall be regarded as wrong forever. Knowing best – knowing, that is, that we are right – we may allow ourselves to exaggerate and simplify. At such a moment the Marxism of Marx himself — and how much more often and more widely in the case of his less scrupulous disciples — departs from the rigorous method proposed by “scientific socialism.” (283)

Yep. I see it every day.