technologies we don’t have an artistic language for (yet)

The super-cool Robin Sloan has a super-cool newsletter – only occasional, alas, but Robin has many irons in the fire these days. He even makes olive oil. But anyway, in the most recent edition of the newsletter, he makes in passing a fascinating point:

There’s something happening in fiction now, and to a degree in film and TV too: the time in which stories are set is scootching back, with writers fleeing to the safety of 1994 or 1987 or much earlier. Why? Because we didn’t have smart phones then. We didn’t have social media. The world didn’t have this shimmering overlay of internet which is, in a very practical way, hard to write about. Writers of novels and teleplays have well-developed tools for the depiction of drama in real space. Drama that plays out through our little pocket-sized screens is just as rich – but how do we show it? We’re now seeing film and TV figure this out in real-time. Novels have been (oddly?) less successful. Because digital action relies on so many Brands™, it feels risky and/or distasteful to send your narrative too deep into that realm. Who wants to be the person who called it wrong and wrote the Great MySpace Novel? (Actually, the Great MySpace Novel would be amazing. But see, that’s not now anymore! MySpace has stabilized into historical artifact. We can look at it; describe it; maybe even understand it. That’s not the case with the systems we’re using right now. We’re lost inside of them.)

Remember the first episode of Sherlock? It came out eight – yes, eight – years ago, and one of its most-discussed elements was its use of texting. Sherlock texted and received texts all the time, and the content of those texts was regularly displayed on our TV screens. For a thoughtful take on how the series did this, see this video essay on “Visual Writing in Sherlock” – visual writing that is by no means confined to the display of texting. I believe there’s general agreement that the makers of the series not only got this right but also used it to great dramatic, and sometimes comic, effect.

I don’t want to take Robin’s point too far, but I’m taken by the suggestion that a particular technology only becomes available for artistic representation when artists and audience are not “lost inside of it.” In this context it might be worth noting that Sherlock’s representation of texting happened right after the first widespread availability of smartphones, and therefore right after people began regularly interacting with their phones in non-textual ways (especially through photos and video). Sherlock’s representation of visual writing is, then, what BlackBerry use looks like when you have an iPhone.

You know what else appeared in 2010? The Social Network – a movie about Facebook that showed up just when people were dismissing Facebook as uncool and turning instead to Twitter – and then to Instagram (which was also released in 2010, though it didn’t become huge right away).

One more artifact from that same year: Gary Shteyngart’s novel Super Sad True Love Story, much of which is told through emails.

So: what technologies are going to dominate the books and movies and TV shows of 2020?

Auden and Poggioli

This lovely remembrance by Sylvia Poggioli of her father, the literary scholar Renato Poggioli, features a letter to her father from W. H. Auden, and the handwritten poem he submitted for publication in the journal Professor Poggioli edited, Inventario. Sylvia Poggioli speaks of her discovery as “a true literary find,” but the letter might be better described as a biographical find, since the poem itself was never unknown: it’s duly recorded in Bloomfield and Mendelson’s W. H. Auden: A Bibliography, 1924-1969 (1972).

Also, it’s not quite right to say, as Poggioli does, that “Auden later included these verses in a much longer piece, perhaps one of the most powerful poems of the mid-twentieth century, The Age of Anxiety”: he had already written the stanzas as part of The Age of Anxiety and was simply excerpting them for Poggioli’s journal, something he did with several other chunks of that longest of his poems. Auden truthfully told Renato Poggioli that it was an “unpublished poem,” but it would be published, along with the rest of The Age of Anxiety, just a few months later.

The thought that first comes to my mind when looking at Auden’s handwritten submission is the devout wish that he had always taken so much care to make his handwriting legible. Alas for my eyes, which have spent so many hours poring over his notebooks, he did not.

the Multigraph Collective and new avenues of humanistic scholarship

Allison Miller tells The Story of the Multigraph Collective, an academic group project that eventuated in a book called Interacting with Print: Elements of Reading in the Era of Print Saturation. I very much want to read the book, but for those interested in the economics of labor in the academy and its effects on scholarship, this part of Miller’s account is especially interesting:  

Being edited by so many other scholars, according to Paul Keen (Carleton Univ.), was unnerving but also “weirdly liberating. It gave us all a license to put our authorial sensitivities on hold and put our faith in this larger brainstorming process.”
Indeed, [Andrew] Piper too describes the endeavor as a “leap of faith,” since no one knew how the final work would be received by tenure and promotion committees or by UK Research Excellence Framework evaluators. One Multigraph Collective member, says Piper, was told that since there were 22 collaborators, the member’s work on Interacting with Print would count as 1/22 of a book—by word count, not even the equivalent of a journal article.
In the thick of it all, however, the process was thrilling. Hierarchies of academic rank and disciplinary territoriality dissolved in a shared commitment to the work. “This project fundamentally changed my ideas about what humanities scholarship could look like and what it could achieve,” says Porter. 

The whole situation is a reminder of the absurdity of the current tenure system, with its crude quantitative pseudo-metrics for assessing “productivity” — but also of the power of tenure. Those of us who have it need to be engaged in projects like Interacting with Print — projects that reconfigure and extend the character of humanistic scholarship (sometimes by renewing older scholarly modes). I’m displeased with myself for not doing more along these lines. 

the garden and the stream

I just came across this fascinating 2015 talk by Mike Caulfield and want to call attention to a couple of elements of it. 
1) the garden/stream distinction: 

The Garden is the web as topology. The web as space. It’s the integrative web, the iterative web, the web as an arrangement and rearrangement of things to one another.
Things in the Garden don’t collapse to a single set of relations or canonical sequence, and that’s part of what we mean when we say “the web as topology” or the “web as space”. Every walk through the garden creates new paths, new meanings, and when we add things to the garden we add them in a way that allows many future, unpredicted relationships….
In the stream metaphor you don’t experience the Stream by walking around it and looking at it, or following it to its end. You jump in and let it flow past. You feel the force of it hit you as things float by.
It’s not that you are passive in the Stream. You can be active. But your actions in there — your blog posts, @ mentions, forum comments — exist in a context that is collapsed down to a simple timeline of events that together form a narrative.
In other words, the Stream replaces topology with serialization. Rather than imagine a timeless world of connection and multiple paths, the Stream presents us with a single, time ordered path with our experience (and only our experience) at the center.

2) the difference between the memex and the World Wide Web: 

So most people say this is the original vision of the web. And certainly it was the inspiration of those pioneers of hypertext.
But in reality it doesn’t predict the web at all. Not at all. The web works very little like this. It’s weird, because in our minds the web still works like this, but it’s a fiction.
Let’s look at some of the attributes of the memex.
Your machine is a library not a publication device. You have copies of documents there that you control directly, that you can annotate, change, add links to, summarize, and this is because the memex is a tool to think with, not a tool to publish with.
And this is crucial to our talk here, because these abilities – to link, annotate, change, summarize, copy, and share — these are the verbs of gardening.
Each memex library contains your original materials and the materials of others. There’s no read-only version of the memex, because that would be silly. Anything you read you can link and annotate. Not reply to, mind you. Change. This will be important later.
Links are associative. This is a huge deal. Links are there not only as a quick way to get to source material. They aren’t a way to say, hey here’s the interesting thing of the day. They remind you of the questions you need to ask, of the connections that aren’t immediately evident.
Links are made by readers as well as writers. A stunning thing that we forget, but the link here is not part of the author’s intent, but of the reader’s analysis. The majority of links in the memex are made by readers, not writers. On the world wide web of course, only an author gets to determine links. And links inside the document say that there can only be one set of associations for the document, at least going forward.

“A tool to think with, not a tool to publish with” — this seems to me essential. I feel that I spend a lot of time trying to think with tools meant for publishing. 
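To make that contrast concrete, here is a minimal sketch (mine, not Caulfield’s; every name in it is purely illustrative) of the two shapes he describes: the Stream as a single time-ordered sequence of posts, and the Garden or memex as a pile of copies you hold yourself, which readers as well as writers can annotate and link associatively.

```python
# Illustrative sketch only: not Caulfield's code, just the two shapes he names.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class StreamPost:
    author: str
    text: str
    timestamp: datetime

# The Stream: everything collapses to one chronological sequence of events.
stream = sorted(
    [
        StreamPost("alice", "hot take of the day", datetime(2015, 3, 1)),
        StreamPost("bob", "reply to the hot take", datetime(2015, 3, 2)),
    ],
    key=lambda post: post.timestamp,
)

@dataclass
class GardenNote:
    title: str
    text: str
    links: dict = field(default_factory=dict)        # association -> another note's title
    annotations: list = field(default_factory=list)  # the reader's own marginalia

# The Garden/memex: your own copies, open to new links and notes at any time,
# added by the reader rather than fixed by the original author.
garden = {
    "memex": GardenNote("memex", "Bush's machine for associative trails"),
    "hypertext": GardenNote("hypertext", "links chosen once, by the author"),
}
garden["memex"].links["contrast with"] = "hypertext"        # a reader-made, associative link
garden["memex"].annotations.append("a tool to think with")  # a reader-made note
```

The point of the toy structures is only the difference in shape: the first admits exactly one ordering, while the second keeps growing new, unpredicted relationships added from the reading side.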

is user interface design getting worse?

Hey everybody, sorry for the radio silence — I’ve been traveling, and will be traveling again soon, so I can’t promise regular posting for a while. But I’m hoping to get a few thoughts up here, starting with this: 
I’ve read two recent posts about computer interface design that really have me thinking. The first is this reflection by Riccardo Mori about using a first-generation iPad. Mori discovers that that original Apple tablet, despite its significant limitations in processing power in comparison to today’s machines, still works remarkably well. But he also, and this is the really interesting part, decides that some of the design choices made eight years ago (the first iPad came out in 2010) are actually superior to the ones being made today. This is true to a minor degree even with regard to the hardware — Mori finds the iPad 1 more pleasurable to hold than some later models, despite its greater weight and thickness — but he thinks that the design language of iOS 5, the last version of iOS that the original iPad can use, is in certain respects simply superior to the new language introduced in iOS 7 and largely persisting, though with some modifications, today.

when it comes to visuals it’s ultimately a matter of personal taste, but one thing iOS’s user interface possessed before iOS 7’s flattening treatment was consistence and more robust, coherent, stricter interface guidelines. Guidelines that were followed by third-party developers more closely, and the result was that under iOS 6 and earlier versions, third-party apps presented a user interface that was cleaner, more predictable, easier to navigate than what came afterwards, update after update. After iOS’s UI got flatter, when it came to designing apps, things got out of hand, in an ‘anything goes’ fashion.
There are apps today with poor discoverability, ambiguous controls, UI elements whose state or function isn’t immediately clear — i.e. you cannot tell whether they’re tappable or not simply by looking at them; whereas before iOS 7, a button looked like a button right away, and you didn’t have to explore an app by tapping blindly here and there. Spotify is the first example coming to mind: its early iOS and Mac clients were more usable and had a better interface.

Concluding this section of his post, Mori writes:

During my trip down Interface Memory Lane these days with the iPad 1, I’ve stumbled on many other cases, and the result was always more or less the same: I found the old version of an app to have a more usable interface and a clearer interface language than its current counterpart. Despite all the pre-iOS 7 skeuomorphism, for many app interfaces of that time design was truly ‘how it works’. Today, more and more often (and it’s not only with iOS) I see examples where design is simply ‘how it looks’; attractive apps, but with ambiguous interface controls, poorly-designed UI architecture, and sometimes even with little to no accessibility, disregarding users with disabilities.

The second post is by Mark Wilson, who found himself using version 7 of the original Macintosh OS, issued way back in 1991 — and loving it. “Using an old Mac is pure zen.” Now, Wilson doesn’t suggest that that old interface could simply be implemented today; we ask too much of our computing devices now, and too many kinds of “much.”

But I do believe that the old Mac makes for a timely reminder that the digital age hasn’t always felt so frantic, or urgent, or overwhelming. And maybe, even if an old Mac interface isn’t the solution, we can view it as a subtle north star for its sensibilities, and how much it was able to accomplish with so little.

Few interface designers are indifferent to the needs of the user, but I can’t imagine that there are many for whom that is the first consideration. One way designers keep their jobs is by producing new designs, and in a corporate setting (like that of Apple) novelty helps get customers to update their hardware and software alike. And what kind of designer wouldn’t want the challenge of making the best use of increased processing power or display resolution?
So I don’t expect the desires and needs of users to be at the top of any designer’s priority list. But Lordy, how I wish they were a little higher than they are. Then perhaps the best elements of the work of earlier designers, people working under far greater constraints, could be recovered and redeployed. Because, as I never tire of saying, creativity arises from constraint and resistance. And I find it hard to deny that, from the user’s perspective, UI design for computing devices has been getting worse and worse for the past few years — with Apple leading the way in this sad category. 

Reflections on Treating the Poor

It is altogether curious your first contact with poverty. You have thought so much about poverty — it is the thing you have feared all your life, the thing you knew would happen sooner or later; and it is all so utterly and prosaically different. You thought it would be quite simple; it is extraordinarily complicated. You thought it would be terrible; it is merely squalid and boring. It is the peculiar lowness of poverty that you discover first; the shifts that it puts you to, the complicated meanness, the crust-wiping.

– George Orwell, Down and Out in Paris and London


George Orwell’s 1933 memoir, Down and Out in Paris and London, relates the clear-eyed experience of being homeless and penniless. The narrator lives in Paris giving English lessons until a stroke of bad fortune costs him his job; money slowly but surely disappears. He is overcome with “a feeling of relief, almost of pleasure, at knowing yourself at last genuinely down and out.”

Imagine, Orwell asks of us, what this bad fortune means. You cannot send letters because stamps are too expensive. At the baker, an ordered pound of bread weighs in slightly more and thus costs slightly more — and you cannot pay for it. You avoid “a prosperous friend” on the street so he won’t see that you’re “hard up.” And you’re hungry. Wherever you walk there are inescapable reminders of this: bakeries, restaurants, coffee shops. “Hunger,” Orwell writes, “reduces one to an utterly spineless, brainless condition, more like the after-effects of influenza than anything else.” Months pass by in between baths. Clothing is pawned. In the midst of this scramble to live, however, one forgets that there is, indeed, a lot of time with nothing to do at all: “you discover the boredom which is inseparable from poverty; the times when you have nothing to do and, being underfed, can interest yourself in nothing.”

Orwell based such descriptions largely on personal experiences. In 1927 he spent time in the company of tramps and beggars in London, dressed in worn-out clothing and sleeping in poor lodging-houses for two or three days. He subsequently moved to Paris and subjected himself to similar experiences. In doing so, he eventually brought attention to the plight of the poor, providing an honest, unvarnished look at what it was like to be down and out.

~

Rereading the book reminds me of Bellevue Hospital, New York City’s flagship public hospital. Bellevue, or its progenitor, was originally an infirmary in Manhattan in the 1660s and became the most well-known of the public hospitals in the country (I have written about it for Public Discourse). Here physicians treat the uninsured, the undocumented, and the homeless. It is a rare day when a physician at Bellevue does not interact with New York’s poorest residents.


Sometimes they come in search of medical care and sometimes they come in search of a meal. They stumble in from homeless shelters or from street corners, inebriated, withdrawing from drugs or alcohol, psychotic, suicidal, deathly ill or sober. Occasionally they unknowingly enter the emergency room with lice or bedbugs, and nurses delouse them with multiple layers of permethrin, an insecticide. The physician must approach these infested patients with a hairnet, gown, and gloves — the lice crawl on the patient’s head, chest, arms and bed sheets. The smell sometimes overwhelms the doctor or nurse, too. It may have been months since the patient has bathed, and the odor percolates throughout the room and the hallway.

As I wrote in my Public Discourse piece, the patient presentations are frightening and remarkable:

Ride the elevator down, and you will stare in horror as an agitated drug addict with an infection tries to punch a physician while bolting out of his hospital room with security guards and nurses in pursuit. Next door, a homeless patient lies in bed with heart failure. Next to him is a patient who’s visiting New York from Africa with a raging AIDS infection. Peer into another room down the hall, and you can watch patients withdrawing from alcohol or heroin, thrashing about and screaming.

Physicians at Bellevue have a privilege rarely granted in upper- and middle-class professions: they see poverty up close. But as close as we are, we don’t really understand the poor the way Orwell did. We don’t live amongst them or feel the curse of extended hunger or the uncertainty of when the next meal will come. We don’t experience that odd sensation of boredom, where there is nothing to do because one has nothing to do it with. And we cannot fully empathize with their fragile health.

~

This is why Orwell’s book is so enlightening. At least we get a description of what some of Bellevue’s patients may go through; at least we get a glimpse. It creates a little less space between the comfortable and the impoverished.

But Orwell wasn’t wholly right about the poor. He wrote in Down and Out:

The mass of the rich and the poor are differentiated by their incomes and nothing else, and the average millionaire is only the average dishwasher dressed in a new suit. Change places, and handy dandy, which is the justice, which is the thief? Everyone who has mixed on equal terms with the poor knows this quite well. But the trouble is that intelligent, cultivated people, the very people who might be expected to have liberal opinions, never do mix with the poor.

True, there is a closeness between “intelligent, cultivated people” and the “poor” simply by virtue of being human. However, there are deep differences that would not disappear if the two simply switched jobs and clothing. For instance, in 2016, four percent of U.S. adults experienced a “serious mental illness.” That figure did not include people without fixed addresses — the homeless. And approximately one fifth of the homeless in the United States suffer from a severe mental illness. Even if the definitions of “severe” and “serious” don’t match up precisely, the difference in rates of mental illness between the homeless and other U.S. adults is huge (roughly fivefold). And these differences matter both to policy analysts and to physicians.

Two epidemiologists, Elizabeth Bradley and Lauren Taylor, have written a thoughtful book dealing with the issue of rising health care costs entitled The American Health Care Paradox. In it they argue that our skyrocketing health care expenditures (we spend more than double the share of GDP of other developed countries on health care) and our poor outcomes (we rank in the high 20s or low 30s among OECD countries for maternal mortality, life expectancy, low birth weight, and infant mortality) are due not to overspending but to underspending by the United States on social services — affordable housing, education, access to healthy food, and so forth.

Bradley and Taylor explain how this happens:

Several studies have demonstrated the health toll of living on the streets; more than two-thirds of America’s homeless population suffer from mental illness or substance dependency, while nearly half have at least one additional chronic condition such as diabetes or hypertension. The high costs of health care provided to people who are homeless have been well documented. For instance, in one five-year period, 119 people who were chronically homeless and tracked by the Boston Health Care for the Homeless Program incurred a total of 18,834 emergency room visits estimated to cost $12.7 million.

This makes sense. Many of our homeless patients deal with chronic diseases like diabetes, mental illness, or congestive heart failure. We stabilize them in the hospital and send them back to a shelter or the street. Often they return the next week with an exacerbation of their heart failure or sky-high blood sugars or psychosis, even when the hospital provides their medication at no charge.
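A bit of back-of-the-envelope arithmetic on the Boston figures quoted above (nothing beyond simple division of the numbers Bradley and Taylor report) makes the scale concrete:

$$
\frac{18{,}834\ \text{visits}}{119\ \text{patients}} \approx 158\ \text{visits per patient over five years (about 32 a year)},
\qquad
\frac{\$12.7\ \text{million}}{18{,}834\ \text{visits}} \approx \$674\ \text{per visit}.
$$

Divided per person rather than per visit, that comes to roughly $107,000 in emergency care for each of those 119 patients over the five years.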

Thus a chasm separates our world from that of the poor, yet the two are entangled. How can you get someone to start eating vegetables and fruits and whole grains in order to mitigate the effects of diabetes if they don’t have money to buy these foods? How can you control a child’s asthma if the family does not have money to clean their apartment and rid it of the vermin, bugs, and dirt that pervade the nooks and crannies? How can you ensure a psychotic patient takes his medication when he can barely feed himself? The homeless face a very different and more intimidating set of difficulties than the wealthy. And these translate into challenges for physicians, who do not have the time or skill to be both doctors and social workers.

We, as physicians, care for the patients until they are ready to leave the hospital. Then they face their poverty on the street. Our view is but a brief and skewed snapshot. In our myopic hospital world, the hospital stretcher is detached from daily life. And this is necessarily so, to a certain degree. Physicians can only do so much to fix societal ills — they cannot create a job, a safe home environment, or a loving family for the patient.

Nevertheless, both wealthy patients and poor patients succumb to cancer, strokes, and heart attacks. Both undergo the humiliating process of death and dying. In this sense, death and disease are often great equalizers. Neither the poor nor the rich can escape them. They rapidly close the chasm between the two classes. And at least in that vein, Orwell was right.

reasons for decline

Alex Reid

From a national perspective, the number of people earning communications degrees (which was negligible in the heyday of English majors 50-60 years ago), surpassed the number getting English degrees around 20 years ago. Since then Communications has held a fairly steady share of graduates as the college population grew, while English has lost its share and in recent years even shrank in total number, as this NCES table records. In short, students voted with their feet and, for the most part, they aren’t interested in the curricular experience English has to offer (i.e. read books, talk about books, write essays about books). 

Scott Alexander

Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.
And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.” 

So maybe — just maybe — it’s not “read books, talk about books, write essays about books” that’s the problem. 

“not to waver with the wavering hours”

I’ve just been teaching Horace’s Epistles, and it strikes me that Horace ought to be the man of our social-media moment — the man who shows us another and better way.

In the first of those Epistles, Horace writes to his patron Maecenas — the one who bought him his Sabine farm that allows him to escape the noise and frenetic activity of Rome — to describe what he’s up to:

… my ambition to advance myself
In the sort of project that, if carried out
Successfully, is good for anyone,
Whether rich or poor, and its failure is bound to be
Harmful to anyone, whether he’s young or old. 

This “project” is, he says, to “devote myself entirely to the study / Of what is genuine and right for me, / Storing up what I learn for the sake of the future.” (I am quoting from David Ferry’s wonderful translation.) He needs to be on his farm to pursue this project, because life in the city, with its constant stimulation, creates too much agitation. And as he writes to another friend, Julius Florus (I.3), “if you’re able to learn to do without / Anxiety’s chilling effect, you’ll be able to follow / The lead of wisdom up to the highest reaches.”

Later (I.18) he exhorts Lollius Maximus to “interrogate the writings of the wise,”

Asking them to tell you how you can
Get through your life in a peaceable tranquil way.
Will it be greed, that always feels poverty-stricken,
That harasses and torments you all your days?
Will it be hope and fear about trivial things,
In anxious alternation in your mind?
Where is it virtue comes from, is it from books?
Or is it a gift from Nature that can’t be learned?
What is the way to become a friend to yourself?
What brings tranquility? What makes you care less?
Honor? Or money? Or living your life unnoticed?
Whenever I drink from the cold refreshing waters
Of the little brook Digentia, down below
Our local hill town, what do you think I pray for?
“May I continue to have what I have right now,
Or even less, as long as I’m self-sufficient.
If the gods should grant me life, though just for a while,
May I live my life to myself, with books to read,
And food to sustain me for another year,
And not to waver with the wavering hours.” 

The “wavering hours” waver because they’re charged with the nervous energy that comes from a too-busy life, a life of agitation and anxiety. As a youth Horace studied philosophy in Athens, and there he would have learned about the inestimable value of ataraxia — a peaceable and tranquil spirit. Because if you don’t have that, then you become a victim of your circumstances — and, especially in our time, a victim of propaganda.

Reading old books is a very valuable thing, because it takes you out of the maelstrom of “current events”; and it’s especially valuable to read old books like those by Horace because they will tell you quite directly how vital it is for you to learn this lesson.

propaganda and social media

Reading Ellul on the massive and pervasive consequences of propaganda in the twentieth century, I found myself over and over again thinking: This is how social media work on us. For instance, that passage I quoted in my earlier post — “to the same extent that he lives on the surface of events and makes today’s events his life by obliterating yesterday’s news, he refuses to see the contradictions in his own life and condemns himself to a life of successive moments, discontinuous and fragmented” — seems even more true as a description of the person constantly on Twitter and Facebook. Many other passages gave me the same feeling:

Man, eager for self-justification, throws himself in the direction of a propaganda that justifies him and this eliminates one of the sources of his anxiety. Propaganda dissolves contradictions and restores to man a unitary world in which the demands are in accord with the facts…. For all these reasons contemporary man needs propaganda; he asks for it; in fact, he almost instigates it. (159, 160) 

Or this:

Propaganda is concerned with the most pressing and at the same time the most elementary actuality. It proposes immediate action of the most ordinary kind. It thus plunges the individual into the most immediate present, taking from him all mastery of his life and all sense of the duration or continuity of any action or thought. Thus the propagandee becomes a man without a past and without a future, a man who receives from propaganda his portion of thought and action for the day; his discontinuous personality must be given continuity from the outside, and thus makes the need for propaganda very strong. (187) 

Thus the very common type of Twitter user who expresses himself or herself almost completely in hashtags: pre-established units of affiliation and exclusion.

And yet — Russian bots and political operatives (who have turned themselves into bots) aside — social media lack the planned purposefulness intrinsic to propaganda. So they must be a different kind of thing, yes?

Yes and no. I think what social media produce is emergent propaganda — propaganda that is not directed in any specific and conscious sense by anyone but rather emerges, arises, from vast masses of people who have been catechized within and by the same power-knowledge regime. Think also about the idea I got from an Adam Roberts novel: the hivemind singularity. Conscious, intentional propaganda is so twentieth century. The principalities and powers are far more sophisticated now. I’ll be thinking more about this.

"a revisionist blizzard of alternative theories"

Tim Adams on the media in Putin’s Russia:

In this culture war, disinformation was critical. Russian TV and social media would create a climate in which news became entertainment, and nothing would quite seem factual. This surreal shift is well documented, but Snyder’s forensic examination of, for example, the news cycle that followed the shooting down of flight MH17 makes essential reading. On the first day official propaganda suggested that the Russian missile attack on the Malaysian plane had in fact been a bodged attempt by Ukrainian forces to assassinate Putin himself; by day two, Russian TV was promoting the idea that the CIA had sent a ghost plane filled with corpses overhead to provoke Russian forces.

The more outrageous the official lie was, the more it allowed people to demonstrate their faith in the Kremlin. Putin made, Snyder argues, his direct assault on “western” factuality a source of national pride. Snyder calls this policy “implausible deniability”; you hear it in the tone of the current “debate” around the Salisbury attack: Russian power is displayed in a relativist blizzard of alternative theories, delivered in a vaguely absurdist spirit, as if no truth on earth is really provable.

Social-media propaganda directed at Americans works the same way: in contrast to earlier forms of propaganda, which sought to arouse people to action by alerting them to new and previously unseen truths, this kind of propaganda is meant to be soporific; it seeks to make people indifferent to what’s true, incurious, and accepting of whatever addresses the emotions to which they are most fully enslaved.

Long ago William Golding wrote a witty little essay called “Thinking as a Hobby” in which he identifies three levels of thought. Grade-three thinking, “more properly, is feeling, rather than thought”; it is “full of unconscious prejudice, ignorance, and hypocrisy.” Grade-two thinking — which Golding came to practice as an adolescent — “is the detection of contradictions…. Grade-two thinkers do not stampede easily, though often they fall into the other fault and lag behind. Grade-two thinking is a withdrawal, with eyes and ears open.” Grade-two thinking is shouting “FAKE NEWS” and asking people whether they always believe what they’re told by the lamestream media, or pulling out your ink pad and rubber stamp and stamping BIGOT or RACIST on people who don’t line up with you 100%. I would say that such behavior is not “lagging behind” so much as digging in your heels and refusing to move — which herds of animals do far more frequently than they stampede.

When grade-two thinking is challenged, its perpetrator will typically fall back to grade-three, as David French discovered:

The desire to think the best of Mr. Trump combined with the deep distaste for Democrats grants extraordinary power to two phrases: “fake news” and “the other side is worse.”

I’m reminded of an encounter at my church. People know that I opposed both Mr. Trump and Mrs. Clinton. They often ask what I think of the president’s performance. My standard response: I like some things, I dislike others, but I really wish he showed better character. I don’t want him to lie. I said this to a sweet older lady not long ago, and she responded — in all sincerity — “You mean Trump lies?” “Yes,” I replied. “All the time.” She didn’t answer with a defense. She didn’t say “fake news.” We’d known each other for years, and she trusted my words.

For a moment, she seemed troubled. I wanted to talk more — to say that we can appreciate and applaud the good things he does, but we can’t ignore his flaws, we can’t defend his sins, and we can’t let him define the future of the Republican Party. But just then, her jaw set. I saw a flare of defiance in her eyes. She took a sip of coffee, looked straight at me, and I knew exactly what was coming next: “Well, the Democrats are worse.”

Jacques Ellul argued half a century ago that the purpose of propaganda is to “provide immediate incentives to action.” But propaganda that encourages us to dig in our heels, or just drift with the social-media current, is propaganda all the same. What remains absolutely essential from Ellul’s book is his understanding that the person “embroiled in the conflicts of his time” (49) is most vulnerable to propaganda — and he could not have imagined a society so locked into the current instant as we denizens of Social Media World are. I’m going to close this post with a long quotation from Ellul that was incisive in relation to his own time but is devastatingly accurate about ours; I’d like you to notice how Ellul anticipates Debord’s Society of the Spectacle. Here goes:

To the extent that propaganda is based on current news, it cannot permit time for thought or reflection. A man caught up in the news must remain on the surface of the event; he is carried along in the current, and can at no time take a respite to judge and appreciate; he can never stop to reflect. There is never any awareness — of himself, of his condition, of his society — for the man who lives by current events. Such a man never stops to investigate any one point, any more than he will tie together a series of news events. We already have mentioned man’s inability to consider several facts or events simultaneously and to make a synthesis of them in order to face or to oppose them. One thought drives away another; old facts are chased by new ones. Under these conditions there can be no thought. And, in fact, modern man does not think about current problems; he feels them. He reacts, but he does not understand them any more than he takes responsibility for them. He is even less capable of spotting any inconsistency between successive facts; man’s capacity to forget is unlimited. This is one of the most important and useful points for the propagandist, who can always be sure that a particular propaganda theme, statement, or event will be forgotten within a few weeks. Moreover, there is a spontaneous defensive reaction in the individual against an excess of information and — to the extent that he clings (unconsciously) to the unity of his own person — against inconsistencies. The best defense here is to forget the preceding event. In so doing, man denies his own continuity; to the same extent that he lives on the surface of events and makes today’s events his life by obliterating yesterday’s news, he refuses to see the contradictions in his own life and condemns himself to a life of successive moments, discontinuous and fragmented.

This situation makes the “current-events man” a ready target for propaganda. Indeed, such a man is highly sensitive to the influence of present-day currents; lacking landmarks, he follows all currents. He is unstable because he runs after what happened today; he relates to the event, and therefore cannot resist any impulse coming from that event. Because he is immersed in current affairs, this man has a psychological weakness that puts him at the mercy of the propagandist. No confrontation ever occurs between the event and the truth; no relationship ever exists between the event and the person. Real information never concerns such a person. What could be more striking, more distressing, more decisive than the splitting of the atom, apart from the bomb itself? And yet this great development is kept in the background, behind the fleeting and spectacular result of some catastrophe or sports event because that is the superficial news the average man wants. Propaganda addresses itself to that man; like him, it can relate only to the most superficial aspect of a spectacular event, which alone can interest man and lead him to make a certain decision or adopt a certain attitude. (46-47)

Maybe I should blog a read-through of Propaganda.