Solnit’s nostalgia

Rebecca Solnit writes,

Those mail and newspaper deliveries punctuated the day like church bells. You read the paper over breakfast. If there were developments you heard about them on the evening news or in the next day’s paper. You listened to the news when it was broadcast, since there was no other way to hear it. A great many people relied on the same sources of news, so when they discussed current events they did it under the overarching sky of the same general reality. Time passed in fairly large units, or at least not in milliseconds and constant updates. A few hours wasn’t such a long time to go between moments of contact with your work, your people or your trivia.

You opened the mail when you came home from work, or when it arrived if you worked from home. Some of the mail was important and personal, not just bills. It was exciting to get a letter: the paper and handwriting told you something, as well as the words….

Previous technologies have expanded communication. But the last round may be contracting it. The eloquence of letters has turned into the unnuanced spareness of texts; the intimacy of phone conversations has turned into the missed signals of mobile phone chat. I think of that lost world, the way we lived before these new networking technologies, as having two poles: solitude and communion. The new chatter puts us somewhere in between, assuaging fears of being alone without risking real connection. It is a shallow between two deep zones, a safe spot between the dangers of contact with ourselves, with others.

Solnit is one of the finest writers of her generation, so it’s a bit sad to see her recycling these tired complaints. Even if every word of her essay is true, it has been said thousands of times already. Sven Birkerts got it all into The Gutenberg Elegies in 1994, and since then people have just been doodling variations on his themes.

But here’s the problem I have with all screeds of this particular type. If you happen to be old enough to remember the days of letter-writing that Solnit limns so nostalgically, I invite you to perform the following thought-experiment:

  • Estimate the number of letters you wrote in a given year.
  • Estimate the number of letters you meant to write, planned to write, knew you ought to write, and yet never quite got around to writing.
  • Calculate the ratio of those numbers.

In Solnit’s imagination, every brief email or telegraphic text we write today would, thirty years ago or more, have been a letter. But a moment’s reflection shows that that’s not true. People who would never have gotten around to writing letters, or even making phone calls, now send emails; people (mostly younger ones) who find email too frictiony a medium might send a hundred texts a day. If we’re going to understand how these technologies are changing us, we need to make the right comparisons: not one long handwritten letter to one brief email, but one long handwritten letter to several emails, or to dozens of texts exchanged with multiple people in a given day.

An average twenty-year-old today writes far, far more to his or her friends than did the average twenty-year-old of any previous era in human history. His or her experience is remarkable primarily for how textual it is, how many written words compose it. We should start by acknowledging that fact, and if we go on to form a critique, we should have a clearer-eyed view of the past as well.

All that said, there are some good points about distraction and the alternatives to distraction in Solnit’s essay; I’ll try to write about those another time. But the nostalgia here undermines her case.

man in an iFugue

Gary Shteyngart:

“This right here,” said the curly-haired, 20-something Apple Store glam-nerd who sold me my latest iPhone, “is the most important purchase you will ever make in your life.” He looked at me, trying to gauge whether the holiness of this moment had registered as he passed me the Eucharist with two firm, unblemished hands. “For real?” I said, trying to sound like a teenager, trying to mimic what all these devices and social media are trying to do, which is to restore in us the feelings of youth and control.

“For real,” he said. And he was right. The device came out of the box and my world was transformed. I walked outside my book-ridden apartment. The first thing that happened was that New York fell away around me. It disappeared. Poof. The city I had tried to set to the page in three novels and counting, the hideously outmoded boulevardier aspect of noticing societal change in the gray asphalt prism of Manhattan’s eye, noticing how the clothes are draping the leg this season, how backsides are getting smaller above 59th Street and larger east of the Bowery, how the singsong of the city is turning slightly less Albanian on this corner and slightly more Fujianese on this one — all of it, finished. Now, an arrow threads its way up my colorful screen. The taco I hunger for is 1.3 miles away, 32 minutes of walking or 14 minutes if I manage to catch the F train. I follow the arrow taco-ward, staring at my iPhone the way I once glanced at humanity, with interest and anticipation.

Steven Johnson’s numbers game

Unfortunately, Steven Johnson, once one of the sharpest cultural commentators around, seems to be turning into a caricature. His recent response to the concerns about digital life articulated by Nicholas Carr and others is woefully bad. He simply refuses to take seriously the increasingly large body of evidence about the negative consequences of always-on, always-online so-called multitasking. Yes, “multitasking makes you slightly less able to focus,” or, as he later says, “I am slightly less focused,” or, still later, “we are a little less focused.” (Am I allowed to make a joke here about how multitasking makes you less likely to notice repetitions in your prose?)

But what counts as a “little less”? Citing only one of the less alarming of the many available studies, Johnson reports that it “found that heavy multitaskers performed about 10 to 20 percent worse on most tests than light multitaskers.” Apparently, for Johnson, losing 20% of your ability to concentrate is scarcely worth mentioning. And apparently he hasn’t seen any of the studies showing that people who are supremely confident in their multitasking abilities, as he appears to be, are more fuddled than anyone else.

Johnson wants us to focus on the fabulous benefits we receive from a multitasking life. For instance,

Thanks to e-mail, Twitter and the blogosphere, I regularly exchange information with hundreds of people in a single day: scheduling meetings, sharing political gossip, trading edits on a book chapter, planning a family vacation, reading tech punditry. How many of those exchanges could happen were I limited exclusively to the technologies of the phone, the post office and the face-to-face meeting? I suspect that the number would be a small fraction of my current rate.

And then, later: “We are reading more text, writing far more often, than we were in the heyday of television.” So it would appear that Johnson has no concept whatsoever of quality of interaction — he thinks only in terms of quantity: how much we read, how much we write, how many messages we exchange in a day.

That’s it? That’s all? Just racking up the numbers, like counting your Facebook friends or Twitter followers? Surely Johnson can do better than this. I have my own concerns about Carr’s arguments, some of which I have tried to articulate here, but the detailed case he makes for the costs of connection deserves a far more considered response than Johnson is prepared to give it.

I think the Steven Johnson of a few years ago would have realized the need to make a much stronger — and probably a wholly different — case for the distracted life than this sad little counting game. He should get offline for a few weeks and think about all this some more.

Quick Links: Singularity University, Neuro-Trash, and more

• Imagine the frat parties: Ted Greenwald, a senior editor of the print edition of Wired magazine, has been attending and covering Singularity University for Wired.com. We’ll have more on this in the days ahead. Meanwhile, Nick Carr suggests some mascots for Singularity U.

• Squishy but necessary: Last month, Athena Andreadis, the author of the book The Biology of Star Trek, had a piece in H+ Magazine throwing cold water on some visions of brain uploading and downloading. Money quote: “It came to me in a flash that many transhumanists are uncomfortable with biology and would rather bypass it altogether for two reasons…. The first is that biological systems are squishy — they exude blood, sweat and tears, which are deemed proper only for women and weaklings. The second is that, unlike silicon systems, biological software is inseparable from hardware. And therein lies the major stumbling block to personal immortality.”

• Thanks, guys: We’re pleased to have fans over at the “Fight Aging” website, where they say we “write well.” The praise warms our hearts, it truly does. We only wish that those guys were capable of reading well. Their post elicited this response from our Futurisms coauthor Charles Rubin: “Questioning what look to us to be harebrained ideas of progress does not make us ‘against progress.’ Nor does skepticism about ill-considered notions of the benefits of immortality make us ‘for suffering’ or ‘pro-death.’ It may be that the transhumanists really cannot grasp those distinctions, perhaps because of their apparently absolute (yet completely unjustified) confidence in their ability to foretell the future. Only if they have a reliable crystal ball — if they can know with certainty that their vision of the future will come to pass — does opposition to their vision of progress make us ‘anti-progress’ and does acknowledging the consequences of mortality make us ‘pro-death.’” Indeed. And I might add that such confidence in unproven predictive powers seems less like the rationality transhumanists claim to espouse than like uncritical faith.

• A sporting chance: Gizmodo has an essay by Aimee Mullins — an actress, model, former athlete, and double amputee — about technology, disability, and competition. Her key argument: “Advantage is just something that is part of sports. No athletes are created equal. They simply aren’t, due to a multitude of factors including geography, access to training, facilities, health care, injury prevention, and sure, technology.” Mullins concedes that it might be appropriate to keep certain technological enhancements out of sport, but she is “not sure” where to draw the line, and she advises not making any decisions about technologies before they actually exist.

• On ‘Neuro-Trash’: A remarkable essay in the New Humanist by Raymond Tallis on the abuse of brain research. Tallis starts off by describing how neuroscience is being applied to ever more aspects of human affairs. “This might be regarded as harmless nonsense, were it not for the fact that it is increasingly being suggested … that we should use the findings of neurosciences to guide policymakers. The return of political scientism, particularly of a biological variety, should strike a chill in the heart.” Beneath this trend, Tallis writes, lies the incorrect “fundamental assumption” that “we are our brains.” (Vaughan over at MindHacks describes Tallis’s essay as “barnstorming and somewhat bad-tempered.” Readers looking for more along these lines might also enjoy our friend Matt Crawford’s New Atlantis essay on “The Limits of Neuro-Talk.”)

• Calling Ringling Bros.: We’ve known for a long time that people talking on cell phones get so distracted that they can become oblivious to what’s physically around them — entering a state sometimes called “absent presence.” In the October issue of Applied Cognitive Psychology, a team of researchers from Western Washington University reported the results of an experiment observing and interviewing pedestrians to see if they noticed a nearby clown wearing “a vivid purple and yellow outfit, large shoes, and a bright red nose” as he rode a bicycle. As you would expect, cell phone users were pretty oblivious. Does this suggest that we’ll suffer from increasing “inattentional blindness” as we are bombarded with ever more stimuli from increasingly ubiquitous gadgets? Not necessarily: it turns out that pedestrians listening to music tended to notice the clown more than those walking in silence. The cohort likeliest to see the clown consisted of people walking in pairs.

• Metaphor creep: “If the brain is like a set of computers that control different tasks,” says an SFSU psychology professor, then “consciousness is the Wi-Fi network that allows different parts of the brain to talk to each other and decide which action ‘wins’ and is carried out.”

• Another kind of ‘Futurism’: This year marks the centenary of the international Futurist art movement. The 1909 Futurist Manifesto that kicked it all off is explicitly violent and even sexist in its aims (“we want to exalt movements of aggression, feverish sleeplessness, the double march, the perilous leap, the slap and the blow with the fist … we want to glorify war — the only cure for the world…”) and critical of any conservative institutions (professors and antiquaries are called “gangrene”; museums, libraries, and academies are called “cemeteries of wasted effort, calvaries of crucified dreams, registers of false starts”). Central to the Futurist vision was a love of new technologies — and of all the speed, noise, and violence of the machine age.

escaping

It’s interesting that the NYT today ran one story about people who refuse to have cellphones and another story about people who want to escape being always connected to the Internet. This meme has been building for some time, but I wonder if the curve is about to turn more sharply upward. Still more, I wonder whether it will amount to anything more than kvetching. Just as everyone talks about the weather but no one does anything about it, I think we’ll find that everyone will be complaining about the frustrations of being always connected but hardly anyone will actually disconnect.

My pledge: as long as I’m connected, and enjoying the benefits of online life, I’m not going to bitch about it.

the dangers of focus?

Sam Anderson’s claim that “unwavering focus . . . can actually be just as problematic as ADHD” is a conclusion he draws from this paragraph:

My favorite focusing exercise comes from William James: Draw a dot on a piece of paper, then pay attention to it for as long as you can. (Sitting in my office one afternoon, with my monkey mind swinging busily across the lush rain forest of online distractions, I tried this with the closest dot in the vicinity: the bright-red mouse-nipple at the center of my laptop’s keyboard. I managed to stare at it for 30 minutes, with mixed results.) James argued that the human mind can’t actually focus on the dot, or any unchanging object, for more than a few seconds at a time: It’s too hungry for variety, surprise, the adventure of the unknown. It has to refresh its attention by continually finding new aspects of the dot to focus on: subtleties of its shape, its relationship to the edges of the paper, metaphorical associations (a fly, an eye, a hole). The exercise becomes a question less of pure unwavering focus than of your ability to organize distractions around a central point. The dot, in other words, becomes only the hub of your total dot-related distraction.

This is wrong-headed in a number of ways, but chief among them is this: there’s no good reason for focusing on a dot. The mind trying to focus on a dot gets impatient because a dot is neither interesting nor complex. The artistic achievements mentioned elsewhere in the essay — Proust’s In Search of Lost Time, John Lennon’s songs — were the product of intense focus on complex and multivalent tasks. Indeed, that’s why focus is so important for artists and other intellectual workers — and for that matter dancers and surgeons and bomb defusers: complex tasks demand a great deal of attention, and if we lose any of that attention we do those jobs less well. (But at least the novelist can go back and fix things later; this is harder for the surgeon and especially for the bomb defuser.)

Besides, how many people are really in danger of “unwavering focus”? Many years ago I read a magazine profile of the chess champion Bobby Fischer in which the writer described an interview session they had over lunch. At one point the writer asked a question just as Fischer was lifting some food to his mouth, and the grandmaster ended up poking himself in the cheek with his fork. He simply could not do two things at once — a social problem, to be sure, but almost certainly a boon to his attentiveness to the chessboard. Has that ever happened to you? Have you ever been so distracted by a question that you speared your cheek with a fork? If not, then you probably don’t need to worry about the dangers of being too focused.

quote of the day

“The Internet is basically a Skinner box engineered to tap right into our deepest mechanisms of addiction.” — Sam Anderson

Anderson continues:

As B. F. Skinner’s army of lever-pressing rats and pigeons taught us, the most irresistible reward schedule is not, counterintuitively, the one in which we’re rewarded constantly but something called “variable ratio schedule,” in which the rewards arrive at random. And that randomness is practically the Internet’s defining feature: It dispenses its never-ending little shots of positivity—a life-changing e-mail here, a funny YouTube video there—in gloriously unpredictable cycles. It seems unrealistic to expect people to spend all day clicking reward bars—searching the web, scanning the relevant blogs, checking e-mail to see if a co-worker has updated a project—and then just leave those distractions behind, as soon as they’re not strictly required, to engage in “healthy” things like books and ab crunches and undistracted deep conversations with neighbors. It would be like requiring employees to take a few hits of opium throughout the day, then being surprised when it becomes a problem. Last year, an editorial in the American Journal of Psychiatry raised the prospect of adding “Internet addiction” to the DSM, which would make it a disorder to be taken as seriously as schizophrenia.
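Anderson’s point about randomness is concrete enough to simulate. Here is a minimal Python sketch — entirely my own construction, not anything from Anderson or the Skinner literature — contrasting a fixed-ratio schedule (a reward on every fifth response) with a variable-ratio one (rewards arriving at random but at the same average rate, modeled here, as is common, as a random-ratio approximation). All function names and parameters are illustrative assumptions. Checking email looks much more like the second line of output than the first, which is exactly his point.

```python
import random

def fixed_ratio(n_responses, ratio):
    """Reward arrives on every `ratio`-th response: fully predictable."""
    return [(i + 1) % ratio == 0 for i in range(n_responses)]

def variable_ratio(n_responses, mean_ratio, seed=42):
    """Each response is rewarded with probability 1/mean_ratio, so rewards
    arrive unpredictably but average one per `mean_ratio` responses."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(n_responses)]

def show(rewards):
    """Render a schedule as a string: 'R' for a reward, '.' for nothing."""
    return "".join("R" if r else "." for r in rewards)

print("fixed (every 5th):      ", show(fixed_ratio(30, 5)))
print("variable (avg every 5th):", show(variable_ratio(30, 5)))
```

Run it and the fixed schedule prints a reward at clockwork intervals, while the variable schedule clumps and gaps unpredictably; it is the second pattern that Skinner’s animals, and apparently our inboxes, find so hard to stop pressing the lever for.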

Don’t be ridiculous. I’m not an addict — I can quit any time I want.

Near the end of his essay, Anderson argues that “Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness. . . . The truly wise mind will harness, rather than abandon, the power of distraction. Unwavering focus — the inability to be distracted — can actually be just as problematic as ADHD.” He is apparently unaware of how much this sounds like pure wishful thinking. (Maybe he was too distracted as he wrote it.) But there’s something to the argument all the same; I hope to be able to say more about that later.

freedom

From the website:

Freedom is an application that disables networking on an Apple computer for up to eight hours at a time. Freedom will free you from the distractions of the internet, allowing you time to code, write, or create. At the end of your selected offline period, Freedom re-enables your network, restoring everything as normal.

Freedom enforces freedom; a reboot is the only circumvention of the Freedom time limit you specify. The hassle of rebooting means you’re less likely to cheat, and you’ll be more productive. When first getting used to Freedom, I suggest using the software for short periods of time.

Do I dare to use it?