Friday, June 9, 2023

How the Internet Gets Inside Us (2011)

Books explaining why books no longer matter come in many flavors. Illustration by Christoph Niemann

When the first Harry Potter book appeared, in 1997, it was just a year before the universal search engine Google was launched. And so Hermione Granger, that charming grind, still goes to the Hogwarts library and spends hours and hours working her way through the stacks, finding out what a basilisk is or how to make a love potion. The idea that a wizard in training might have, instead, a magic pad where she could inscribe a name and in half a second have an avalanche of news stories, scholarly articles, books, and images (including images she shouldn’t be looking at) was a Quidditch broom too far. Now, having been stuck with the library shtick, she has to go on working the stacks in the Harry Potter movies, while the kids who have since come of age nudge their parents. “Why is she doing that?” they whisper. “Why doesn’t she just Google it?”

That the reality of machines can outpace the imagination of magic, and in so short a time, does tend to lend weight to the claim that the technological shifts in communication we’re living with are unprecedented. It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with. The past twenty years have seen a revolution less in morals, which have remained mostly static, than in means: you could already say “fuck” on HBO back in the eighties; the change has been our ability to tweet or IM or text it. The set subject of our novelists is information; the set obsession of our dons is what it does to our intelligence.

The scale of the transformation is such that an ever-expanding literature has emerged to censure or celebrate it. A series of books explaining why books no longer matter is a paradox that Chesterton would have found implausible, yet there they are, and they come in the typical flavors: the eulogistic, the alarmed, the sober, and the gleeful. When the electric toaster was invented, there were, no doubt, books that said that the toaster would open up horizons for breakfast undreamed of in the days of burning bread over an open flame; books that told you that the toaster would bring an end to the days of creative breakfast, since our children, growing up with uniformly sliced bread, made to fit a single opening, would never know what a loaf of their own was like; and books that told you that sometimes the toaster would make breakfast better and sometimes it would make breakfast worse, and that the cost for finding this out would be the price of the book you’d just bought.

All three kinds appear among the new books about the Internet: call them the Never-Betters, the Better-Nevers, and the Ever-Wasers. The Never-Betters believe that we’re on the brink of a new utopia, where information will be free and democratic, news will be made from the bottom up, love will reign, and cookies will bake themselves. The Better-Nevers think that we would have been better off if the whole thing had never happened, that the world that is coming to an end is superior to the one that is taking its place, and that, at a minimum, books and magazines create private space for minds in ways that twenty-second bursts of information don’t. The Ever-Wasers insist that at any moment in modernity something like this is going on, and that a new way of organizing data and connecting users is always thrilling to some and chilling to others—that something like this is going on is exactly what makes it a modern moment. One’s hopes rest with the Never-Betters; one’s head with the Ever-Wasers; and one’s heart? Well, twenty or so books in, one’s heart tends to move toward the Better-Nevers, and then bounce back toward someplace that looks more like home.

Among the Never-Betters, the N.Y.U. professor Clay Shirky—the author of “Cognitive Surplus” and many articles and blog posts proclaiming the coming of the digital millennium—is the breeziest and seemingly most self-confident. “Seemingly,” because there is an element of overdone provocation in his stuff (So people aren’t reading Tolstoy? Well, Tolstoy sucks) that suggests something a little nervous going on underneath. Shirky believes that we are on the crest of an ever-surging wave of democratized information: the Gutenberg printing press produced the Reformation, which produced the Scientific Revolution, which produced the Enlightenment, which produced the Internet, each move more liberating than the one before. Though it may take a little time, the new connective technology, by joining people together in new communities and in new ways, is bound to make for more freedom. It’s the Wired version of Whig history: ever better, onward and upward, progress unstopped. In John Brockman’s anthology “Is the Internet Changing the Way You Think?,” the evolutionary psychologist John Tooby shares the excitement—“We see all around us transformations in the making that will rival or exceed the printing revolution”—and makes the same extended parallel to Gutenberg: “Printing ignited the previously wasted intellectual potential of huge segments of the population. . . . Freedom of thought and speech—where they exist—were unforeseen offspring of the printing press.”

Shirky’s and Tooby’s version of Never-Betterism has its excitements, but the history it uses seems to have been taken from the back of a cereal box. The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare. In the seventeen-fifties, more than two centuries later, Voltaire was still writing in a book about the horrors of those other books that urged burning men alive in auto-da-fé. Buried in Tooby’s little parenthetical—“where they exist”—are millions of human bodies. If ideas of democracy and freedom emerged at the end of the printing-press era, it wasn’t by some technological logic but because of parallel inventions, like the ideas of limited government and religious tolerance, very hard won from history.

Of course, if you stretch out the time scale enough, and are sufficiently casual about causes, you can give the printing press credit for anything you like. But all the media of modern consciousness—from the printing press to radio and the movies—were used just as readily by authoritarian reactionaries, and then by modern totalitarians, to reduce liberty and enforce conformity as they ever were by libertarians to expand it. As Andrew Pettegree shows in his fine new study, “The Book in the Renaissance,” the mainstay of the printing revolution in seventeenth-century Europe was not dissident pamphlets but royal edicts, printed by the thousand: almost all the new media of that day were working, in essence, for kinglouis.gov.

Even later, full-fledged totalitarian societies didn’t burn books. They burned some books, while keeping the printing presses running off such quantities that by the mid-fifties Stalin was said to have more books in print than Agatha Christie. (Recall that in “1984” Winston’s girlfriend works for the Big Brother publishing house.) If you’re going to give the printed book, or any other machine-made thing, credit for all the good things that have happened, you have to hold it accountable for the bad stuff, too. The Internet may make for more freedom a hundred years from now, but there’s no historical law that says it has to.

Many of the more knowing Never-Betters turn for cheer not to messy history and mixed-up politics but to psychology—to the actual expansion of our minds. The argument, advanced in Andy Clark’s “Supersizing the Mind” and in Robert K. Logan’s “The Sixth Language,” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness. We may not act better than we used to, but we sure think differently than we did.

Cognitive entanglement, after all, is the rule of life. My memories and my wife’s intermingle. When I can’t recall a name or a date, I don’t look it up; I just ask her. Our machines, in this way, become our substitute spouses and plug-in companions. Jerry Seinfeld said that the public library was everyone’s pathetic friend, giving up its books at a casual request and asking you only to please return them in a month or so. Google is really the world’s Thurber wife: smiling patiently and smugly as she explains what the difference is between eulogy and elegy and what the best route is to that little diner outside Hackensack. The new age is one in which we have a know-it-all spouse at our fingertips.

But, if cognitive entanglement exists, so does cognitive exasperation. Husbands and wives deny each other’s memories as much as they depend on them. That’s fine until it really counts (say, in divorce court). In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.

The books by the Better-Nevers are more moving than those by the Never-Betters for the same reason that Thomas Gray was at his best in that graveyard: loss is always the great poetic subject. Nicholas Carr, in “The Shallows,” William Powers, in “Hamlet’s BlackBerry,” and Sherry Turkle, in “Alone Together,” all bear intimate witness to a sense that the newfound land, the ever-present BlackBerry-and-instant-message world, is one whose price, paid in frayed nerves and lost reading hours and broken attention, is hardly worth the gains it gives us. “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”

These three Better-Nevers have slightly different stories to tell. Carr is most concerned about the way the Internet breaks down our capacity for reflective thought. His testimony about how this happened in his own life is plangent and familiar, but he addles it a bit by insisting that the real damage is being done at the neurological level, that our children are having their brains altered by too much instant messaging and the like. This sounds impressive but turns out to be redundant. Of course the changes are in their brains; where else would they be? It’s the equivalent of saying that playing football doesn’t just affect a kid’s fitness; it changes the muscle tone that creates his ability to throw and catch footballs.

Powers’s reflections are more family-centered and practical. He recounts, very touchingly, stories of family life broken up by the eternal consultation of smartphones and computer monitors:

Somebody excuses themselves for a bathroom visit or a glass of water and doesn’t return. Five minutes later, another of us exits on a similarly mundane excuse along the lines of “I have to check something.” . . . Where have all the humans gone? To their screens of course. Where they always go these days. The digital crowd has a way of elbowing its way into everything, to the point where a family can’t sit in a room together for half an hour without somebody, or everybody, peeling off. . . . As I watched the Vanishing Family Trick unfold, and played my own part in it, I sometimes felt as if love itself, or the acts of heart and mind that constitute love, were being leached out of the house by our screens.

He then surveys seven Wise Men—Plato, Thoreau, Seneca, the usual gang—who have something to tell us about solitude and the virtues of inner space, all of it sound enough, though he tends to overlook the significant point that these worthies were not entirely in favor of the kinds of liberties that we now take for granted and that made the new dispensation possible. (He knows that Seneca instructed the Emperor Nero, but sticks in a footnote to insist that the bad, fiddling-while-Rome-burned Nero asserted himself only after he fired the philosopher and started to act like an Internet addict.)


Similarly, Nicholas Carr cites Martin Heidegger for having seen, in the mid-fifties, that new technologies would break the meditational space on which Western wisdoms depend. Since Heidegger had not long before walked straight out of his own meditational space into the arms of the Nazis, it’s hard to have much nostalgia for this version of the past. One feels the same doubts when Sherry Turkle, in “Alone Together,” her touching plaint about the destruction of the old intimacy-reading culture by the new remote-connection-Internet culture, cites studies that show a dramatic decline in empathy among college students, who apparently are “far less likely to say that it is valuable to put oneself in the place of others or to try and understand their feelings.” What is to be done? Other Better-Nevers point to research that’s supposed to show that people who read novels develop exceptional empathy. But if reading a lot of novels gave you exceptional empathy, university English departments should be filled with the most compassionate and generous-minded of souls, and, so far, they are not.

One of the things that John Brockman’s collection on the Internet and the mind illustrates is that when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix. The world becomes Keats’s “waking dream,” as the writer Kevin Kelly puts it.

The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965. When department stores had Christmas windows with clockwork puppets, the world was going to pieces; when the city streets were filled with horse-drawn carriages running by bright-colored posters, you could no longer tell the real from the simulated; when people were listening to shellac 78s and looking at color newspaper supplements, the world had become a kaleidoscope of disassociated imagery; and when the broadcast air was filled with droning black-and-white images of men in suits reading news, all of life had become indistinguishable from your fantasies of it. It was Marx, not Steve Jobs, who said that the character of modern life is that everything falls apart.

We must, at some level, need this to be true, since we think it’s true about so many different kinds of things. We experience this sense of fracture so deeply that we ascribe it to machines that, viewed with retrospective detachment, don’t seem remotely capable of producing it. If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.

It is an intuition of this kind that moves the final school, the Ever-Wasers, when they consider the new digital age. A sense of vertiginous overload is the central experience of modernity, they say; at every moment, machines make new circuits for connection and circulation, as obvious-seeming as the postage stamps that let nineteenth-century scientists collaborate by mail, or as newfangled as the Wi-Fi connection that lets a sixteen-year-old in New York consult a tutor in Bangalore. Our new confusion is just the same old confusion.

Among Ever-Wasers, the Harvard historian Ann Blair may be the most ambitious. In her book “Too Much to Know: Managing Scholarly Information Before the Modern Age,” she makes the case that what we’re going through is like what others went through a very long while ago. Against the cartoon history of Shirky or Tooby, Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began. She wants us to resist “trying to reduce the complex causal nexus behind the transition from Renaissance to Enlightenment to the impact of a technology or any particular set of ideas.” Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.



from Hacker News https://ift.tt/c96D5AO
