Saturday, May 9, 2020

Our Bookless Future


Neuroscientist Maryanne Wolf had a surprise hit a dozen years ago, Proust and the Squid: The Story and Science of the Reading Brain (2007), a study of literacy’s role in the development of human cognition. But as she wrote the final sections, she realized the book had already become dated. The Digital Revolution had happened, and she was too buried in Sumerian scripts and Greek alphabets to notice. She felt like Rip van Winkle, she admits in her new book, Reader, Come Home, which comprises nine companionable letters addressed to anybody interested in the value of reading. Here, Wolf uses the tools of neuroscience to examine what happened to reading in that transition from old print to new screens—“how the circuitry of the reading brain would be altered by the unique characteristics of the digital medium, particularly in the young.” Her focus is not the reading mind, nor our tastes, knowledge, intelligence, or skills, but the physical organ inside our heads. Those other things are shaped by what our brains are able and disposed to do.

* * *

Wolf begins with a genetic fact: “human beings were never born to read.” Literacy is an epigenetic achievement, extending our biological capacities for vision and language into a new “circuitry” that performs wondrous feats—not only the creation of masterpieces such as Rainer Maria Rilke’s Duino Elegies, which grabbed Wolf when she was young, but the capacity to imagine other selves and worlds, follow complex arguments, and acquire and store knowledge. She calls it “an unnatural cultural invention,” but it did more than transform oral cultures into print cultures. Literacy altered the human brain, making it “refit some of its existing neuronal groups” and “form newly recycled circuits.” The brain had to change because the innate brain can’t read. It responds to whatever it is exposed to, provided the exposure is frequent and sustained. Literacy develops through practice—through labor that compels the development of revised brain functions. The more you read, the more your brain adapts. It is a “plastic” organ.

What follows Wolf’s opening statements is a tour de force description of the physiology of reading. She breaks down the seemingly simple process of understanding letters, words, and sentences into its cognitive pieces, tracking the course of reading through different lobes (frontal, temporal, parietal, and occipital) and layers (uppermost telencephalon, etc.). Her presentation can’t be condensed here because the steps are so numerous and complex. One paragraph on how a reader begins to apprehend a single word through “spotlights of attention” gives an idea of this complexity:

The first spotlights, which do the work of the orienting attentional system, have three quickly accomplished jobs. First, they help us disengage from whatever we were originally attending to—which takes place in the parietal lobe of our cortex (i.e., in the telencephalon’s uppermost layer). Second, they help us move our attention to whatever is in front of us…. This act of moving our visual attention takes place deep in our midbrain (i.e., in the mesencephalon, or third layer). Third, they help focus our new attention and, in so doing, alert the entire reading circuit to prepare for action. This last pre-reading focusing of attention takes place in a special area below the cortex that functions as one of the brain’s major switchboards: the very important thalamus, which resides in the diencephalon, or second layer, of each hemisphere.

And that’s what happens before any proper reading can begin! Keep in mind, too, that this particular spotlight of attention is a primary step in an “entire reading circuit.” It wouldn’t happen if the brain were not anticipating the reading of a word.

As the brain proceeds to deeper cognitions of written words, it interprets the content as if the reader were actually experiencing it. The reading brain and the print it reads become so intertwined that when one reads the scene in which Anna Karenina leaps to her death, “you leaped, too.” “In all likelihood,” brain scanning shows, “the same neurons you deploy when you move your legs and trunk were also activated when you read that Anna jumped before the train.” Wolf takes this as proof that reading fosters empathy and imagination, two marvelous consequences of literacy. Reading also cultivates “cognitive patience,” necessary to the critical intellect as it struggles against “novelty bias”—the human preference for new phenomena because they are new. Reading activates background knowledge, too: the process of comprehension combines what we are reading with the knowledge we already have that is somehow related to the matter at hand, thus keeping that knowledge alive in us.

The very inventedness of literacy, however, along with the neuroplasticity of the brain, troubles Wolf in the second half of Reader, Come Home. If reading is not natural but invented, it can deteriorate. If the brain adapted to print because of repeated exposure, it can adapt away if exposure slows. The circuit will break if unused, or if something different from print draws more of the brain’s attention. If screens take the place of paper, the brain will react.

“What Will Become of the Readers We Have Been?” she asks in Letter Four, which rehearses oft-noted drawbacks of the screen habitat. Hyperstimulation is one; “continuous partial attention,” whereby one attends to several things at once—Instagram, TV, texting—but none deeply, is another; an environment that rewards short attention spans is a third. Wolf backs these observations with neurological data. She quotes other scientists who say, for instance, “[m]ultitasking created a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus.” She cites data on the extent to which virtual reading has replaced paper reading (for example, 20-year-olds checking their phones 150-190 times per day). She points out how the “physical and temporal thereness of books” provides a tactile support for the reading circuit’s development that e-books do not, while noting e-books nonetheless continue to spread to classrooms and children’s bedrooms. “Behind our screens, at work and at home, we have sutured the temporal segments of our days so as to switch our attention from one task or one source of stimulation to another. We cannot but be changed.”

* * *

In these later letters, Wolf sounds like the kind of alarmist digital enthusiasts often deride. After all, they say, reading is not dying; it’s thriving. Wolf herself quotes a study from the University of California, San Diego, showing that an average user consumes 34 gigabytes of data per day—the equivalent of nearly 100,000 words.

Wolf’s answer comes, once again, from neuroscientific studies revealing significant cognitive and affective differences between print and screen reading, and between “deep reading” and fast reading—differences that show up in brain activity. In one study, researchers “were frankly surprised that just by asking their literature graduate students either to read closely or to read for entertainment, different regions of the brain became activated, including multiple areas involved in motion and touch.” In another, one group read a story on paper and another on screens; the paper readers reconstructed the plot more accurately—for a book, unlike a virtual text, gives the brain a concrete spatial arrangement for the action. In sum, Wolf says, the paper-reading brain has better memory, more imagination, immersion, and patience, and more knowledge than the screen-reading brain. The physiology proves it.

Wolf’s pleading tone—“reader, please, come home”—follows from the fact that, in spite of the dangers, screen time is displacing book time. We are in trouble. “The more we read digitally,” she warns, “the more our underlying brain circuitry reflects the characteristics of that medium.” For six millennia, reading compelled the human brain to deepen and widen its cognitions. It is a glorious achievement, threatened by a “fundamental tension between our evolutionary wiring and contemporary culture.” We are moving backward.

* * *

Or are we? Harvard English professor Leah Price thinks Wolf’s worry overdone—that bibliophiles who regret the advent of screens are caught up in a myth of the deep, all-absorbed book reader whom they take as representative of reading before computers. Book-lovers such as Sven Birkerts (The Gutenberg Elegies, 1994), Nicholas Carr (“Is Google Making Us Stupid?”), and the National Endowment for the Arts staff who produced the 2004 report Reading at Risk (I worked there at the time) misconstrue reading’s history. Price mentions them all and chides them for seeing book reading as an intense, deliberate, solitary affair and believing in dreamy notions such as “an unmediated communion between a reader’s mind and an author’s.” Their position is as much mood as belief, Price concludes—and “[t]hat mood is fear.”

In truth, Price writes, “serious, silent, solitary cover-to-cover reading has never been more than one of many uses to which print had been put.” Multitasking is nothing new. Reading was usually cursory, haphazard, and social—purposely so. And the digital practices bibliophiles regret have much in common with other uses of reading—for instance, reading a blog every other day and reading pamphlets that flooded cities in the 18th century. Reading wasn’t always idealized, either. The total absorption that anti-digital critics praise was often suspected of corrupting the youth and encouraging idleness. Finally, Price argues, reading habits have changed over time not only because of advances in print technology, but also because of changes in modern infrastructure such as the rise of public transportation. What bibliophiles really fear isn’t the disappearance of books but the elimination of the time and space needed to enjoy them.

Those are the contentions, which sound sweeping enough to require an encyclopedic treatment of book reading from ancient times forward. But What We Talk About When We Talk About Books has only 170 small, loosely spaced pages of text. The presentation is casual, the prose fluid. Price mixes different phenomena freely—for example, her direct experience of books with projects of “bibliotherapy” (book reading in medical care). Renting the New York Public Library staircase for events, which became popular after the 2008 film version of Sex and the City, gets as much attention as reading in the antebellum and Jim Crow eras. In but a few pages we leap from 15th-century papal indulgences to print production in Britain in 1900 to Ben Franklin’s printing business (which produced no books) to the Amazon Kindle—then back to Jane Austen and her complaint that readers borrowed books instead of buying them. Facts come and go. At one point, in a section headed “Interleaf: Please Lay Flat” (which begins with Price injuring her spine hefting a backpack stuffed with tomes and a laptop), each line of text runs across the seam and covers both pages. It’s disconcerting, but it makes Price’s point about the variability of reading.

* * *

The details Price compiles are fascinating and amusing. They nicely illustrate, too, older book-uses more or less consonant with digital habits. For instance:

  • The most popular book in Colonial America after the Bible was the New England Primer, but “nowhere in the world today can you find a copy of the Primer published before 1727.” It was too common for librarians to bother collecting. Sometimes, the most influential books are judged disposable.
  • Book owning didn’t always mean book reading. Hemingway praised James Joyce’s Ulysses loudly, but “his copy survives crisp and clean to this day, with only the first and last pages cut.”
  • Celebrity potboilers have dominated the market for a long time. In 1813, Jane Austen’s Pride and Prejudice sold a meager 1,500 copies, while Robert Southey’s biography of Lord Nelson and his scandalous private life sold out twice.
  • Rather than touting their latest books as compelling page-turners, publishers sometimes did the opposite, playing up a book’s capacity to “be taken up and laid down without inconvenience,” as an 1835 ad put it. Price comments: “For us, interrupting a book denotes impulsivity and impatience. But for most of print’s history, it proved civilized self-restraint.”
  • The late Middle Ages had a nifty version of our cheap paperback (or tablet), the so-called “girdle book” that hung from one’s clothing and accompanied monks everywhere in a pre-modern form of “portability.”
  • Bookstores and libraries haven’t always been ground zero for browsing and buying or borrowing. One secret of Penguin’s commercial success was to sell books on train platforms and in tobacco shops.
  • In the past, reading could be a source of knowledge—or a sign of derangement, as it was for Don Quixote, as Cervantes tells us: “with virtually no sleep and so much reading he dried out his brain.” Goethe’s The Sorrows of Young Werther provoked numerous copycat suicides. Nineteenth-century libraries worried enough about the effect of excessive novel reading on young people to cap the number of books they could borrow.
  • The popular notion that web users have abandoned books is flat wrong, Price asserts, for “people of the book are also people of the cloud.” If you like to read online, you tend to like to read paper, too. This merely follows the historical record showing “how rarely one technology supersedes another.” Théophile Gautier declared in 1835 that the newspaper was killing the book; Thomas Edison in 1913 predicted that movies would make books obsolete in classrooms.

* * *

These are diverting facts and allusions. They add up to a persuasive case for the devoted bookworm as just one type of past reader. Nevertheless, the sound advice that we must beware of idealizing past reading doesn’t mean there is no present reading crisis. Price cites rising publishing revenues as evidence against the sky-is-falling view. What she doesn’t cite is a Bureau of Labor Statistics study that found Americans read “for personal interest” only 15 minutes a day—and 15-to-24-year-olds only six minutes a day, despite having five-and-a-half hours of leisure time to fill. The National Endowment for the Arts reports that from 1992 to 2017 the portion of adults who read any book (not for work or school) over the preceding year fell from 60.9% to 52.7%. A recent study in Psychology of Popular Media Culture noted that in the late ’70s 60% of teens reported reading a book or magazine every day; by 2016 the rate had plummeted to 16%. Test scores have followed suit. On the 2018 ACT, only 46% of test-takers reached “college readiness” in reading, a drop of six percentage points from 2011. SAT verbal scores from 1972 to 2016 fell 36 points. All of this happened despite billions of education dollars and major legislation devoted to reading instruction. Low-income kids spend much more time on screens than do high-income kids, by the way, and they score more poorly on reading tests. (We still have a “digital divide,” but it’s the reverse of the one from 20 years ago, when Democrats worried that the poor weren’t adequately connected and would fall further behind.) There is a reading problem, and the historical evidence Price assembles touches only its edges.

It won’t be long before all living memory of a time before the personal computer is gone. People will no longer address the meaning of screens from the remembered background of a computer-free life. Leah Price and Maryanne Wolf grew up with books; they had a print childhood, not a digital one. Price aims to soften the impact of the Digital Revolution, I suspect, because of a liberal impulse to accept cultural change with an urbane smile. That’s the going etiquette. Many times I have watched my humanities colleagues greet news of popular culture drifting ever farther from their intellectual interests with a shrug. To them it is unseemly to criticize people for their cultural choices. But with every survey showing meager reading time and massive screen time in the leisure hours of the young, it is increasingly difficult not to share Wolf’s dismay.



