There's a telling episode about a quarter of the way into Now You See It, Cathy N. Davidson's impassioned manifesto on the way digital tools should transform how we learn and work. Davidson is—well, it's hard to say what she is. Her official title is the John Hope Franklin Humanities Institute professor of interdisciplinary studies at Duke University. She is also the co-founder of the Humanities, Arts, Science, and Technology Advanced Collaboratory, "a network of innovators dedicated to new forms of learning for the digital age."
Davidson was doing whatever it is that she does at work one morning when her colleagues' kids, three 9-year-olds released from school on a snow day, began playing a video game in a nearby office. This delighted Davidson. In line with her unconventional job description, she likes anything that departs from the customary way of doing things, especially the customary way of educating children. At the end of the day, after they'd spent eight straight hours playing LittleBigPlanet, Davidson intercepted the kids, "eager to hear all that they had learned in a day of the kind of interactive, challenging game play that we were pushing as the next frontier of learning."
"What did you learn today?" she asked brightly. Their answer was not what she was hoping for. They "dissolved into helpless, hysterical laughter." Nothing, they told her. "You didn't learn anything?" Davidson tried again. "No! No!" The kids laughed some more. "No way." Davidson returned to her office, feeling crushed. But she soon rallied: These children have been so mishandled by traditional schools, she decided, that they don't know what learning is.
Another interlocutor might have taken the kids at their word: They did not actually learn much from their day sitting in front of the screen. But the author's response is classic Davidson. She is easily moved to rapture and to dismay, propelled by an enthusiasm for anything new and digital and by an almost allergic aversion to any practices or artifacts from the pre-Internet era. Nor does she have much use for people born before 1980. The inhabitants and the knowledge of the past are merely obstacles to be trampled in the headlong rush toward an interactive, connected, collaborative future. This future, as Davidson imagines it, is one in which games replace books, online collaboration replaces individual effort, and "crowdsourced" verdicts replace expertise. Electronic media have turned all of society's institutions on their heads, and in this topsy-turvy new world it is the young who will lead the old, the students who will instruct the teachers, the interns who will oversee the bosses. Adults should be taking their cues from children (except, apparently, when children don't say what we want them to).
Davidson's youth worship, though extreme, is common these days among those who write about technology and society. Individuals born after the dawn of the Internet are not the same as you and me, goes the now-familiar refrain. As a result of their lifelong immersion in electronic media, young people's brains are "wired differently," and they require different schools, different workplaces, and different social arrangements from the ones we have. They are described, with more than a little envy, as "digital natives," effortlessly at home in an electronic universe, while we adults are "digital immigrants," benighted arrivals from the Old World doomed to stutter in a foreign tongue.
But before this view calcifies into common wisdom, it's worth examining whether it's an accurate or useful understanding of generational change. Thinkers like Davidson who insist on difference and disjunction, on a chasm between then and now, us and them, overlook important continuities that call such accounts into serious question. In particular, Davidson's claim that young people can, and should, pay attention in new and different ways doesn't stand up to scrutiny.
There is the essential sameness, first of all, of the neural architecture of all humans, both young and old. According to Davidson, our habits of attention are entirely learned. Pre-Google adults learned to attend to static words printed on paper; our digital-age children have learned to attend to hypertext and multimedia. She fails to acknowledge, however, that we're all using the same basic equipment, shaped in specific ways by evolution. "What does it mean to say that we learn to pay attention?" Davidson writes. "It means no one is born with innate knowledge of how to focus or what to focus on. Infants track just about anything and everything and have no idea that one thing counts as more worthy of attention than another. They eventually learn because we teach them, from the day they are born, what we consider to be important enough to focus on."
This is simply wrong. Experiments by Andrew Meltzoff of the University of Washington and many others have shown that newborns only minutes old choose to focus on human faces over inanimate objects—a preference found across time and across cultures—and will even imitate simple gestures like sticking out the tongue. A study published earlier this summer found that the brains of infants as young as three months of age show greater activity when they hear sounds made by people than when they hear equally familiar sounds made by toys or water—evidence of the early development of a brain region specialized for inter-human communication. Throughout their lives, people focus on the prospects of danger, sex, and food, not because our parents taught us these things are important but because that's how we've been built to survive.
There are many other more subtle biases of the evolved human brain—its tendency to focus on the thing that changes rather than the thing that's constant, for example, or its predisposition to remember stories and pictures over abstract ideas—and their pervasive influence in shaping the way all of us focus and pay attention makes the idea that young people are "wired" completely differently seem rather facile. Our parents grew up listening to the radio, and we grew up watching TV; if our children grow up Facebook-ing and YouTube-ing, that's less a radical break than the elaboration of a technology that, after all, we created and shared with them.
Davidson neglects, too, the research showing the ways in which our attention is constrained by the cognitive capacities of our brains. She dismisses the value of single-minded focus, and the concerns of students and workers who struggle to cope with the multiplying demands on their attention. "If we feel distracted, that is not a problem," she declares. "It is an awakening." What we call distractions, Davidson says, are actually "opportunities for learning." We must give up the old, 20th-century way of paying attention, suited to a vanishing industrial era, and adopt a new, 21st-century way of paying attention, tailored to a digital epoch. Her position ignores the inflexible and near-universal limits on our working memory, which allow us to hold only a few items of information in our consciousness at a time, as well as the work of researchers like Clifford Nass of Stanford University. "Human cognition is ill-suited both for attending to multiple input streams and for simultaneously performing multiple tasks," Nass has written. In other words, people are inherently lousy at multitasking. Contrary to the notion that those who've grown up multitasking a lot have learned to do it well, Nass's research has found that heavy multitaskers are actually less effective at filtering out irrelevant information and at shifting their attention among tasks than others.
If it seems that some young people are more adept than their elders at handling multiple streams of information—at, say, doing their homework while also emailing, texting, Googling, Digging, iTuning, and Angry Birding—that may be a developmental difference rather than a cultural one. As we grow older, our brains change in predictable ways. We're less able to block out distractions, less able to hold many facts in our working memory. Members of the Internet generation aren't some exotic new breed of human, in other words. They're simply the young of the same species. And they won't be young forever. The digital age has brought all of us new and exciting tools that will surely continue to alter the way we learn and work. But focusing one's attention, gathering and synthesizing evidence, and constructing a coherent argument are skills as necessary as they ever were—in fact, more necessary than ever, given the swamp of baseless assertion and outright falsehood that is much of the Web. Someday not too far in the future, the digital natives may find themselves turning down the music, shutting off the flickering screen, silencing the buzzing phone, and sitting down to do just one thing at a time.