Clay Shirky is a professor of media studies at New York University, a consultant on the Internet, and a writer. He is writing here about teaching his university courses and a recent decision that he made. The post appeared September 9, 2014.
I teach theory and practice of social media at NYU, and am an advocate and activist for the free culture movement, so I’m a pretty unlikely candidate for internet censor, but I have just asked the students in my fall seminar to refrain from using laptops, tablets, and phones in class.
I came late and reluctantly to this decision — I have been teaching classes about the internet since 1998, and I’ve generally had a laissez-faire attitude towards technology use in the classroom. This was partly because the subject of my classes made technology use feel organic, and when device use went well, it was great. Then there was the competitive aspect — it’s my job to be more interesting than the possible distractions, so a ban felt like cheating. And finally, there’s not wanting to infantilize my students, who are adults, even if young ones — time management is their job, not mine.
Despite these rationales, the practical effects of my decision to allow technology use in class grew worse over time. The level of distraction in my classes seemed to grow, even though it was the same professor and largely the same set of topics, taught to a group of students selected using roughly the same criteria every year. The change seemed to correlate more with the rising ubiquity and utility of the devices themselves, rather than any change in me, the students, or the rest of the classroom encounter.
Over the years, I’ve noticed that when I do have a specific reason to ask everyone to set aside their devices (‘Lids down’, in the parlance of my department), it’s as if someone has let fresh air into the room. The conversation brightens, and more recently, there is a sense of relief from many of the students. Multi-tasking is cognitively exhausting — when we do it by choice, being asked to stop can come as a welcome change.
So this year, I moved from recommending setting aside laptops and phones to requiring it, adding this to the class rules: “Stay focused. (No devices in class, unless the assignment requires it.)” Here’s why I finally switched from ‘allowed unless by request’ to ‘banned unless required’.
We’ve known for some time that multi-tasking is bad for the quality of cognitive work, and is especially punishing of the kind of cognitive work we ask of college students.
This effect takes place over more than one time frame — even when multi-tasking doesn’t significantly degrade immediate performance, it can have negative long-term effects on “declarative memory,” the kind of focused recall that lets people characterize and use what they learned from earlier studying. (Multi-tasking thus makes the famous “learned it the day before the test, forgot it the day after” effect even more pernicious.)
People often start multi-tasking because they believe it will help them get more done. Those gains never materialize; instead, efficiency is degraded. However, it provides emotional gratification as a side-effect. (Multi-tasking moves the pleasure of procrastination inside the period of work.) This side-effect is enough to keep people committed to multi-tasking even though it worsens the very thing they set out to improve.
On top of this, multi-tasking doesn’t even exercise task-switching as a skill. A study from Stanford reports that heavy multi-taskers are worse at choosing which task to focus on. (“They are suckers for irrelevancy”, as Cliff Nass, one of the researchers, put it.) Multi-taskers often think they are like gym rats, bulking up their ability to juggle tasks, when in fact they are like alcoholics, degrading their abilities through over-consumption.
This is all just the research on multi-tasking as a stable mental phenomenon. Laptops, tablets and phones — the devices on which the struggle between focus and distraction is played out daily — are making the problem progressively worse. Any designer of software as a service has an incentive to be as ingratiating as they can be, in order to compete with other such services. “Look what a good job I’m doing! Look how much value I’m delivering!”
This problem is especially acute with social media, because on top of the general incentive for any service to be verbose about its value, social information is immediately and emotionally engaging. Both the form and the content of a Facebook update are almost irresistibly distracting, especially compared with the hard slog of coursework. (“Your former lover tagged a photo you are in” vs. “The Crimean War was the first conflict significantly affected by use of the telegraph.” Spot the difference?)
Worse, the designers of operating systems have every incentive to be arms dealers to the social media firms. Beeps and pings and pop-ups and icons, contemporary interfaces provide an extraordinary array of attention-getting devices, emphasis on “getting.” Humans are incapable of ignoring surprising new information in our visual field, an effect that is strongest when the visual cue is slightly above and beside the area we’re focusing on. (Does that sound like the upper-right corner of a screen near you?)
The form and content of a Facebook update may be almost irresistible, but when combined with a visual alert in your immediate peripheral vision, it is—really, actually, biologically—impossible to resist. Our visual and emotional systems are faster and more powerful than our intellect; we are given to automatic responses when either system receives stimulus, much less both. Asking a student to stay focused while she has alerts on is like asking a chess player to concentrate while someone raps the player’s knuckles with a ruler at unpredictable intervals.
Jonathan Haidt’s metaphor of the elephant and the rider is useful here. In Haidt’s telling, the mind is like an elephant (the emotions) with a rider (the intellect) on top. The rider can see and plan ahead, but the elephant is far more powerful. Sometimes the rider and the elephant work together (the ideal in classroom settings), but if they conflict, the elephant usually wins.
After reading Haidt, I’ve stopped thinking of students as people who simply make choices about whether to pay attention, and started thinking of them as people trying to pay attention but having to compete with various influences, the largest of which is their own propensity towards involuntary and emotional reaction. (This is even harder for young people, the elephant so strong, the rider still a novice.)
Regarding teaching as a shared struggle changes the nature of the classroom. It’s not me demanding that they focus — it’s me and them working together to help defend their precious focus against outside distractions. I have a classroom full of riders and elephants, but I’m trying to teach the riders.
And while I do, who is whispering to the elephants? Facebook, WeChat, Twitter, Instagram, Weibo, Snapchat, Tumblr, Pinterest, the list goes on, abetted by the designers of the Mac, iOS, Windows, and Android. In the classroom, it’s me against a brilliant and well-funded army (including, sharper than a serpent’s tooth, many of my former students). These designers and engineers have every incentive to capture as much of my students’ attention as they possibly can, without regard for any commitment those students may have made to me or to themselves about keeping on task.
It doesn’t have to be this way, of course. Even a passing familiarity with the literature on programming, a famously arduous cognitive task, will acquaint you with stories of people falling into code-flow so deep they lose track of time, forgetting to eat or sleep. Computers are not inherent sources of distraction — they can in fact be powerful engines of focus — but latter-day versions have been designed to be, because attention is the substance which makes the whole consumer internet go.
The fact that hardware and software is being professionally designed to distract was the first thing that made me willing to require rather than merely suggest that students not use devices in class. There are some counter-moves in the industry right now — software that takes over your screen to hide distractions, software that prevents you from logging into certain sites or using the internet at all, phones with Do Not Disturb options — but at the moment these are rear-guard actions. The industry has committed itself to an arms race for my students’ attention, and if it’s me against Facebook and Apple, I lose.
The final realization — the one that firmly tipped me over into the “No devices in class” camp — was this: screens generate distraction in a manner akin to second-hand smoke. A paper with the blunt title “Laptop Multitasking Hinders Classroom Learning for Both Users and Nearby Peers” says it all:
We found that participants who multitasked on a laptop during a lecture scored lower on a test compared to those who did not multitask, and participants who were in direct view of a multitasking peer scored lower on a test compared to those who were not. The results demonstrate that multitasking on a laptop poses a significant distraction to both users and fellow students and can be detrimental to comprehension of lecture content.
I have known, for years, that the basic research on multi-tasking was adding up, and that for anyone trying to do hard thinking (our spécialité de la maison, here at college), device use in class tends to be a net negative. Even with that consensus, however, it was still possible to imagine that the best way to handle the question was to tell the students about the research, and let them make up their own minds.
The “Nearby Peers” effect, though, shreds that rationale. There is no laissez-faire attitude to take when the degradation of focus is social. Allowing laptop use in class is like allowing boombox use in class — it lets each person choose whether to degrade the experience of those around them.
Groups also have a rider-and-elephant problem, best described by Wilfred Bion in an oddly written but influential book, Experiences in Groups. In it, Bion, who practiced group therapy, observed how his patients would unconsciously coordinate their actions to defeat the purpose of therapy. In discussing the ramifications of this, Bion observed that effective groups often develop elaborate structures, designed to keep their sophisticated goals from being derailed by more primal group activities like gossiping about members and vilifying non-members.
The structure of a classroom, and especially a seminar room, exhibits the same tension. All present have an incentive for the class to be as engaging as possible; even though engagement often means waiting to speak while listening to other people wrestle with half-formed thoughts, that’s the process by which people get good at managing the clash of ideas. Against that long-term value, however, each member has an incentive to opt out, even if only momentarily. The smallest loss of focus can snowball, the impulse to check WeChat quickly and then put the phone away leading to just one message that needs a reply right now, and then, wait, what happened last night??? (To the people who say “Students have always passed notes in class”, I reply that old-model notes didn’t contain video and couldn’t arrive from anywhere in the world at 10 megabits a second.)
I have the good fortune to teach in cities richly provisioned with opportunities for distraction. Were I a 19-year-old planning an ideal day in Shanghai, I would not put “Listen to an old guy talk for an hour” at the top of my list. (Vanity prevents me from guessing where it would go.) And yet I can teach the students things they are interested in knowing, and despite all the literature on joyful learning, from Maria Montessori on down, some parts of making your brain do new things are just hard.
Indeed, college contains daily exercises in delayed gratification. “Discuss early modern European print culture” will never beat “Sing karaoke with friends” in a straight fight, but in the long run, having a passable Rihanna impression will be less useful than understanding how media revolutions unfold.
Anyone distracted in class doesn’t just lose out on the content of the discussion, they create a sense of permission that opting out is OK, and, worse, a haze of second-hand distraction for their peers. In an environment like this, students need support for the better angels of their nature (or at least the more intellectual angels), and they need defenses against the powerful short-term incentives to put off complex, frustrating tasks. That support and those defenses don’t just happen, and they are not limited to the individual’s choices. They are provided by social structure, and that structure is disproportionately provided by the professor, especially during the first weeks of class.
This is, for me, the biggest change — not a switch in rules, but a switch in how I see my role. Professors are at least as bad at estimating how interesting we are as the students are at estimating their ability to focus. Against oppositional models of teaching and learning, both negative—Concentrate, or lose out!—and positive—Let me attract your attention!—I’m coming to see student focus as a collaborative process. It’s me and them working to create a classroom where the students who want to focus have the best shot at it, in a world increasingly hostile to that goal.
Some of the students will still opt out, of course, which remains their prerogative and rightly so, but if I want to help the ones who do want to pay attention, I’ve decided it’s time to admit that I’ve brought whiteboard markers to a gun fight, and act accordingly.