Brian X. Chen, "Help! My Smartphone Is Making Me Dumb—or Maybe Not," Wired, October 4, 2010. www.wired.com Copyright © 2010 by Wired. All rights reserved. Reproduced by permission.
"Is the smartphone—like Google, TV, comics and the movies before it—actually making us dumb?"
In the following viewpoint, Brian X. Chen asserts that the smartphone and its "always-on network" enable some individuals to multitask effectively. Contrary to research claiming that these technologies have only negative effects on concentration and memory, he says, other studies demonstrate that humans do have the capability to juggle their attention between activities. Indeed, Chen suggests, some experts point out that multitasking is not unique to technologies like smartphones, and research is still inconclusive regarding the effects of digital multitasking on the brain. Chen is a technology reporter for the New York Times and author of Always On: How the iPhone Unlocked the Anything-Anytime-Anywhere Future—and Locked Us In.
As you read, consider the following questions:
- What issue of causality arises with the Stanford University study supporting the negative effects of multitasking, as told by the author?
- How did "supertaskers" perform in a University of Utah study, as described by the author?
- What examples of multitasking with very little technology does the viewpoint offer?
Chicago resident Matt Sallee's life is a never-ending sprint that mostly takes place in his phone. At 5 in the morning the alarm goes off, and during his train commute the 29-year-old rolls through 50 e-mails he received overnight on his BlackBerry.
As a manager of global business development at an LED company, Sallee works in time zones spanning three continents.
"I love having 10 different things cooking at once, but for me it's all moving in little pieces, and when it comes time that there are big deliverables needed, I don't have to scramble at the last minute," Sallee said. "It's an hour of combining all the little pieces into one thing, and it's done."
It's not news that the "always-on network" is eradicating the borders between home and office, and changing the way people work and play. But how much distraction can one person take? Research is still in the early stages, and there is little hard evidence that 24/7 access to information is bad for you. But the image of frantic, distracted workers scrabbling harder than ever for ever-diminishing social and economic returns is an attractive target for critics.
Not only is it annoying to see people chatting on cell phones in the popcorn line at the cinema, but these devices—and the multitasking they encourage—could also be taking a massive toll on our psyches, and perhaps even fundamentally altering the way our brains are wired, some dystopian-minded critics suggest.
Is the smartphone—like Google, TV, comics and the movies before it—actually making us dumb?
Some of the latest arguments to critique this 24/7 online culture include the book The Shallows by Nicholas Carr, who argues that the Internet is rewiring us into shallow, inattentive thinkers, along with a New York Times feature series by Matt Richtel titled "Your Brain on Computers," a collection of stories examining the possible negative consequences of gadget overload. (Disclosure: I'm currently writing a book called Always On that explores similar topics.)
Giving credence to such claims, an oft-cited Stanford study published last year found that people who were rated "heavy" multitaskers were less able to concentrate on a single task and also worse at switching between tasks than those who were "light" multitaskers.
"We have evidence that high multitaskers are worse at managing their short-term memory and worse at switching tasks," said Clifford Nass, a Stanford University professor who led the study. He's the author of the upcoming book The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships.
One test asked students to recall the briefly glimpsed orientations of red rectangles surrounded by blue rectangles. The students had to determine whether the red rectangles had shifted in position between different pictures. Those deemed heavy multitaskers struggled to keep track of the red rectangles, because they were having trouble ignoring the blue ones.
To measure task-switching ability, another test presented participants with a letter-and-number combination, like b6 or f9. Subjects were asked to do one of two tasks: One was to hit the left button if they saw an odd number and the right for an even; the other was to press the left for a vowel and the right for a consonant.
They were warned before each letter-number combination appeared what the task was to be, but high multitaskers still responded, on average, half a second more slowly when the task was switched.

The Stanford study is hardly undisputed. A deep analysis recently published by Language Log's Mark Liberman criticized the study for its small sample group: Only 19 of the students who took the tests were deemed "heavy multitaskers."
He added that there also arises an issue of causality: Were these high multitaskers less able to filter out irrelevant information because their brains were damaged by media multitasking, or are they inclined to engage with a lot of media because they have easily distractible personalities to begin with?
"What's at stake here is a set of major choices about social policy and personal lifestyle," Liberman said. "If it's really true that modern digital multitasking causes significant cognitive disability and even brain damage, as Matt Richtel claims, then many very serious social and individual changes are urgently needed."
"Before starting down this path, we need better evidence that there's a real connection between cognitive disability and media multitasking (as opposed to self-reports of media multitasking)," he added. "We need some evidence that the connection exists in representative samples of the population, not just a couple of dozen Stanford undergraduates enrolled in introductory psychology."
Other research also challenges the conclusions of the Stanford study. A University of Utah study published this year identified some people who are excellent at multitasking, a group the researchers dubbed "supertaskers."
Researchers Jason Watson and David Strayer put 200 college undergrads through a driving simulator, where they were required to "drive" behind a virtual car and brake whenever its brake lights shone, while at the same time performing various tasks, such as memorizing and recalling items in the correct order and solving math problems.
Watson and Strayer analyzed the students based on their speed and accuracy in completing the tasks. The researchers discovered that an extremely small minority—just 2.5 percent (three men and two women) of the subjects—showed absolutely no performance loss when performing dual tasks versus single tasks. In other words, these few individuals excelled at multitasking.
Also in contrast with the results of the Stanford study, the supertaskers were better at task-switching and performing individual tasks than the rest of the group.
The rest of the group, on the other hand, showed degraded overall performance when handling dual tasks compared with a single task, suggesting that the vast majority of people might indeed be poor at juggling multiple activities. But the discovery of supertaskers challenges the popular notion that human brains are simply not built to multitask, Watson and Strayer say, and it shows that this area of research is still largely unexplored.
"Our results suggest that there are supertaskers in our midst—rare but intriguing individuals with extraordinary multitasking ability," Watson and Strayer wrote. "These individual differences are important, because they challenge current theory that postulates immutable bottlenecks in dual-task performance."
Born to Multitask
If the multitasking naysayers claim we're being drowned in data, the same can't be said of their studies. In fact, the research is still too preliminary to be conclusive, argues Vaughan Bell, a neuropsychologist and clinician at the Universidad de Antioquia in Colombia.
"The idea that new technology is 'overloading us' in some way is as old as technology," he said. "Secondly, the idea that 'technology is damaging the brain' in some way just isn't borne out by the evidence."
Bell points out that multitasking is hardly a problem of the digital age—we've been doing it all along. We can dribble a basketball while running, jot down notes while listening to a lecture, and jog through the park while listening to music.
"If you think Twitter is an attention magnet, try living with an infant," Bell said. "Kids are the most distracting thing there is, and when you have three or even four in the house it is both impossible to focus on one thing and stressful, because the consequences of not keeping an eye on your kids can be frightening even to think about."
(Kids are indeed distracting: A British study found that for drivers, the distraction of squabbling kids can slow down brake-reaction times by 13 percent—as much as alcohol.)
Bell added that residents of poorer neighborhoods with very little technology (like Medellin, Colombia, where he lives) hardly lead distraction-free lives. They have to watch their food because there is no timer; washing clothes must be done by hand while keeping an eye on everything else; and street vendors pass the house shouting what they're selling, and if you miss them, your family could go without food for a day.

In short, the 24/7 multitasking lifestyle is nothing new, Bell said; for centuries, everywhere in the world, a multitude of demands has competed for our attention.
Both Bell and Stanford's Nass agree on one major misconception: To say that the brain has been "damaged" as a result of multitasking is a dangerously inaccurate statement. The brain, after all, is supposed to change every moment of every day; that is simply what it does. A truly "damaging" effect on the brain could only be demonstrated by gross changes in the organ, such as obvious tissue lesions or atrophy, Bell said.
A solid consensus on digital multitasking is unlikely to be reached anytime soon, perhaps because the Internet and technology are so broadly encompassing, and there are so many different ways we consume media. Psychological studies have seen mixed results with different types of technology. For example, some data shows that gamers are better at absorbing and reacting to information than non-gamers. In contrast, years of evidence suggest that television viewing has a negative impact on teenagers' ability to concentrate. The open question is whether tech-savvy multitaskers consume certain types of media more heavily than others, and whether different media affect them in different ways.
A research paper authored by a group of cognitive scientists titled "Children, Wired: For Better and for Worse" sums it up best:
"One can no more ask, 'How is technology affecting cognitive development?' than one can ask, 'How is food affecting physical development?'" the researchers wrote. "As with food, the effects of technology will depend critically on what type of technology is consumed, how much of it is consumed, and for how long it is consumed."
Hooked on Social Media
Researchers are continuing to examine the effects of multitasking on our ability to focus, but another soon-to-be-published study offers a glimpse into a potentially negative social change in the smartphone era.

University of Kansas researchers recently polled a group of 348 students and found that most of them (83 percent) believed that texting while driving was unsafe—even more unsafe than talking on the phone while driving—but 98 percent of them admitted to doing it anyway. The study's 89-item questionnaire asked students to rate the perceived risks of different types of texting (initiating a text or replying to one) as well as texting under various driving conditions.

Most interestingly, the study found that students rated driving on the highway as intensely risky, yet most drivers reported they were just as likely to initiate a text on the highway as in normal road conditions. In other words, they were able to convince themselves that road conditions were safer, which made it justifiable to text.
"People know it's harmful, and yet they keep doing it, and they tell themselves they have to do it," said Paul Atchley, an associate professor of psychology who led the study.
Atchley chalks this behavior up to the theory of cognitive dissonance: people persuade themselves that a behavior is less risky by engaging in it, similar to how the perceived risk of smoking declines among smokers, or the perceived risk of drunk driving decreases among people who drive drunk.
But why do people feel they have to text while driving? Social networking sites, coupled with a constant Internet connection everywhere you go, feed the human need to belong, and some research has shown that exclusion from social networks and text messaging can reduce the feeling of belonging. In short, we grow attached to the lifestyle of being always on, and we want to stay plugged in—even when it's a bad idea.
"We're social organisms. There're so many mechanisms built into the brain that are designed for socialization," he said. "The telecommunications industry has hit on something we're built to do."