Society has long cherished the ability to think beyond the ordinary. In a world where knowledge is revered and innovation equals progress, those able to bring forth greater insight and understanding are destined to make their mark and blaze a trail to greater enlightenment.
“Critical thinking as an attitude is embedded in Western culture. There is a belief that argument is the way to finding truth,” observes Adrian West, research director at the Edward de Bono Foundation U.K., and a former computer science lecturer at the University of Manchester. “Developing our abilities to think more clearly, richly, fully—individually and collectively—is absolutely crucial [to solving world problems].”
To be sure, history is filled with tales of remarkable thinkers who have defined and redefined our world views: Sir Isaac Newton formulating the law of universal gravitation; Voltaire altering perceptions about society and religious dogma; and Albert Einstein redefining the view of the universe. But in an age of computers, video games, and the Internet, there’s a growing question about how technology is changing critical thinking and whether society benefits from it.
Although there’s little debate that computer technology complements—and often enhances—the human mind in the quest to store information and process an ever-growing tangle of bits and bytes, there’s increasing concern that the same technology is changing the way we approach complex problems and conundrums, and making it more difficult to really think.
“We’re exposed to [greater amounts of] poor yet charismatic thinking, the fads of intellectual fashion, opinion, and mere assertion,” says West. “The wealth of communications and information can easily overwhelm our reasoning abilities.” What’s more, it’s ironic that ever-growing piles of data and information do not equate to greater knowledge and better decision-making. What’s remarkable, West says, is just “how little this has affected the quality of our thinking.”
According to the National Endowment for the Arts, literary reading declined 10 percentage points from 1982 to 2002, and the rate of decline is accelerating. Many, including Patricia Greenfield, a UCLA distinguished professor of psychology and director of the Children’s Digital Media Center, Los Angeles, believe that a greater focus on visual media exacts a toll. “A drop-off in reading has possibly contributed to a decline in critical thinking,” she says. “There is a greater emphasis on real-time media and multitasking rather than focusing on a single thing.”
Nevertheless, the verdict isn’t in: there is no definitive answer yet about how technology affects critical thinking. Instead, critical thinking lands in a mushy swamp somewhere between perception and reality, between the measurable and the incomprehensible. It’s largely a product of our own invention, and a subjective one at that. And although technology alters the way we see, hear, and assimilate our world, the act of thinking remains decidedly human.
Rethinking Thinking
Arriving at a clear definition for critical thinking is a bit tricky. Wikipedia describes it as “purposeful and reflective judgment about what to believe or what to do in response to observations, experience, verbal or written expressions, or arguments.” Overlay technology and that’s where things get complex. “We can do the same critical-reasoning operations without technology as we can with it—just at different speeds and with different ease,” West says.
What’s more, while it’s tempting to view computers, video games, and the Internet as monolithically good or bad, the reality is that they may be both, and different technologies, systems, and uses yield entirely different results. For example, a computer game may promote critical thinking or diminish it. Reading on the Internet may ratchet up one’s ability to analyze, while chasing an endless array of hyperlinks may undercut deeper thought.
Michael Bugeja, director of the Greenlee School of Journalism and Communication at Iowa State University of Science and Technology, says: “Critical thinking can be accelerated multifold by the right technology.” On the other hand, “The technology distraction level is accelerating to the point where thinking deeply is difficult. We are overwhelmed by a constant barrage of devices and tasks.” Worse: “We increasingly suffer from the Google syndrome. People accept what they read and believe what they see online is fact when it is not.”
One person who has studied the effects of technology on people is UCLA’s Greenfield. Exposure to technology fundamentally changes the way people think, says Greenfield, who recently analyzed more than 50 studies on learning and technology, including research on multitasking and the use of computers, the Internet, and video games. As reading for pleasure has declined and visual media have exploded, noticeable changes have resulted, she notes.
“Reading enhances thinking and engages the imagination in a way that visual media such as video games and television do not,” Greenfield explains. “It develops imagination, induction, reflection, and critical thinking, as well as vocabulary.” However, she has found that visual media actually improve some types of information processing. Unfortunately, “most visual media are real-time media that do not allow time for reflection, analysis, or imagination,” she says. The upshot? Many people—particularly those who are younger—wind up not realizing their full intellectual potential.
Greenfield believes we’re watching an adaptation process unfold. Today, many individuals perform better at common technology-mediated tasks, but this doesn’t necessarily make them better thinkers. The ability to multitask and use technology is highly beneficial in certain fields, including medicine, business, and aviation. Consider: video game skill is a better predictor of surgeons’ success at laparoscopic surgery than actual laparoscopic experience. One study found that the best video game players made 47% fewer errors and performed 39% faster in laparoscopic tasks than the worst video game players.
Tools for Learning
How society views technology has a great deal to do with how it forms perceptions about critical thinking. And nowhere is the conflict more apparent than at the intersection of video games and cognition. James Paul Gee, a professor of educational psychology at the University of Wisconsin-Madison and author of What Video Games Have to Teach Us About Learning and Literacy, points out that things aren’t always as they appear. “There is a strong undercurrent of opinion that video and computer games aren’t healthy for kids,” he says. “The reality is that they are not only a major form of entertainment, they often provide a very good tool for learning.”
In fact, a growing number of researchers—and an expanding body of evidence—indicate that joysticks can go a long way toward building smarter children with better reasoning skills. Games such as SimCity, Civilization, Railroad Tycoon, and Age of Mythology extend beyond the flat earth of rote memorization and teach decision-making and analytical skills in immersive, virtual environments that resemble the real world, Gee says. Moreover, these games—and some virtual worlds—give participants freedom to explore ideas and concepts that might otherwise be inaccessible or off limits.
Kurt Squire, a University of Wisconsin-Madison associate professor in educational communications and technology, has found that as children play an educational game and learn about a particular period in history or an interesting concept, they often want to learn more. For example, one young student Squire studied sent him a list of 27 books on ancient history that the boy had checked out of a library as a result of playing the game Civilization. What makes the games so compelling, he relates, is that they create a psychological investment by “structuring problems so that they are just beyond students’ current abilities.”
One thing is certain. In the digital age, critical thinking is a topic that’s garnering greater attention. As reading and math scores decline on standardized tests, many observers argue that it’s time to take a closer look at technology and understand the subtleties of how it affects thinking and analysis. “Without critical thinking, we create trivia,” Bugeja concludes. “We dismantle scientific models and replace them with trendy or wishful ones that are neither transferable nor testable.”