Viewpoint

Does the Internet Make Us Stupid?

Yes, but this may not be as bad as it sounds...

According to Farhad Manjoo in Slate magazine, almost nobody finishes reading papers online.5 Jakob Nielsen has shown that superficial reading carries over from screens to printed material.a Bauerlein,1 Brabazon,2 Carr,4 and others have argued convincingly that the Internet and other information and communication technologies (ICTs) are changing our reading habits to "skimming" rather than careful reading. Putting this together, I am concerned you will not read this Viewpoint carefully to the end. Hence, I had better present my conclusions right away: ICTs are indeed reducing many of our cherished cognitive faculties, much as our physical fitness has been reduced by all kinds of machinery for physical work and locomotion. However, in my opinion, this is not too bad, as long as our reduced faculties are overcompensated by appropriate technology, and provided we make sure of two things: that we are not completely lost in case of large-scale breakdowns of technology, and that the use of ICTs does not endanger our creativity. Both provisos are starting to receive attention: the first will hopefully be addressed by introducing systems with sufficient redundancy; the second attracts varying opinions: some, like Carr, see mainly dangers; others, like Thompson,7 see our future in a growing man-machine symbiosis. My own opinion is that creativity is not endangered if the new technologies are used with some caution.

If you stop reading here, you have read the important part of the message. If you continue reading, I hope I can drive home the message with emotional emphasis.

Over the last six years or so, numerous papers and books have claimed the Internet and related technologies are reducing our cognitive abilities. Here are some of the better-known examples and quotes.

Brabazon writes: "Looking at schools and universities, it is difficult to pinpoint when education, teaching and learning started to haemorrhage purpose, aspiration and function. As the Internet offers a glut of information, bored surfers fill their cursors and minds with irrelevancies, losing the capacity to sift, discard and judge."2 Brabazon is particularly worried by evidence she collected herself that reading with understanding and creative writing are markedly reduced in students who use ICTs intensively, and that concentrated thinking and attention spans appear much reduced. This is echoed by many later publications and books, including the ones discussed here.

The title of Bauerlein's book The Dumbest Generation1 and its subtitle How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Or, Don't Trust Anyone Under 30) clearly indicate he is thinking along the same lines. Bauerlein was probably also the first to diagnose that ICTs are increasing the generation gap, since young people, in their effort to be "in," increasingly learn more from peers than from adults.

Carr states very strongly that new technologies, including the Net, fragment content and disrupt our concentration.4 He emphasizes one important aspect that has been recognized by neuroscience for some time: use of new media like the Net creates new habits and changes the brain; the plasticity of the brain can work against us by reinforcing certain behavior. Carr gives many examples of how the way people read and write has already been changed by a Net that "… contributes to the ecosystem of interruption technologies," even causing newspapers to change their style, for instance by turning to shorter stories. He quotes the neuroscientist Michael Merzenich, professor emeritus at the University of California, San Francisco, as saying he was profoundly worried about the cognitive consequences of the constant distractions and interruptions the Net bombards us with. The long-term effect on the quality of our intellectual lives could be "deadly."


Creativity is not endangered if the new technologies are used with some caution.


Weber's book The Google-Copy-Paste Syndrome emphasizes another aspect, with statements such as: "The real danger is not that plagiarism is used fraudulently for personal gain, but that the copy/paste mentality destroys thinking" (translated from German by the author).

The title of Spitzer's book Digital Dementia: How We Drive Ourselves and Our Children Mad (translated from German by the author) says it all, even if you do not read to the point where he claims: "Computers for teaching subjects like history are as important as bicycles for teaching how to swim" (translated from German by the author).


Why So Negative?

The previously mentioned authors share a number of concerns: We are so inundated by information that our attention span has become very short. We are heading (my terminology) toward a global attention deficit syndrome: we cannot be inactive any more, yet we are losing the power to concentrate. If we are not drowned by email, by text messages, or by tweets telling us to look at some site or YouTube clip, or browsing our friends' updates on a social network, listening to mp3 music, zapping through 100 TV shows, answering the phone, or calling someone, we somehow must find another way to occupy ourselves. Many people waste so much time keeping their "friends" in the network happy that they have little time left for productive work. They have externalized much of their knowledge into the cloud and their smartphones; they no longer need to remember many facts, which threatens the functioning of their memory. Students do not write essays these days. Rather, using Google search, copy, and paste, they glue pieces of information together, hardly understanding what they are producing. They are no longer able to read complicated texts, spoiled by the bite-sized pieces of information contained in tweets or in text messages, often written in a new shorthand: "sms r good 4u." A blind trust that e-learning can replace good teachers often leads to less-educated children; apps are substituting for thinking, and cognitive capacity is shrinking.

These are some of the concerns most often mentioned. If they were mere speculation, we could brush them off. However, almost all of them are based on solid quantitative research and experiments. Hence, we must take them seriously: the technologies involved do indeed make us, or upcoming generations, more stupid, as measured by the cognitive strength of our unaided brains.


An Unpleasant Surprise

All of us working in computer science have realized that ICT gives us excellent access to information, research papers, and communication with colleagues, ensuring we do not end up in cul-de-sacs and are rapidly exposed to new ideas. ICT is above all a powerful and positive force in many areas like medicine, transportation, and production, to name just a few. It has also made many mundane tasks easier, such as booking a hotel, an event, or a trip. But it has also produced serious problems of privacy, of indirect control over us by others, and of increased violence through violent games (as Bushman argues convincingly3). It is possibly creating new kinds of warfare, and, yes, in a few respects it has made us lazier.

Or maybe it has just relieved us of some tasks to create room for new challenges? Some superficial arguments concerning cognitive tasks might be: Why should I do complicated calculations when my smartphone has a built-in calculator? Why worry about counting change if I pay by credit card anyway? Why bother about spelling mistakes when my spell-checker makes fewer mistakes than my teachers did? Why remember phone numbers when my smartphone has speech activation? I do not worry if I cannot find my phone at home: I use my wife's phone, and the ringing of mine makes sure I can locate it. Too bad my shoes do not ring yet, but soon NFC devices will help me find them, or anything else of interest to me, for that matter. Handwriting—what the heck! I dictate most things these days, or else use a keyboard. With the language app on my phone, I can converse on a simple level in any language of the world. I forgot where I found the app, but I am sure you will be able to locate it. Even before I used the English-Japanese language app, it was easy for me to order in a Japanese restaurant: they all have plastic replicas of the meals in the window, so I just took a picture with my digital camera and showed it to the waiter. I find it convenient when hiking that I no longer have to memorize details of a map. What does it matter that my sense of orientation may have deteriorated, when my smartphone can find any place in any city or on any hike with a few taps?

In addition, there are all the benefits from using ICT in important applications like medicine, transportation, production, and so forth, as previously described.

And now I am supposed to believe all those great achievements come at the price of increasing stupidity! Does this mean we technologists will soon have to ask ourselves the same question physicists had to ask themselves in connection with nuclear weapons: Do we contribute positively to mankind or do we threaten it, because we are reducing the capacity of humans for deep logical thinking?


Do We Need to Worry?

One can argue there is no need to worry if some of our cognitive faculties are reduced by technology, as long as the loss is overcompensated by technology and as long as we can assure two crucial points: independent and creative thinking must not be threatened, and we must still be able to function in a reasonable "basic mode" if technologies fail. The difficulty is that we do not know at this point how much knowledge we can "outsource" into the Net and computers without reducing creativity. Also, our infrastructure is inadequate for a massive breakdown of the Net or the electric grid. The first issue requires serious research in the neurosciences; the second requires research and attention by engineers in a number of disciplines to provide enough redundancy. Unfortunately, providing redundancy will have its cost, and hence will meet resistance.


The difficulty is that we do not know at this point how much knowledge we can "outsource" into the Net and computers without reducing creativity.


We must stop looking at humans as naturally, biologically grown beings. Rather, we must understand ourselves as organic beings in symbiosis with technology. I myself am a good example. I am middle-ear deaf; that is, without very special hearing aids I would not hear a thing. I wear eyeglasses, or I would see everything blurred. My pacemaker keeps my heart beating properly. And the metal plate in my replacement hip is perfect; well, when I go through airport security, I sometimes have to show a medical statement about that piece of metal. If you were to take away all this technology, I would be physically impaired at best, but probably dead. As it is, I can hike, scuba dive, go to concerts, do research, and even make it into Communications once in 20 years.

In other words, we should not judge persons, now and in the future, without the technological tools they are using, whether those tools are built in (pacemaker) or external (hearing aid, smartphone, reading software for visually challenged persons, tablet PCs, Google Glass). We have long accepted this for physical abilities: my grandfather was strong; he could carry 50kg for 20 kilometers in four hours! Well, I can do better: I can carry 250kg for 200 kilometers in two hours with my car. If I were to encounter an adversary, I would still prefer it to be an unarmed bodybuilder rather than someone skinny with a machine gun. What is happening now is that technology is starting to also replace some cognitive functions, reducing our very own capabilities such as our memory or sense of orientation. This raises an important concern: Does our increasing dependency on technology sabotage our ability to think for ourselves? Surely, looking up some facts is not the same as coherent logical thinking. Is the latter in danger? The answer is frustrating: none of us knows. As an optimist, I hope we can make good use of the possibilities offered to us by easy access to high-quality research, arguments, and discussion with colleagues, without losing our power of thinking.

Despite my positive attitude, I am aware of the two important aspects hinted at before. The first is that if we rely on technologies, we should also ensure we have a backup when those technologies fail. I do not think this issue has been taken seriously enough in the past: we should be more careful to have solutions if, for example, electricity, transportation systems, water supply, or other services fail for an extended period over a large area. With ICT's enormous influence on our lives and cognitive capabilities, this issue is becoming more pressing.


We must understand ourselves as organic beings in symbiosis with technology.


Second, repeated actions change our brain and hence how we think. As such, this is nothing new: physical work also changes how we act, since our muscles get stronger. Yet the danger that creativity is threatened because we "outsource" too much into the Net or delegate too much to computer algorithms is real: we must not empty our brains, or it seems likely we might lose the capability to think clearly and bring together important facts; no links in the Net can do this for us. Knowing how to navigate the Net does not make up for synapses generated in our brain. We will be using algorithms to do some jobs for us (calculating, speech translation, finding a route, and other functions). Yet it is clear some basic information and the faculty for serious logical thinking must not disappear. Whether logical thinking is best trained by learning mathematics, or by some other means like learning how to play chess or bridge (which might be more fun), will still have to be determined. Finding good answers requires more neuroscience and brain research: What capabilities do we need in our brains to remain creative? What do we lose and what do we gain if, instead of retaining all the details of one book on a particular topic, we retain a few details of many books with different views on that topic? At this point nobody seems to have valid answers; hence this is an important and crucial research topic, though more for neuroscience than for computer science. When these important questions have been answered, we will know whether we need a personal trainer for our minds as we need one for our bodies!


Conclusion

There is no doubt ICT has had a positive influence on many aspects of our lives. Looking specifically at thinking, at the discovery of new results, and at doing research, it is clearly positive that we can easily access new research results and communicate worldwide as a basis for our own imaginative thinking, possibly aided by the ability to deal with foreign languages or to gain better insights through new kinds of visualization. However, the Net (and other ICTs) will also "make us more stupid" as far as some of our cognitive capabilities are concerned. This we should accept, provided two important points are not forgotten: we must not become completely dependent on technology, and we must retain the capability for logical thinking and creativity. Only then can we judge the balance of what we gain against what we lose.

Until we understand the full impact of ICT on our brain and our thinking, caution is essential. Thus, a major challenge for further research and the study of behavioral patterns is to find out what we can outsource and what we had better retain in our own brains.
