
Communications of the ACM

News

Making Automation Work



The June 22, 2009, Metro subway crash in Washington, D.C., in which nine people died and 80 were injured, is the deadliest accident in the Metro's 33-year history.

Credit: Pablo Martinez Monsivais / AP Photo

It's no secret that engineers and designers constantly seek to build safer and more convenient systems. And, over the last century, planes, trains, automobiles, and industrial machines have become far more automated and efficient. However, when a Metro subway train rammed into another train in Washington, D.C. last June, designers had to confront the unpleasant reality that automation may have been the cause. The accident, which killed nine people and injured 80, may have been rooted in a computer malfunction and the operator's inability to manually apply the brakes quickly enough.

The Metro train accident lies at the heart of what human factors experts refer to as the "automation paradox": the more reliable and efficient an automated system becomes, the more likely it is that human operators will mentally "switch off" and rely upon it. And as the automated system becomes more complex, the odds of an accident or mishap may diminish, but the severity of a failure is often amplified.

As John D. Lee, a professor of industrial and systems engineering at the University of Wisconsin at Madison told the Washington Post: "The better you make the automation, the more difficult it is to guard against these catastrophic failures...."

Understanding how people and machines interact is infinitely complex. Programming all the various possibilities and scenarios into a system can tax even the best design and engineering experts. What's more, as technology evolves, the entire process grows more convoluted and iterative. In some cases, experts say, it's wise to ask what purpose automation serves, when it's best to use it, and when to eschew it.

What is the fallout from automation glitches? Where do programmers, designers, and engineers typically fall short? And what can technologists do to build better systems? There are no simple solutions. But as Donald Norman, professor of computer science at Northwestern University, co-founder of the Nielsen Norman Group, and author of The Design of Future Things, says, "Designers often make assumptions or act on incomplete information. They simply don't anticipate how systems will be used and how unanticipated events and consequences will occur."


Human-Machine Interface

It's clear that automation has provided enormous gains to society. Safer and more efficient factories; faster police, emergency, and fire response; and more user-friendly and safer automobiles are only a few of the benefits.

Yet, at the same time, it takes little effort to find evidence of breakdowns between human and machine.

The crash of Air France Flight 447 over the Atlantic Ocean last June, which killed all 228 people aboard, may have been caused by a malfunctioning speed sensor. The plane's Pitot tubes, pressure instruments used to measure airspeed, may have become blocked by ice. At that point, they may have stopped emitting valid signals, and experts say the pilots could have encountered false speed readings. In fact, the jet, which was coping with a series of storms, including a severe thunderstorm, reportedly relayed a signal that its computer system no longer knew the speed of the aircraft and that the autopilot and automatic thrust functions had switched off. This may have forced the pilots to take manual control in chaotic, if not impossible, flying conditions.

There are also plenty of examples of humans having trouble with automation in everyday life. As automobiles become more automated, new problems crop up. For instance, motorists blindly follow incorrect directions provided by a navigation system, even though a glance at the road would reveal the obvious error. A few motorists have even driven off a cliff or into oncoming traffic after following directions to the letter. What's more, studies show that many motorists use automation features, such as adaptive cruise control, incorrectly. In some cases, Norman says, these automated systems cause the car to speed up as motorists exit a highway because there's suddenly no car in front. If a driver isn't paying attention, an accident can occur.
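The adaptive-cruise-control surprise Norman describes can be sketched in a few lines of toy logic. This is a hypothetical illustration, not any vendor's actual control algorithm: when the sensor no longer reports a lead vehicle, the controller simply resumes the driver's set speed, which is exactly what happens when the lead car "disappears" at an exit ramp.

```python
# Toy sketch of adaptive cruise control target-speed logic (illustrative
# only; function name, thresholds, and units are assumptions, not a real
# vendor's design).

def acc_target_speed(set_speed, lead_speed=None, gap=None, min_gap=30.0):
    """Return the speed (km/h) the controller aims for."""
    if lead_speed is None or gap is None:
        # No vehicle detected ahead: resume the driver's set speed.
        # This is the surprising case on an exit ramp.
        return set_speed
    if gap < min_gap:
        # Too close: drop below the lead vehicle's speed to reopen the gap.
        return min(set_speed, lead_speed - 5)
    # Otherwise follow the lead vehicle, never exceeding the set speed.
    return min(set_speed, lead_speed)

# Cruising at 110 km/h behind a car doing 90 with a safe gap: follow at 90.
print(acc_target_speed(110, lead_speed=90, gap=50))   # 90
# The driver takes an exit ramp; the lead car leaves the sensor's view.
# The controller resumes 110 km/h -- just when the driver should be slowing.
print(acc_target_speed(110))                          # 110
```

The failure is not a bug: each branch behaves as designed. The hazard arises because the design's model ("no car ahead means open road") does not match the driver's situation ("no car ahead because I'm leaving the highway").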

In the case of airline pilots and train operators, one solution is regular training sessions in which the pilot or operator is required to turn off the automated systems and operate everything manually. This can help them retain their skills and alertness.

But even this is not likely to eliminate breakdowns. Human-machine interface failures occur for a number of reasons, experts say. Sometimes, designers rely on a wrong set of assumptions to build a system. They simply don't understand the way people use technology or the cultural differences that occur. In some instances, thousands and sometimes millions of variables exist and capturing everything in a single algorithm is exceedingly difficult. In fact, Norman argues that machine logic doesn't necessarily jibe with the human brain. "If you look at 'human error' it almost always occurs when people are forced to think and act like machines," he says.

Worse, complex algorithms often prompt humans to relate to devices as if they were fellow human beings. As a result, the autopilot on a plane, the cruise control on a car, and automated speed-control systems in mass transit become either aids or crutches, depending on the situation.

Too often, the whole of a system is not equal to the sum of its parts, says Sidney W. A. Dekker, director of research at the Leonardo da Vinci Center for Complexity and Systems Thinking at Lund University in Sweden. "There is often a great deal of human intuition involved in a process or activity and that's not something a machine can easily duplicate," says Dekker. "If you look at delivering babies, there's a reason we have midwives and nurses. Machines can monitor and help, but they can't detect subtle signs and they're unable to adapt to situations as seamlessly."

David D. Woods, professor of cognitive engineering at Ohio State University, says that designers can easily succumb to the trap of thinking "a little more technology will solve the problem." However, understanding variables and identifying possible exceptions and disruptions is paramount. For example, the D.C. Metro crash may have been due to wet leaves on the tracks and a computerized system that wasn't programmed for such a scenario. "The automation system functioned as it was designed," Woods says. "The situation simply fell outside the model of what engineers envisioned."


Beyond Failure

Make no mistake, human factors experts constantly scrutinize automation. Many believe that if human error exists, it falls on the shoulders of those engineering, designing, and programming the technology. "In reality, there is no such thing as operator error. Too often, systems aren't designed as a whole and those creating them overlook important factors," argues Nancy Leveson, professor of aeronautics and astronautics at Massachusetts Institute of Technology and author of the forthcoming book Engineering a Safer World.

Yet, progress is taking place. Consider the airline industry: in 1989, there were 1.4 fatal accidents per 1 million departures; by 2008, the rate had dropped to 0.2 fatal accidents per 1 million departures. In fact, crashes have steadily declined over the decades while survivability has increased. Dekker, who is a pilot and has flown various aircraft, including the Boeing 737, says that the industry has gotten serious about stamping out flaws, bugs, and oversights.

These improvements have taken place because the airline industry has moved beyond studying ergonomics and discrete processes. In fact, Leveson says that researchers have put a microscope to cognitive functions, psychology, cultural issues, and a variety of other components that comprise human factors. "They have evolved toward a system view and worked to understand how everything—hardware, software, procedures, and humans—interact. It's a model that other industries must embrace," she says.

One thing is certain: Automation disconnects won't disappear anytime soon. Leveson believes that, ultimately, the people designing systems must take a more holistic view and get past the notion that when a problem or breakdown occurs it's a result of "human error." She believes that universities must place a greater focus on human factors and that programmers and others must understand that, without a big picture view of what they are building, the end result will continually fall short.

Others, such as Dekker, argue that society must examine larger issues, including whether automation automatically translates into progress. "In reality, not every function or process is best automated," he says. "In some cases, automation simply creates new or different tasks and doesn't provide any real benefit." Automation may also change processes to the point where people are more confused and entirely new social dynamics emerge. At that point, he says, designers may attempt to add new features, which only ratchet up confusion and complexity further.

To be sure, imperfect people continue to build imperfect systems. The need to focus on human-machine interfaces has never been greater. "Designers, engineers, programmers, and others must take an expansive view of automation and understand all the possibilities and variables," concludes Norman. "Only then can we build systems that improve performance and solve real-world problems."


Further Reading

Bainbridge, L. Ironies of automation. New Technology and Human Error, J. Rasmussen, K. Duncan, J. Leplat (Eds.). Wiley, Chichester, U.K., 1987.

Dekker, S. The Field Guide to Understanding Human Error. Ashgate Publishing, Farnham, Surrey, U.K., 2006.

Dekker, S. The Field Guide to Human Error Investigations. Ashgate Publishing, Farnham, Surrey, U.K., 2002.

Norman, D. The Design of Future Things. Basic Books, New York, 2009.

Sarter, N. B., Woods, D. D., and Billings, C. E. Automation surprises. Handbook of Human Factors and Ergonomics (3rd ed.). Wiley, New York, 2006.


Author

Samuel Greengard is an author and freelance writer based in West Linn, OR.


Footnotes

DOI: http://doi.acm.org/10.1145/1610252.1610261


Figures

Figure UF1. The June 22, 2009, Metro subway train crash in Washington, D.C., in which nine people died and 80 were injured, is the deadliest accident in the Metro's 33-year history.



©2009 ACM  0001-0782/09/1200  $10.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2009 ACM, Inc.


 
