Ever since the first silent-mode cell phones started buzzing in our pockets a few years ago, many of us have unwittingly developed a fumbling familiarity with haptics: technology that invokes our sense of touch. Video games now routinely employ force-feedback joysticks to jolt their players with a sense of impending onscreen doom, while more sophisticated haptic devices have helped doctors conduct surgeries from afar, allowed deskbound soldiers to operate robots in hazardous environments, and equipped musicians with virtual violins.
Despite recent technological advances, haptic interfaces have made only modest inroads into the mass consumer market. Buzzing cell phones and shaking joysticks aside, developers have yet to create a breakthrough product: a device that would do for haptics what the iPhone has done for touch screens. The slow pace of market acceptance stems partly from typical new-technology growing pains: high production costs, the lack of standard application programming interfaces (APIs), and the absence of established user interface conventions. Those issues aside, however, a bigger question looms over this fledgling industry: What are haptics good for, exactly?
Computer scientists have been exploring haptics for more than two decades. Early research focused largely on the problem of sensory substitution, converting imagery or speech information into electric or vibratory stimulation patterns on the skin. As the technology matured, haptics found new applications in teleoperator systems and virtual environments, useful for robotics and flight simulator applications.
Today, some researchers think the big promise of haptics may involve moving beyond special-purpose applications to tackle one of the defining challenges of our age: information overload. For many of us, a growing reliance on screen-based computers has long since overtaxed our visual senses. But the human mind comes equipped to process information simultaneously from multiple inputs, including the sense of touch. "People are not biologically equipped to handle the assault of information that all comes through one channel," says Karon MacLean, a professor of computer science at the University of British Columbia.
Haptic interfaces offer the promise of creating an auxiliary information channel that could offload some of the cognitive load by transmitting data to the human brain through a range of vibrations or other touch-based feedback. "In the real world things happen on the periphery," says Lynette Jones, a senior research scientist at Massachusetts Institute of Technology. "It seems like haptics might be a good candidate for exploiting that capability because it's already a background sense."
As people consume more information on mobile devices, the case for haptics seems to grow stronger. "As screen size has become smaller, there is interest in offloading some information that would have been presented visually to other modalities," says Jones, who also sees opportunities for haptic interfaces embedded in vehicles as early warning systems and proximity indicators, as well as more advanced applications in surgery, space, undersea exploration, and military scenarios.
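The vehicle proximity indicator Jones describes could be as simple as a mapping from obstacle distance to vibration intensity and pulse rate. The sketch below is purely illustrative; the thresholds, ramp, and function name are invented, not drawn from any real system:

```python
# Hypothetical proximity alert: map distance to an obstacle onto vibration
# intensity and pulse rate. All thresholds and gains are invented.

def proximity_alert(distance_m: float,
                    warn_at_m: float = 2.0,
                    critical_at_m: float = 0.5):
    """Return (intensity 0..1, pulses per second), or None for no alert."""
    if distance_m >= warn_at_m:
        return None
    # Ramp up linearly as the obstacle gets closer, clamped to [0, 1].
    closeness = (warn_at_m - distance_m) / (warn_at_m - critical_at_m)
    closeness = min(1.0, max(0.0, closeness))
    intensity = 0.2 + 0.8 * closeness       # never fully silent once warned
    pulses_per_s = 2 + int(8 * closeness)   # faster pulsing when critical
    return intensity, pulses_per_s
```

Because the alert lives on a background sensory channel, it can escalate gradually without demanding visual attention until the situation becomes critical.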
While those opportunities may be real, developers will first have to overcome a series of daunting technical obstacles. For starters, there is currently no standard API for the various force feedback devices on the market, although some recent efforts have resulted in commercial as well as open source solutions for developing software for multiple haptic hardware platforms. And as haptic devices grow more complex, engineers will have to optimize for a much more diverse set of sensory receptors in the human body that respond to pressure, movement, and temperature changes.
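The cross-platform toolkits mentioned above typically work by placing a thin hardware-abstraction layer between application code and device drivers. The following is a minimal sketch of that idea with invented class and method names; it does not quote any real haptics SDK:

```python
from abc import ABC, abstractmethod

# Hypothetical hardware-abstraction layer for force-feedback devices.
# Class and method names are illustrative, not from a real API.

class HapticDevice(ABC):
    """Minimal common interface over different force-feedback hardware."""

    @abstractmethod
    def set_force(self, fx: float, fy: float, fz: float) -> None:
        """Command a force vector (newtons) at the end effector."""

    @abstractmethod
    def position(self) -> tuple:
        """Return the current end-effector position (meters)."""

class LoggingDevice(HapticDevice):
    """A stand-in 'driver' that records commands, useful for testing
    application code without physical hardware attached."""

    def __init__(self):
        self.last_force = (0.0, 0.0, 0.0)
        self._pos = (0.0, 0.0, 0.0)

    def set_force(self, fx, fy, fz):
        self.last_force = (fx, fy, fz)

    def position(self):
        return self._pos
```

Application code written against the abstract interface can then run unchanged on whichever concrete device driver is plugged in, which is precisely the portability problem a standard API would solve.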
As the range of possible touch-based interfaces expands, developers face a further hurdle in helping users make sense of all the possible permutations of haptic feedback. This lack of a standard "haptic language" may prove one of the most vexing barriers to widespread market acceptance. Whereas most people have by now formed reliable mental models of how certain software interfaces should work (keyboards and mice, touchpads, and touch screens, for example), the ordinary consumer still requires some kind of training to associate a haptic stimulation pattern with a particular meaning, such as the urgency of a phone call or the status of a download on a mobile device.
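At bottom, the training problem is one of assigning meanings to stimulation patterns, much like a lookup table from application events to distinct buzzes. The sketch below is a hypothetical illustration; the event names and pattern parameters are invented:

```python
from dataclasses import dataclass

# Hypothetical "haptic vocabulary": each application event gets a distinct
# vibration pattern. Events, durations, and intensities are invented.

@dataclass(frozen=True)
class VibrationPattern:
    pulses_ms: tuple   # on-duration of each pulse, in milliseconds
    gap_ms: int        # silence between pulses, in milliseconds
    intensity: float   # 0.0 (off) to 1.0 (full strength)

HAPTIC_VOCABULARY = {
    "call_urgent":   VibrationPattern((200, 200, 200), 100, 1.0),
    "call_normal":   VibrationPattern((400,), 0, 0.6),
    "download_done": VibrationPattern((80, 80), 60, 0.4),
    "battery_low":   VibrationPattern((50,), 0, 0.3),
}

def pattern_for(event: str) -> VibrationPattern:
    """Look up the pattern for an event, with a gentle default fallback."""
    return HAPTIC_VOCABULARY.get(event, VibrationPattern((100,), 0, 0.2))
```

The hard part, as the article notes, is not building such a table but getting every vendor to agree on one, so that a given pattern carries the same meaning across devices.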
The prospect of convincing consumers to learn a new haptic language might seem daunting at first, but the good news is that most of us have already learned to rely on haptic feedback in our everyday lives, without ever giving it much thought. "We make judgments based on the firmness of a handshake," says Ed Colgate, a professor of mechanical engineering at Northwestern University. "We enjoy petting a dog and holding a spouse's hand. We don't enjoy getting sticky stuff on our fingers." Colgate believes that advanced haptics could eventually give rise to a set of widely recognized device behaviors that go well beyond the familiar buzz of cell phones. For now, however, the prospect of a universal haptic language seems a distant goal at best.
"Until we have a reasonably mature approach to providing haptic feedback, it's hard to imagine something as sophisticated as a haptic language arising," says Colgate, who believes that success in the marketplace will ultimately hinge on better systems integration, along the lines of what Apple has accomplished with the iPhone. "Today, haptics is thought of as an add-on to the user interface," says Colgate. "It may enhance usability a little bit, but its value pales in comparison to things you can do with graphics and sound. In many cases, the haptics is so poorly implemented that people turn it off pretty quickly. And that's not to criticize the developers of hapticsit's just a tough problem."
Many efforts to date have used haptics as a complementary layer to existing screen-based interfaces. MacLean argues that haptics should do more than just embellish an interaction already taking place on the screen. "A lot of times you're using haptics to slap it on top of a graphical interaction," she says. "But there can also be an emotional improvement, a comfort and delight in using the interface."
Led by Ph.D. candidate Steve Yohanan, MacLean's team has built the Haptic Creature, a device about the size of a cat that simulates emotional responses. Covered with touch sensors, the Haptic Creature creates different sensations (hot, cold, or stiffening its "ears") in response to human touch. The team is exploring possible applications such as fostering companionship in older and younger people, or treating children with anxiety disorders.
MacLean's team has also developed an experimental device capable of buzzing in 84 different ways. After giving users a couple of months to get familiar with the feedback by way of an immersive game, they found that the process of learning to recognize haptic feedback bore a great deal of similarity to the process of learning a language. "The surprising thing is that people are able to quickly learn an awful lot and learn it without conscious attention," says MacLean. "There's a lot of potential for people to learn encoded signals that mean something not in a representational way but in an abstract way without conscious attention."
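The article does not describe how the 84 signals are parameterized, but crossing a few independent dimensions is one plausible way to build such a vocabulary. The axes and values below are invented; the point is simply that 3 frequencies × 4 amplitudes × 7 rhythms yield exactly 84 distinct combinations:

```python
from itertools import product

# Hypothetical parameterization of an 84-signal vibration vocabulary.
# The real device's signal design is not described in the article.
frequencies_hz = (80, 150, 250)          # 3 carrier frequencies
amplitudes = (0.25, 0.5, 0.75, 1.0)      # 4 strengths
rhythms = (                              # 7 pulse patterns (on-times, ms)
    (300,), (100, 100), (50, 50, 50),
    (200, 50), (50, 200), (100, 50, 100),
    (25, 25, 25, 25),
)

# Every combination is one distinct signal: 3 * 4 * 7 = 84.
signals = list(product(frequencies_hz, amplitudes, rhythms))
```

A small set of independently varied dimensions is also consistent with MacLean's language analogy: users learn the axes once, then decode novel combinations without conscious attention.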
To date, most low-cost haptic interfaces have relied exclusively on varying modes of vibration, taking advantage of the human skin's sensitivity to movement. But vibration constitutes the simplest, most brute-force execution of haptic technology. "Unfortunately," says Colgate, "vibration isn't all that pleasing a sensation."
Some of the most interesting research taking place today involves expanding the haptic repertoire beyond the familiar buzz of the vibrating cell phone. At MIT, Jones' team has conducted extensive research into human body awareness and tactile sensory systems, examining the contribution of receptors in the skin and muscles to human perceptual performance. In one study, Jones demonstrated that users were unable to distinguish between two thermal inputs presented on a single finger pad; instead, they perceived them as a single stimulus, evidence that the thermal senses tend toward "spatial summation" rather than fine-grained localization.
Colgate's research has focused on a fingertip-based interface that provides local contact information using new actuation technologies, including shear skin-stretch, ultrasonic, and thermal actuators. By varying friction in step with fingertip motion across a surface, the interface can simulate the feeling of a texture or a bump. Compared with force-feedback devices, vibrotactile stimulators, known as tactors, are much smaller and more portable, although high-performance tactors with wide bandwidths, small form factors, and independently controllable vibration frequency and amplitude remain hard to come by at a reasonable cost.
The Northwestern researchers have figured out how to make transparent force sensors that can capture tactile feedback on a screen, so that they can be combined with a graphical display. "My ideal touch interface is one that can apply arbitrary forces to the finger," says Colgate, whose team has been approaching the problem by combining friction control with small lateral motions of the screen itself.
By controlling the force on the finger, the system can make parts of the screen feel "magnetic," pulling a user's finger toward them (up, down, left, or right), or let a user feel the outline of a button on the screen where none exists. Colgate's team is also exploring how to develop devices using multiple fingers, each on a different variable friction interface.
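One way to think about this kind of friction-based rendering is to treat the virtual bump as a height profile and raise friction whenever the finger moves "uphill" against its slope. The sketch below works under that assumption; the bump profile, gains, and function names are invented for illustration and are not Colgate's actual algorithm:

```python
import math

# Sketch of friction-based bump rendering on a variable-friction surface.
# The Gaussian bump profile and all gains are invented.

def bump_height(x_mm: float, center_mm: float = 10.0, width_mm: float = 4.0,
                height_mm: float = 1.0) -> float:
    """A smooth virtual bump: a Gaussian height profile."""
    return height_mm * math.exp(-((x_mm - center_mm) / width_mm) ** 2)

def height_gradient(x_mm: float, eps: float = 1e-3) -> float:
    """Numerical slope of the virtual surface at the finger position."""
    return (bump_height(x_mm + eps) - bump_height(x_mm - eps)) / (2 * eps)

def friction_command(x_mm: float, velocity_mm_s: float,
                     mu_min: float = 0.1, mu_max: float = 0.9,
                     gain: float = 1.5) -> float:
    """Raise friction when the finger moves 'uphill' on the virtual bump.
    A friction display can only resist motion, never push, so downhill
    motion simply gets minimum friction, and the command is clamped."""
    slope = height_gradient(x_mm)
    # Slope and velocity with the same sign means uphill motion.
    uphill = max(0.0, slope * math.copysign(1.0, velocity_mm_s))
    mu = mu_min + gain * uphill
    return min(mu_max, max(mu_min, mu))
```

The asymmetry in the sketch reflects a real constraint of friction displays: because friction can only oppose motion, "pulling" sensations must be synthesized indirectly, which is why Colgate's team adds small lateral motions of the screen itself.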
Looking ahead, Colgate believes the evolution of haptic interfaces may follow the trajectory of touch screens: a technology long in development that finally found widespread and relatively sudden acceptance in the marketplace. "The technology has to be sufficiently mature and robust, there has to be an active marketplace that creates competition and drives down costs, and it has to meet a real need."
As production costs fall and new standards emerge, as they almost certainly will, the marketplace for touch-based devices may yet come into its own. Until that happens, most of the interesting work will likely remain confined to the labs. And the future of the haptics industry seems likely to remain, well, a touchy subject.
Chubb, E.C., Colgate, J.E., and Peshkin, M.A.
ShiverPaD: a glass haptic surface that produces shear force on a bare finger, IEEE Transactions on Haptics 3, 3, July-Sept. 2010.
Ferris, T.K. and Sarter, N.
When content matters: the role of processing code in tactile display design, IEEE Transactions on Haptics 3, 3, July-Sept. 2010.
Jones, L.A. and Ho, H.-N.
Warm or cool, large or small? The challenge of thermal displays, IEEE Transactions on Haptics 1, 1, Jan.-June 2008.
Putting haptics into the ambience, IEEE Transactions on Haptics 2, 3, July-Sept. 2009.
Ryu, J., Chun, J., Park, G., Choi, S., and Han, S.H.
Vibrotactile feedback for information delivery in the vehicle, IEEE Transactions on Haptics 3, 2, April-June 2010.
©2011 ACM 0001-0782/11/0100 $10.00