Computing Touch
Researchers at the University of Houston have developed stretchable electronics that can serve as an artificial skin for robots and prosthetic limbs.
Research groups across the globe are working to engineer the sense of touch, both for humans who have lost limbs, and for robots, which have never registered touch before.

"After many years, I felt my hand, as if a hollow shell got filled with life again," says an anonymous amputee who manipulated a robotic hand enveloped in artificial skin created by researchers from Johns Hopkins University and the National University of Singapore.

The Johns Hopkins/Singapore research team created that sensation through the use of artificial skin made of fabric and rubber and laced with sensors to mimic nerve endings, which works by sensing stimuli and relaying those impulses back to the amputee's real skin.

The research team was able to recreate those sensations by focusing on restoring the feelings of pressure and pain, and the continuum of sensations between the two, according to Luke E. Osborn, who is pursuing his Ph.D. in the Department of Biomedical Engineering at Johns Hopkins University.

In practice, the artificial skin works by relying on sensors designed to sense pressure or pain. If the artificial skin on a prosthetic finger is pricked by a pin, for example, its sensors pick up that signal and send it along to a controller, which includes a software representation of what that pain 'feels' like. Now fully defined by the controller, the pain signal is sent on to a stimulator, which in turn routes the pain signal to an electrode affixed to the living, human skin on the arm of the amputee. The result: when the prosthetic finger encounters a pinprick, the human brain interprets the event as an actual pinprick.
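
A minimal sketch of that chain, in Python, appears below. It is an illustration only: the class and function names, the pressure threshold, and the stimulation parameters are invented for this article and are not taken from the Johns Hopkins/National University of Singapore system.

```python
from dataclasses import dataclass


@dataclass
class StimulationCommand:
    """Parameters a (hypothetical) stimulator would deliver to an electrode."""
    electrode_site: str   # spot on the residual limb to stimulate
    amplitude_ma: float   # stimulation amplitude, in milliamps
    frequency_hz: float   # pulse frequency, in hertz


def controller(pressure_kpa: float, electrode_site: str) -> StimulationCommand:
    """Convert a raw pressure reading from the artificial skin into a stimulation command.

    A simple threshold stands in for the software model of what the sensation
    'feels' like: readings above it are treated as pinprick-like pain,
    readings below it as ordinary pressure.
    """
    if pressure_kpa > 250.0:   # assumed pain threshold, for illustration only
        return StimulationCommand(electrode_site, amplitude_ma=2.0, frequency_hz=15.0)
    return StimulationCommand(electrode_site, amplitude_ma=0.8, frequency_hz=60.0)


def stimulate(command: StimulationCommand) -> None:
    """Stand-in for the hardware that drives the electrode on the amputee's arm."""
    print(f"{command.electrode_site}: {command.amplitude_ma} mA at {command.frequency_hz} Hz")


# A pinprick on the prosthetic finger produces a high local pressure reading.
stimulate(controller(pressure_kpa=400.0, electrode_site="forearm electrode"))
```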

"It's worth pointing out that the transformation of the signal from the sensor to the stimulator/person isn't either pressure or pain," Osborn says.  "It is a continuous range of sensations that go from pressure or pain.  So basically, when the prosthesis grabs an object and the person feels something it isn't they feel only either (A) pressure or (B) pain. They can feel some range of sensations that fall between these two points. They can of course feel just a pressure or just a pain.  But the way the software in the prosthesis works is that enables a continuous range of sensations between pressure and pain."

Osborn says the researchers spent months mapping out the residual limb (living skin) of the amputee "to figure out which spots, when stimulated, would cause 'sensation' in the phantom hand. In this case, there were three spots on the residual limb of the amputee that, when we stimulated, would consistently activate regions of either the thumb/index finger, pinky/ulnar region, or wrist of the phantom hand."
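
That mapping can be thought of as a small routing table from phantom-hand regions to electrode sites on the residual limb, as in the hypothetical sketch below; the site labels are placeholders, not the researchers' actual electrode locations.

```python
# Invented site labels standing in for the three residual-limb spots Osborn describes.
PHANTOM_MAP = {
    "site_1": "thumb / index finger",
    "site_2": "pinky / ulnar region",
    "site_3": "wrist",
}


def site_for(phantom_region: str) -> str:
    """Return the residual-limb electrode site that evokes the requested phantom region."""
    for site, region in PHANTOM_MAP.items():
        if phantom_region in region:
            return site
    raise ValueError(f"no mapped site evokes {phantom_region!r}")


print(site_for("thumb"))   # -> site_1
```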

"This is interesting and new, because now we can have a prosthetic hand that is already on the market and outfit it with an e-dermis that can tell the wearer whether he or she is picking up something that is round or whether it has sharp points."

The progress of the Johns Hopkins/National University of Singapore team "can truly make a difference to people's quality of life," says Marianna Obrist, professor of multisensory experiences at the University of Sussex in the U.K. "Bringing back the sense of touch to people who are missing limbs demonstrates the advances in our understanding of the sense of touch, as well as novel engineering solutions."

Ravinder Dahiya, professor of electronics and nanoengineering at Scotland's University of Glasgow, says he sees real value in the team's decision to replicate the sensation of pain, because the sensation of pain can help amputees see an artificial limb as a true extension of their bodies. "Gaining ownership — thinking of artificial hands as part of body — is critical for the acceptance of artificial limbs by amputees in their daily life."

Other research groups across the globe are dedicated to one day completely replicating the sense of touch, for both humans and robots.

Obrist's Sussex Computer Human Interaction Lab, for example, is attempting to replicate aspects of touch by studying the manipulation of haptic perceptions, along with the relationship between haptics, psychophysics, and higher cognitive functions.

Meanwhile, Dahiya says he and his team at the University of Glasgow are looking to develop artificial skin that can cover a large area of the body and can replicate some aspects of touch.

In addition, researchers at the University of Chicago are working to replicate the nerve responses in limbs that figure into the overall experience of touch. "We have developed a simulation that can reproduce, with millisecond precision, the response of every nerve in the arm," says Sliman Bensmaia, an associate professor at the University of Chicago. "This model instructs how to convert patterns of activation of the sensors into patterns of stimulation of the nerve."
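
The fragment below is a toy stand-in for that idea, not the Chicago group's model: it rate-codes a millisecond-sampled sensor signal into a train of stimulation pulses. The maximum pulse rate and the coding scheme are illustrative assumptions.

```python
def sensor_to_pulses(activations: list[float], max_rate_hz: float = 300.0) -> list[int]:
    """Rate-code a millisecond-sampled sensor signal (values in [0, 1]) into a
    pulse train: 1 means deliver a stimulation pulse in that millisecond."""
    pulses, accumulator = [], 0.0
    for a in activations:
        accumulator += a * max_rate_hz / 1000.0   # expected pulses this millisecond
        if accumulator >= 1.0:
            pulses.append(1)
            accumulator -= 1.0
        else:
            pulses.append(0)
    return pulses


ramp = [i / 100.0 for i in range(100)]            # sensor pressed harder over 100 ms
print(sum(sensor_to_pulses(ramp)), "pulses in 100 ms")
```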

Still other work in computerized touch is underway at Stanford University, the University of Houston, and the University of Pisa in Italy.

Even with all this activity, a system that completely replicates every facet of the sense of touch is a long way off. "There is still more to be understood in order to capture all the facets of 'touching' and experiencing the world and our surroundings through our sense of touch," says Sussex's Obrist. "The power of touch goes well beyond crude sensory integration, a problem that, by itself, still presents many mysteries to modern scientists."

In the view of the University of Chicago's Bensmaia, "The main bottleneck is not in the sensor, but in the neural interface itself. We have a detailed understanding of how the nerve responds to skin stimulation, but we are not yet in a position to create the complex patterns of nerve activation that are produced during interactions with objects."

"I think eventually we will be able to replicate the sense of touch to a greater degree," says University of Glasgow's Dihiya. "The rapid technological advances give this confidence. For example, we are now able to develop skin on flexible and conformal substrates.

"However, we still need to think about how to handle the large data generated by the skin, about powering large number of sensors etc.  Skin is a complex organ with several type of receptors embedded in soft materials and ensemble of receptors partially processing the tactile data. This means we could consider skin as a soft, flexible and sensitive computer with distributed computing capability."

Adds the National University of Singapore's Dragomir, "If we didn't think that it is possible, we would probably not be here."

Joe Dysart is an Internet speaker and business consultant based in Manhattan, NY, USA. 
