
Simplifying Machine-based Touch Sensing

Teaching robots to touch and feel.
Researchers have developed a flexible, touch-sensitive artificial nervous system that could provide a sense of touch to robots and prosthetics.

In 1791, Italian physicist and anatomist Luigi Galvani amazed contemporary society by announcing he had brought severed frogs' legs back to life—famously setting their dead limbs twitching by probing their nerves with different metals. His findings ultimately sparked the invention of the battery, and spawned the electrical and electronics revolution that brought you the device upon which you are reading this.

More than two centuries later, critters' disembodied legs are once again twitching in a lab to demonstrate a strange effect. A research paper published in Science reports that a severed cockroach leg has been coaxed into twitching in a Stanford University laboratory every time the plastic skin of a flexible, touch-sensitive artificial nervous system is touched.

The harder the skin is pressed, the more the roach's leg moves. However, this is not some bizarre biotech demonstration; it's a test of a system designed to give robots, and people wearing prostheses, something the rest of us take utterly for granted: an effortless sense of touch.

Developed by an international team led by Stanford electronics engineers Zhenan Bao and Yeongin Kim, the biologically inspired system sets out to simplify machine-based touch sensing. Instead of having robots and prostheses laden with heavy, power-draining centralized computers to crunch data acquired by pressure sensors, the team aims to handle touch more the way nature does.

As Bao, Kim, and their colleagues explain in their paper, the idea is to develop touch-receptive skin whose signals are processed locally by analogs of neurons, which in turn fire signals along channels that behave like analogs of synapses. In a prosthesis, these signals then would enter the residual nerve fibers of amputees to actuate muscles; in a robot, they would help the droid learn about what it is touching.

The aim of the researchers, using what Stanford calls a 'neuromorphic' approach, is eventually to provide tactile sensations to people wearing hand and arm prostheses. In the short term, they hope to give robots the ability to perceive the consistency, texture, and shape of objects in their environments through touch, just as humans do.

Touch is a capability of robots that really needs to improve if they are to take on the roles expected of them, particularly in healthcare and rehabilitation work, says Kerstin Dautenhahn, incoming Chair of Intelligent Robotics at the University of Waterloo in Ontario, Canada. "Generally, a sense of touch is very, very important for robots performing physical manipulation; they have to be able to tell an egg from a brick," she says.

"In a healthcare situation, for instance, when a robot might have to turn a patient in bed, you don't want it to bruise or injure them. In something as seemingly trivial as shaking hands, they must not squeeze too hard and must be sensitive, especially if it is a child or older person," Dautenhahn says. "So sensing touch precisely and acutely is key. It's quite an elaborate task, and we don't yet have  robust ways to do it, so we need better, cheaper, lower-power and more robust ways to sense touch."

That's precisely the kind of capability the Stanford team is hoping to create with its flexible artificial nervous system—which it developed in conjunction with materials scientists at Seoul National University in South Korea, and thin-film transistor researchers at Nankai University in Tianjin, China.

The Stanford technology comprises three essential building blocks: a resistive pressure sensor, a neuronal computation circuit that makes decisions about the touch event, and a synaptic transmission channel. Crucially, all are based on so-called 'organic' electronics, a catch-all term for circuitry built from soft, polymer-based components that can be inkjet-printed on a single flexible elastomer base.

Stanford's resistive pressure sensor consists of a flexible array of receptors that can register a wide range of contact pressures, from a barely noticeable brush to a heavy push. It has a flat carbon-nanotube-based electrode on one side of an elastomer film, and a gold electrode on the other side; the density of the deposited gold particles defines the artificial skin's touch resolution, says Kim.

Different touch instances dynamically change the electrical resistance between the carbon and gold layers at different points, creating intermittent signals that are fed to the neuronal layer in the rubbery artificial nerve. The neuron circuits are designed to fire once they have collected enough information about the current touch episode to make a decision about, say, the texture, direction of motion, slip, or softness of the surface touching the sensor.
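
This "accumulate evidence, then fire" behavior is reminiscent of the textbook leaky integrate-and-fire neuron. The sketch below shows the idea in Python; the threshold, leak, and gain values are assumptions, and the model is a standard illustration, not the organic circuit the team fabricated.

```python
# Leaky integrate-and-fire neuron: a textbook stand-in for the
# "accumulate evidence, then fire" behavior described above.
# Threshold, leak, and gain are assumed values, not circuit parameters.

class LIFNeuron:
    def __init__(self, threshold: float = 1.0,
                 leak: float = 0.9, gain: float = 0.5):
        self.threshold = threshold  # potential required to fire a spike
        self.leak = leak            # fraction of potential kept per step
        self.gain = gain            # input-to-potential conversion factor
        self.potential = 0.0

    def step(self, input_v: float) -> bool:
        """Integrate one readout sample; return True if the neuron fires."""
        self.potential = self.leak * self.potential + self.gain * input_v
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after the spike
            return True
        return False
```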

Next, that electrical signal has to be relayed by a synapse-mimicking transistor, which sends it in the same format human nerves use, either to a robot's feedback system, allowing it to respond to what it has touched with a reflex, or into a residual nerve of, say, a prosthesis wearer.
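
Chaining the two sketches above gives a toy version of the full sensor-to-neuron-to-synapse pipeline, with the synaptic stage reduced to a callback that fires a "reflex," loosely analogous to the twitch of the roach leg. It reuses readout_voltage() and LIFNeuron from the earlier snippets and is, again, only an illustrative composition, not the team's hardware.

```python
# Toy end-to-end pipeline: pressure -> sensor voltage -> neuron spikes
# -> reflex callback (a stand-in for the synaptic transistor's output).
# Reuses readout_voltage() and LIFNeuron from the sketches above.

def run_pipeline(pressures, on_spike):
    neuron = LIFNeuron()
    for t, p in enumerate(pressures):
        if neuron.step(readout_voltage(p)):
            on_spike(t)

# At rest the potential decays without firing; a firm press drives the
# neuron over threshold and produces a sustained spike train.
run_pipeline([0.0] * 5 + [50.0] * 20,
             on_spike=lambda t: print(f"reflex fired at step {t}"))
```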

To show this signal injection into nerves works, the researchers demonstrated it on a cockroach. "Stimulating motor neurons and muscles has been done for centuries, but that involved only voltage generation," Kim says of experiments like Galvani's. "We wanted to show that our artificial sensory nerves are biocompatible and can replace part of a biological nervous system."

In another key lab test, the artificial nerve has been used successfully to read the tiny bumps of Braille, the tactile writing system for the visually impaired. Bao sees this as a key achievement that proves the technology's worth: "We take our skin for granted, but it's a complex sensing, signaling and decision-making system. The artificial sensory nerve is a step toward making skin-like sensory neural networks for all sorts of applications," she says.

Because their artificial nerve processes touch sensing in a biomimetic way, the researchers hope roboticists will use it to "generate reflexes that help a robot move more like an animal or a human," says Kim.

Observers are impressed by the early-stage technology's prospects.

In a commentary on the Stanford paper, Chiara Bartolozzi, a humanoid robotics specialist at the Italian Institute of Technology in Genoa, Italy, says Stanford's artificial nerve should ultimately be easy to manufacture and deploy. "The proposed system exploits organic electronics that allows for 3D (three-dimensional) printing of flexible structures that conform to large curved surfaces, as would be required for placing sensors on robots and prostheses," she says.

That means roboticists could 3D-print intricately shaped sensing skins to fit over a robot's arm or hand, and print neural and synaptic circuitry in just the right shape to tuck into the nooks and crannies within the limbs. In this way, the biomimetic simplicity of the approach gains physical, as well as computational, advantages.

For the kinds of improvement in human-robot interaction Dautenhahn seeks, printable nerve technology might indeed "mean easier deployment of tactile capabilities in robots," says human-robot interaction (HRI) researcher Thomas Arnold of Tufts University in Medford, Massachusetts. At the annual ACM Human-Robot Interaction conference (HRI 2018) in Chicago in March, Arnold and colleague Matthias Scheutz presented research finding that, in general, robots that touch people in the course of working with them seem more engaging, more attentive, and more like teammates.

However, that does not give carte blanche for new breeds of tactile robots to get touchy-feely, Arnold warns; just as with human-human interaction, there are norms for what is appropriate, versus inappropriate, touching of others.

"Designing robots for touch will entail more than the technical challenges of subtle, sensitive physical contact," Arnold says. "Tactile robots will be judged in light of moral and social norms as well."

Paul Marks is a technology journalist, writer, and editor based in London, U.K.
