News
Architecture and Hardware

A New Sense for Underwater Robots

Maarja Kruusmaa inspects a FILOSE robotic fish.
Maarja Kruusmaa and her colleagues at the Center for Biorobotics at the Tallinn University of Technology in Estonia have developed a way for underwater robots to better understand their environment.

Traditionally, underwater robots and drones are bulky, unintelligent, and sluggish; they sense their environment with sound via sonar or by sight via a camera, but that often gives them only a limited underwater view.

Maarja Kruusmaa, founder and director of the Center for Biorobotics at the Tallinn University of Technology in Estonia, has endowed underwater robots with a completely new sense: the artificial lateral line, an electronic organ that enables her lab's underwater robots to extract information from the water around them, and to act on it. "Just like robots on the land can map a landscape, our robots map a flowscape underwater," says Kruusmaa. "The flowscape gives the robot precise information about the pressure, pressure differences, and the acceleration of the flow."
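The quantities Kruusmaa names — pressure, pressure differences, and flow acceleration — can be illustrated with a minimal sketch. The sensor layout, function name, and simple finite-difference scheme below are illustrative assumptions, not the FILOSE implementation: it assumes an array of pressure sensors spaced evenly along the robot's body, sampled at a fixed rate.

```python
# Illustrative sketch only: turning raw pressure readings from a
# hypothetical artificial lateral line into simple "flowscape" features.
# The array layout and differencing scheme are assumptions for
# illustration, not the actual FILOSE design.

def flowscape_features(samples, sensor_spacing_m, dt_s):
    """samples: list of pressure snapshots (Pa), one per time step,
    each snapshot holding one reading per sensor, head to tail.
    sensor_spacing_m: distance between neighboring sensors (m).
    dt_s: time between snapshots (s)."""
    latest = samples[-1]
    # Pressure differences along the body: gradient between neighbors.
    gradients = [
        (latest[i + 1] - latest[i]) / sensor_spacing_m
        for i in range(len(latest) - 1)
    ]
    # Mean pressure per snapshot, a crude proxy for overall flow state.
    means = [sum(snapshot) / len(snapshot) for snapshot in samples]
    # Flow acceleration proxy: second difference of mean pressure in time.
    if len(means) >= 3:
        accel = (means[-1] - 2 * means[-2] + means[-3]) / (dt_s ** 2)
    else:
        accel = 0.0
    return gradients, accel
```

In a real system these features would be filtered against sensor noise and the robot's own motion, but the sketch shows the core idea: spatial differences across the array and temporal differences across samples together describe the local flow.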

How is this new sense organ inspired by nature?

Fish have lateral line organs, which run along the middle of their bodies. They sense the local flow and use it for schooling behavior, predation, and orientation. It took biologists a surprisingly long time to scientifically prove the function of this lateral line. At first they thought it was just for secreting slime; only in the 1960s was it recognized as one of the most advanced flow-sensing systems ever to have evolved.

Everything that moves in water leaves a wake behind. Thanks to the lateral line, even blind fish can sense objects in their vicinity. Also, the lateral line helps fish to find 'sweet spots' in the flow that are energetically favorable, so they can swim upstream with less energy. This organ has been the inspiration for the lateral line sensor in many of our underwater robots.

How do you utilize this new sense organ in robotic applications?

We have built several versions of our robotic fish, FILOSE, equipped with a lateral line sense organ. Now we use them as fish-shaped sensors to find the best way for real fish to bypass hydroelectric dams. Specially designed fish passes often don't work, and it is often unclear what bottlenecks in the flow look like from the fish's point of view. Our robotic fish can autonomously sense the flow and find which places the fish like and which they don't.

We are now taking a completely different road, by doing away with the body entirely. It is also possible to measure on a large scale with giant lateral lines measuring hundreds of meters. We recently submerged just such an array in Tallinn Bay (in Estonia); here it can be used to passively identify and track ships. Like an enormous flat fish lurking at the bottom, it feels the waves caused by passing ships.

We can use this to study erosion on the sandy bottom. Furthermore, giant lateral lines can be used to improve harbor safety. The new generation of massive ships is guided into a harbor by smaller pilot ships; these harbor pilots receive precise information about the weather, but they have almost no information about local currents. Our lateral line arrays can be installed all over the harbor and networked into a large artificial nervous system. This information can make it safer for big ships to maneuver into and out of a harbor.

Which other bio-inspired underwater robots have you built?

We also built some robots that do not have lateral line sensors and that are specially designed not to disturb the underwater environment. For example, we built a kind of turtle-robot that is small and uses four flippers; we used it to investigate the remains of a 17th century fortification in Tallinn Bay. The robotic turtle gets into places no human diver can; it is highly maneuverable, and ideal for doing underwater archaeology.

We also used it in Svalbard (Norway, in the Arctic Ocean) to study underwater biology during polar nights. People thought there was not much going on during a polar night, but our robot showed that even the polar night is full of underwater life; you just have to find a way to capture it in action.

What are the remaining challenges in underwater robotics?

For me, there are two major challenges.

The first one is autonomy. For underwater robots to become more autonomous, they have to consume as little energy as possible. My ultimate goal is to build a robot that uses no battery at all and harvests all its energy from the flow; then, we would even beat nature.

The second part of the autonomy challenge is the intelligence of the robot; we have to use the information coming from the sensors more cleverly. How do we overcome the noise in the signal? How do we deal with obstacles? What if the visibility is bad? How do we build better situational awareness?

And the second major challenge?

That is maneuvering in shallow waters.

A quarter of all people on earth live along a coastline, and therefore close to shallow water. People often think that it is a challenge for a robot to go far and deep, but that's not true. For an underwater robot, shallow water is much more challenging than the deep sea. Visibility is worse. Waves and currents are more unstable. Because of the shallowness the robot has to be small, but then it is also more easily disturbed. Actually, sending a robot to Mars is easier than sending a robot underwater.

Bennie Mols is a science and technology writer based in Amsterdam, the Netherlands.
