
Welcome the Plants That Move on Their Own

Vincross' plant-robot hybrid.
Researchers are creating cybernetic life-forms that combine plants with robotics, which could be used in a variety of applications.

Robotics entrepreneurs are designing interfaces that enable everyday houseplants to move on wheels, amble on robotic crab legs, and more.

Researchers at the Massachusetts Institute of Technology (MIT), for example, have designed an interface that enables a houseplant atop computerized wheels to command those wheels to move it towards light. The result is a startling display: when the light is switched on, the wheeled houseplant moves towards it within seconds, rather than over the course of a few days.

The researchers created the 'cybernetic life-form' by attaching electrodes that monitor the plant's chemical reactions and signal the computerized wheels to move when the plant detects light.
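At its core, the setup is a sense-act loop: sample the electrode signal, decide whether the plant has registered the light, and if so drive the wheels. The sketch below illustrates such a loop in Python; read_electrode_mv, light_detected, and drive_toward_light are hypothetical placeholders, not the MIT prototype's actual code.

```python
# Minimal sketch of the sense-act loop described above; the three callbacks
# are hypothetical placeholders, not the MIT prototype's actual software.
import time

def plant_drive_loop(read_electrode_mv, light_detected, drive_toward_light,
                     window_s=5.0, period_s=0.1):
    samples = []
    while True:
        samples.append(read_electrode_mv())             # millivolt-level electrode reading
        samples = samples[-int(window_s / period_s):]    # keep a short rolling window
        if light_detected(samples):                      # plant signal indicates a light change
            drive_toward_light()                         # wheels carry the plant toward the light
        time.sleep(period_s)
```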

"The agency of such movements rests with the plant," says Harpreet Sareen, an assistant professor at the Parsons School of Design who developed the experiment with MIT professor Pattie Maes.

Similar research is under way by Sun Tianqi, founder of Beijing-based robotics company Vincross, who engineered a plant atop six robotic legs that crawls towards a light source (and crawls away from the light when it has had its fill).

Tianqi says he developed motions for the crab-like plant robot using Vincross' MIND SDK. "In fact, for many of the motions, I only need to call the motion library of the MIND SDK, so it is very simple to develop," he says. He also used a distance sensor, accelerometer, soil moisture sensor, and 720p camera.
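The behavior Sun describes amounts to a light-seeking control loop: measure the light, crawl toward the source until the plant has had its fill, then back away. A minimal Python sketch of that logic appears below; the thresholds and the read_directional_lux/walk/idle callbacks are illustrative assumptions, not the MIND SDK's actual motion library.

```python
# Minimal sketch of a phototropic seek/retreat loop; thresholds and callbacks
# are assumptions for illustration, not the Vincross MIND SDK's API.
import time

SEEK_LUX = 300      # below this, the plant "wants" more light (assumed value)
RETREAT_LUX = 900   # above this, it has had its fill (assumed value)

def phototropic_loop(read_directional_lux, walk, idle, period_s=1.0):
    """read_directional_lux() -> {heading_degrees: lux}; walk(heading) steps
    the robot toward that heading; both are hypothetical placeholders."""
    while True:
        lux_by_heading = read_directional_lux()
        heading, lux = max(lux_by_heading.items(), key=lambda kv: kv[1])
        if lux < SEEK_LUX:
            walk(heading)                   # crawl toward the brightest direction
        elif lux > RETREAT_LUX:
            walk((heading + 180) % 360)     # back away once light is plentiful
        else:
            idle()                          # comfortable: stay put
        time.sleep(period_s)
```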

"Originally, I did not plan to commercialize," Tianqui says, "but based on the interest I'm hearing, I see that people are interested in adding robotics into their homes with a product such as this. So maybe we better get building."

Yet another research project in the same space is flora robotica, which is funded by the European Union's Horizon 2020 research and innovation program and coordinated by Heiko Hamann, a professor of service robotics at Germany's University of Luebeck. That ongoing experiment uses a combination of sensors and computers to grow plants into shapes dreamed up by the researchers. The purpose of the project is to investigate symbiotic relationships between robots and plants, "and to explore the potentials of a plant-robot society able to produce architectural artifacts and living spaces," according to the project website.

Hamann says the team uses an everyday Raspberry Pi single-board computer programmed with Python to pull off its plant/machine collaboration. "In addition, we use a number of state-of-the-art machine learning tools, such as LSTM [long short-term memory] networks and neuroevolution, to automatically generate data-driven holistic plant models and robot controllers."
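flora robotica's own code is not published here, but a data-driven plant model of the kind Hamann mentions can be as simple as an LSTM trained to predict the next sensor reading from a short history. The sketch below is a generic illustration using Keras; the sensor features, window length, and training data are assumptions.

```python
# Minimal sketch of an LSTM-based plant model (not flora robotica's code):
# predict the next sensor reading from a window of past readings.
import numpy as np
import tensorflow as tf

TIMESTEPS, FEATURES = 24, 3   # e.g., 24 hourly readings of [light, moisture, growth] (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.LSTM(32),          # summarizes the recent sensor history
    tf.keras.layers.Dense(FEATURES),   # predicts the next reading
])
model.compile(optimizer="adam", loss="mse")

# Random data stands in for logged sensor windows.
X = np.random.rand(256, TIMESTEPS, FEATURES).astype("float32")
y = np.random.rand(256, FEATURES).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

# A robot controller could compare the predicted growth with the desired
# shape and switch grow-lights on or off to steer the plant accordingly.
next_reading = model.predict(X[:1], verbose=0)
```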

With MIT's 'houseplant-on-wheels,' Sareen says one of the greatest challenges was to translate the plant's bioelectric reaction to light, given that the reaction only generates an electric signal of a few millivolts. The solution: "We use two levels of operational amplifiers and instrumentation amplifiers to make these signals detectable, along with hardware signal filtering to isolate electrical spikes.

"We also isolate the plant for a day, not varying anything in terms of external environment, and then only change the light. This isolation helps us get a single signal which can be co-related to change of light."

Sareen and Maes are now looking into developing a commercial kit based on their research, which scientists and others could use to manipulate plant electrochemical reactions. In the longer term, they'd like to develop a variety of commercial kits that would capture and manipulate a full spectrum of electrochemical signals naturally generated by plant and animal life.

"It is a piece of thought-provoking work," says Ping Ma, an assistant professor of statistics at the University of Georgia, of Sareen and Maes' work.  "The research manifests a perfect merger of natural intelligence" and machine technology.

Manoj Karkee, an associate professor in the Biological Systems Engineering Department of the Center for Precision and Automated Agricultural Systems at Washington State University, agrees: "This work is intuitive and unique, something that has a huge potential for commercial adoption."

Computer scientists and others working in the space anticipate attempts to combine plant life with machines in many different applications. 

Washington State's Karkee, for example, envisions scientists leveraging plant robotics for practical purposes, such as employing robots to sense when new branches emerge on fruit trees.

Meanwhile, the University of Luebeck's Hamann says his team is looking into practical ways to commercialize flora robotica, such as harnessing the technology to grow plants on the walls of buildings and away from doorways and windows. "Another idea could be to target the consumer market with a device that grows plants in your living room without you having to care for it. You could even define the direction to which it should grow or not grow."

"We will see robotics fused with many different subparts of science to facilitate our daily lives and help build a sustainable community," says Laura Blumenschein, a Ph.D. candidate with the Collaborative Haptics and Robotics in Medicine (CHARM) Lab at Stanford University, agrees:  "I think that before long, we will have robots that can grow useful structures on demand or that can 'live' with plants in order to monitor soil condition, weather, and other phenomena over long stretches of time."

Adds Kasper Stoy, a robotics and embodied artificial intelligence researcher serving as associate professor in the Software and Systems Section of the IT University of Copenhagen, "I think it would be interesting to see technology and plant life forming a symbiotic relationship with each other to extend the lifetime of both and the usefulness of both."

Josh Bongard, an associate professor in the College of Engineering and Mathematical Sciences of the University of Vermont, sees the ultimate fusion of plants and machines incorporating many of the most invigorating recent advances in computer science. "Whatever the 'killer app' in five years' time will be, it will probably be some combination of the cloud, brain-computer interaction, and robots fused with organic material," Bongard says.

"Life grows exceedingly complex, yet robust forms from simple seeds, or eggs, while machines can do certain tasks very, very quickly and accurately," he adds. "Discovering how to combine these two systems to create hybrid systems that are better than either alone is a daunting yet exciting prospect."

Equally hopeful about plant robotics R&D is the University of Luebeck's Hamann, although he is more conservative about how quickly the technology will develop. "In a short period of five years, I wouldn't expect too much change," Hamann says. "We are still at the beginning; we have just created the basic methodology."

Still, says University of Vermont's Bongard, "It's only a few short steps from plants and robots to deep philosophical questions about the nature of thought."

Joe Dysart is an Internet speaker and business consultant based in Manhattan, NY, USA. 
