As drones have flown off the drawing boards and into the skies, it has become clear that most of these devices and systems emulate the way existing aircraft, such as airplanes and helicopters, fly. They rely on the basic principles of thrust, lift, and drag to move through the air and maneuver into soft landings. However, as researchers and scientists work to develop next-generation drones that can fly in an automated or autonomous fashion, handling everything from delivering packages to inspecting buildings and bridges, their focus is on improving flight systems and software to navigate a complex world.
Not surprisingly, scientists are increasingly turning to nature for inspiration. Some are taking a close look at how birds fly, while others are developing systems inspired by insect eyes. The common denominator? "It's difficult to say exactly how drones will evolve and what specific systems they will use to fly more effectively, but obstacle avoidance is a very important part of the picture," says Eric Johnson, Lockheed Martin Associate Professor of Avionics Integration at the Georgia Institute of Technology School of Aerospace Engineering.
One researcher with an eye on nature is David Lentink at Stanford University. The assistant professor of mechanical engineering, along with a team of researchers, studies how hummingbirds, swifts, and other birds fly and avoid obstacles. He points out that drones encounter the same problems birds have faced for millions of years. Lentink puts high-speed video to work to understand the biomechanics of vortex dynamics, fluid-structure interaction, and other factors. "We focus on key biological questions, which we probe with new engineering methods to find inspiration for innovative flying robots."
The inspiration is taking shape. One of the team's robotic devices, the DelFly, can stay airborne in autonomous flight mode for about 10 minutes. It emulates the flight pattern of a hummingbird and can accomplish many of the same movements, including darting forward, backward, and sideways. Another invention, the RoboSwift, is a micro-airplane with wings that fold back and change shape to fly efficiently across a range of speeds. The device flies up to 60 percent farther and faster by using this unique wing design, Lentink says.
Another researcher, Steven Wiederman, head of the Visual Physiology & Neurobotics Laboratory at the University of Adelaide School of Medicine, is studying the way insects see and track their prey, and using the resulting data to build visual systems for robots, including drones. "Our interest is in developing autonomous systems — drones that will be able to distinguish and select individual features within cluttered environments," he says. Wiederman relies on electrophysiological techniques, such as recording activity from individual 'target-detecting' brain cells in dragonflies, to better understand autonomous behaviors, including collision avoidance, pursuit, and other actions.
The Adelaide team is focused on translating biological algorithms to robotic devices. Among other things, this requires pulling data from drone cameras — referred to as an "active vision system" — and sending it to a Mac Mini computer that runs a real-time simulation in MATLAB virtual reality software. The resulting algorithms are then ported to C++ and run on the robotic device. The goal, Wiederman says, is to develop electronic eyes that can distinguish between background distractions and targets and, as a result, fly more precisely. "We hope the insights gained from these electrophysiological investigations will lead to the development of robust feature discrimination in unstructured environments within the next five to 10 years," Wiederman says.
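To give a flavor of the problem the Adelaide team is working on, the toy sketch below picks out a small moving feature against a static background using simple frame differencing. This is only an illustration of the task, not the team's method; the insect-inspired detectors they study use adaptive temporal filtering in dedicated neurons, and the function name and threshold here are hypothetical.

```python
import numpy as np

def detect_small_target(prev_frame, frame, threshold=30):
    """Return (row, col) of the strongest moving feature, or None.

    Illustrative sketch only: real insect-inspired target detectors
    are far more robust to cluttered, moving backgrounds than plain
    frame differencing.
    """
    # Brightness change between consecutive frames flags motion.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    if diff.max() < threshold:
        return None  # nothing moved enough to stand out
    # Strongest change is treated as the candidate target.
    return np.unravel_index(np.argmax(diff), diff.shape)

# Toy usage: one bright pixel "moves" between two 64x64 frames.
frame_a = np.zeros((64, 64), dtype=np.uint8)
frame_b = frame_a.copy()
frame_b[10, 20] = 255
print(detect_small_target(frame_a, frame_b))  # (10, 20)
```

A real system would also have to reject whole-field motion caused by the drone's own movement, which is exactly the background-versus-target discrimination problem Wiederman describes.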
Today, drone flight systems mostly rely on the Global Positioning System (GPS), gyroscopes, pressure sensors, and accelerometers. Future systems, which will operate autonomously rather than under an operator's control, will require more and better sensors, as well as more sophisticated algorithms, Johnson points out. "We also are seeing a few systems that rely on cameras to detect motion and how the vehicle is moving in relation to the ground, or how far it is above the ground," he says. Along the way, he believes reliance on GPS will wane because it is relatively easy to jam and does not work indoors.
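The gyroscopes and accelerometers Johnson mentions are typically fused to estimate the vehicle's attitude. A minimal sketch of one common approach, a complementary filter, is shown below; the constants are hypothetical, and production autopilots usually use more elaborate estimators such as Kalman filters.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step of a complementary filter for a tilt angle.

    Trusts the gyroscope short-term (smooth but drifts) and the
    accelerometer long-term (noisy but drift-free). alpha is a
    hypothetical tuning value.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# A gyro with a constant 0.5 deg/s bias would drift without bound on
# its own; fused with a level (0 degree) accelerometer reading, the
# estimate stays bounded near zero instead.
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=0.0, dt=0.01)
print(round(angle, 2))
```

The same idea, blending a fast-but-drifting sensor with a slow-but-stable one, recurs throughout drone state estimation.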
"The key is that autonomous drones must fly in difficult places and situations — and the designs and flight systems must incorporate automated obstacle avoidance," Johnson says. There will also be a need for fault tolerance; the systems must be able to manage human and machine errors, and environmental factors ranging from wind and rain to fog and smoke. "Vehicles must be able to fly, navigate, or adapt if [the vehicle] or the environment is behaving in an unexpected way. This includes adaptive controls that will actually change the behavior of the vehicle — and the way it flies — over time," Johnson says.
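The adaptive controls Johnson describes adjust controller parameters online as the vehicle or environment changes. The toy example below illustrates the idea with an MIT-rule-style gradient update; all the numbers are hypothetical, and a real adaptive autopilot would wrap such updates in stability safeguards.

```python
# Toy adaptive controller: the plant's true gain (b) is unknown, so
# the control gain (theta) is adapted online until the output tracks
# a reference command. Hypothetical values for illustration only.
b = 2.0          # unknown plant gain (e.g., twice the expected thrust response)
theta = 0.0      # adaptable controller gain
lr = 0.1         # adaptation rate
r = 1.0          # reference command (e.g., desired climb rate)

for _ in range(200):
    y = b * theta * r        # plant response to the current control gain
    error = y - r            # mismatch with the reference command
    theta -= lr * error * r  # MIT-rule-style gradient step on the gain

print(round(b * theta, 3))   # effective loop gain approaches 1.0
```

In effect, the controller "learns" the vehicle's response and changes its own behavior over time, which is what allows adaptation to damage, payload shifts, or gusty conditions.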
Samuel Greengard is an author and journalist based in West Linn, OR.