As drones have matured into smarter and more practical machines, they have hummed, buzzed, and whirred their way into industries as diverse as movie production, agriculture, civil engineering, and insurance. It is entirely clear that autonomous drones will play a prominent role in business in the coming years. Firms such as Amazon, FedEx, and Uber have experimented with the technology to deliver packages, food, and more, while military agencies, emergency responders, gaming companies, entertainment firms, and others have explored other possibilities.
“Drones introduce far more efficient ways to accomplish some tasks,” says Todd Curtis, president of AirSafe.com, a site that tracks drone and other aeronautic technologies.
Powering more advanced drones are more sophisticated onboard sensors and processors, better artificial intelligence (AI) algorithms, and more advanced controllers and communication systems. In addition, engineers are packing greater numbers of sensors into drones, and using them in different combinations, to create greater “awareness” of the surrounding environment. This sensing, when combined with GPS and other navigation capabilities, allows drones to tackle more advanced autonomous tasks; it also enables devices that explore caverns or other hard-to-reach spaces, as well as underwater drones that conduct research by scanning the oceans.
Yet, despite rapidly evolving capabilities, it also is clear that autonomous drones have not completely mastered the art and science of navigating and accomplishing their designated tasks. Buildings, birds, power lines, trees, and people remain formidable obstacles for autonomous unmanned aerial vehicles (UAVs), as they are known. Fog, snow, smoke, and dust present additional challenges.
It is one thing to showcase a drone in a controlled environment; it is quite another to have it operate flawlessly in the wild. UAVs must have near-perfect vision and sensing, the ability to navigate areas where satellite and communications signals cannot reach, and backup and fail-safe systems that can take control of the drone if and when something goes astray.
“We are seeing remarkable advances in onboard sensing and processing, but also the use of far more sophisticated AI algorithms in drones,” says Nathan Michael, associate research professor at the Robotics Institute of Carnegie Mellon University. “These navigation and control systems are moving drones beyond the basic ability to fly from Point A to Point B. They’re making it possible for drones to understand the world around them and make complex decisions in real time.”
Drones Take Flight
Engineering a fully autonomous drone is rife with challenges—particularly in busy and complex urban areas.
First, UAVs are not like the autonomous vehicles that operate on land; they face extreme space and weight restrictions. Whereas a car can potentially have dozens, even hundreds, of sensors mounted across its surface, a drone can accommodate the weight of only a few.
Second, UAVs move in almost every direction in a three-dimensional (3D) space, while a motor vehicle operates on a two-dimensional plane. This makes designing software and algorithms for UAVs exponentially more complex.
Finally, the simple fact that these machines are suspended in the air and constantly moving introduces additional challenges and risks.
Today, most UAVs operate on a line-of-sight basis. Essentially, a person uses a transmitter, typically operating in the 2.4GHz frequency band, to communicate with and control the drone’s onboard computer. However, for drones to become truly autonomous, operate at high speeds, and ultimately become a commercially viable tool, onboard systems need to operate independently of humans (at least the vast majority of the time). This requires a dozen or more onboard sensors, such as cameras that work in both the visible and infrared spectra, LIDAR (light detection and ranging), or multi-spectral cameras; more advanced algorithms for understanding a wide range of environmental conditions; and sophisticated navigation systems that allow UAVs to sense their position more precisely.
There is also a need for improved safety systems—particularly in crowded urban areas. “Currently, drone companies add redundant propellers to avoid crashing. More advanced technology is necessary,” says Davide Scaramuzza, director of the Robotics and Perception Group at the University of Zurich in Switzerland.
At drone manufacturing firms and in research labs, the next generation of drone controls and navigation systems is taking shape. Engineers and computer scientists are taking aim at various challenges, including how to process visual information at speeds approaching 100 mph (160 kph), how to teach UAVs to react to unknown obstacles, what to do if a drone does not know how to respond to a given situation, and how to take over the controls of malfunctioning, rogue, or dangerous drones that may pose a threat. Not surprisingly, many of the decisions involve trade-offs. For example, it is already possible to fly an autonomous drone that has a very low probability of colliding with objects or crashing, as long as it flies at a very slow speed.
At the center of the challenge is simultaneous localization and mapping (SLAM). Eric Amoroso, cofounder of KEF Robotics, a drone company that captured first place in a qualification round for a 2019 Lockheed Martin UAV challenge, says inaccuracies in sensing and processing algorithms necessitate multiple onboard systems to robustly “see” what is going on around the drone: as many as a dozen conventional cameras, along with vision sensors using technologies such as SWIR (short-wave infrared), MWIR (medium-wave infrared), LWIR (long-wave infrared), LIDAR, and radar.
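To see why redundancy helps, consider a bare-bones illustration of sensor fusion: if each sensor reports a noisy range to the same obstacle, weighting each reading by its precision produces a more trustworthy estimate than any single device gives on its own. Here is a minimal sketch in Python; the sensors, readings, and noise figures are invented for illustration.

```python
import numpy as np

# Hypothetical readings: range to the same obstacle (in meters) from
# three different sensors, each with its own noise level (std. dev.).
readings = {                      # (measurement_m, sigma_m)
    "stereo_camera": (12.4, 0.8),
    "lidar":         (12.1, 0.1),
    "radar":         (11.6, 0.5),
}

# Inverse-variance weighting: trust each sensor in proportion to its
# precision. A degraded sensor (huge sigma) fades out of the estimate
# automatically, which is the point of carrying redundant hardware.
weights = {name: 1.0 / sigma**2 for name, (_, sigma) in readings.items()}
total = sum(weights.values())
fused = sum(w * readings[name][0] for name, w in weights.items()) / total

print(f"fused range estimate: {fused:.2f} m")  # dominated by the precise LIDAR
```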
What comes naturally to pilots when watching a UAV video stream (depth of field and the localization of both static and dynamic objects) is considerably more difficult for today’s smartest UAVs. Consequently, researchers are continuing to experiment with different combinations of sensors and SLAM algorithms to achieve reliable sight in cluttered environments. This includes stereoscopic vision and associated algorithms that help a drone gain depth of field and better understand relationships between and among objects, including other moving drones.
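The geometry behind stereoscopic vision is straightforward: depth is inversely proportional to the disparity between where a point lands in the left and right images. A minimal sketch of that relationship, assuming OpenCV is available, using a synthetic image pair and an invented focal length and baseline:

```python
import numpy as np
import cv2  # assumes the opencv-python package is installed

# Synthetic stereo pair: a random texture, with the right image shifted
# left by a known disparity, as if the whole scene sat at one depth.
rng = np.random.default_rng(0)
left = (rng.random((240, 320)) * 255).astype(np.uint8)
true_disp = 8                                   # pixels
right = np.roll(left, -true_disp, axis=1)

# Classic block matching: for each patch in the left image, search for
# the best horizontal match in the right image.
matcher = cv2.StereoBM_create(numDisparities=16, blockSize=15)
disp = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point

# depth = focal_length * baseline / disparity (both values invented here)
f_px, baseline_m = 400.0, 0.10
valid = disp > 0
print(f"median disparity: {np.median(disp[valid]):.1f} px "
      f"(expected ~{true_disp}), "
      f"median depth: {np.median(f_px * baseline_m / disp[valid]):.2f} m")
```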
Equipping drones with vision and sensing capabilities that operate at the speed of flight is only part of the navigation challenge, however. There is also a need to ensure that a drone can process visual images quickly enough and make intelligent decisions in real time. Microprocessor and component manufacturers have introduced highly specialized chips that use increasingly powerful graphics processing units (GPUs) and accelerator chips to reduce visual processing time to milliseconds. Yet, further improvements are needed. For now, pilots can detect operational anomalies and react more quickly than an autonomous UAV. The ultimate objective for drone manufacturers is to push the devices’ reaction time to the level of professional pilots so they can perform on par with humans, or perhaps even exceed them.
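The arithmetic of the problem is unforgiving. A quick back-of-the-envelope calculation (illustrative figures only) shows how far a drone travels while its perception pipeline is still processing the previous frame:

```python
# Distance flown while the perception pipeline is still processing the
# previous frame. All figures are illustrative.
speed_kph = 160                    # roughly the 100 mph cited above
speed_ms = speed_kph / 3.6         # meters per second (~44.4)

for latency_ms in (200, 50, 10):   # human-ish, embedded GPU, aspiration
    blind_m = speed_ms * latency_ms / 1000.0
    print(f"{latency_ms:4d} ms of latency -> {blind_m:5.2f} m traveled blind")
```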
Machine learning will certainly make UAVs smarter and more agile, but it cannot completely solve the speed and latency problem. Moreover, better algorithms cannot anticipate every possible scenario or obstacle the world can toss at a drone. Ultimately, a UAV must be able to react to external events and avoid collisions while staying on course and accomplishing its intended task. Says Amoroso, “While a drone will likely not have the understanding a pilot has of the behavior of everyday objects, it nonetheless must react appropriately and quickly to avoid situations where it can cause harm to others or itself. Maybe the drone doesn’t understand that branches can fall, or doors can open, but if given a robust enough SLAM system, it will still be able to navigate itself safely under such environmental disturbances.”
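One classic way to react to obstacles without understanding what they are is a reactive potential-field controller: the goal pulls the drone forward, and anything the mapping system reports nearby pushes it away. A toy two-dimensional sketch, with all gains and distances invented:

```python
import numpy as np

def avoid_step(pos, goal, obstacles, safe_radius=2.0, k_att=1.0, k_rep=4.0):
    """One step of a toy potential-field controller (2D, invented gains).

    The drone needs no model of what an obstacle is: anything the SLAM
    system reports inside safe_radius simply pushes the commanded
    velocity away from it, while the goal keeps pulling forward.
    """
    force = k_att * (goal - pos)                 # attraction toward the goal
    for obs in obstacles:
        delta = pos - obs
        d = np.linalg.norm(delta)
        if 1e-6 < d < safe_radius:               # repulsion when too close
            force += k_rep * (1.0 / d - 1.0 / safe_radius) * delta / d**2
    speed = np.linalg.norm(force)
    return force / speed if speed > 1.0 else force   # cap commanded speed

pos, goal = np.array([0.0, 0.0]), np.array([10.0, 0.0])
obstacles = [np.array([5.0, 0.1])]               # e.g., an unexpected branch
for _ in range(60):
    pos = pos + 0.2 * avoid_step(pos, goal, obstacles)
print(f"final position: {pos.round(2)}")         # near the goal, having detoured
```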
Gaining Direction
Although GPS technology allows most drones to operate effectively most of the time, a dependence on satellites is not ideal, or even adequate, for companies looking to use UAVs for specialized commercial purposes. Objects such as buildings, trees, or mountains might temporarily block signals. GPS also does not deliver the level of performance and precision needed when many drones operate autonomously close together. Without additional vision sensors and onboard navigation systems, collisions could occur, or drones might simply cease doing what they are supposed to do.
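In practice, an autopilot typically guards against this with a simple health check before trusting a GPS fix, falling back to another navigation source when the fix degrades. A hypothetical sketch of that logic; the thresholds are invented, not drawn from any particular autopilot:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GpsFix:
    num_satellites: int
    hdop: float   # horizontal dilution of precision (lower is better)

def position_source(fix: Optional[GpsFix]) -> str:
    """Choose a navigation source; thresholds are illustrative only."""
    if fix is None:
        return "visual-inertial"       # no signal at all (cave, deep valley)
    if fix.num_satellites < 6 or fix.hdop > 2.0:
        return "visual-inertial"       # a fix exists, but it is too sloppy
    return "gps"

print(position_source(GpsFix(num_satellites=9, hdop=0.9)))   # gps
print(position_source(GpsFix(num_satellites=4, hdop=3.5)))   # visual-inertial
print(position_source(None))                                 # visual-inertial
```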
More advanced UAVs now incorporate visual-inertial navigation systems (VINS). These systems rely on onboard cameras and inertial measurement units (IMUs) to track a drone’s location when GPS signals are weak or nonexistent, such as in caves or deep valleys. Essentially, they work by detecting and tracking interest points across images and using them as anchor points for the robot to orient itself, Scaramuzza says. In a sense, the drone is mapping territory and using the map as it moves over land, within caves, or underwater. However, this, too, has limitations, since some environments change quickly.
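The first steps of that pipeline, detecting interest points and tracking them between frames, can be sketched in a few lines, assuming OpenCV is available. The frames here are synthetic, and a real VINS would fuse the resulting image motion with IMU data:

```python
import numpy as np
import cv2  # assumes the opencv-python package is installed

# Two synthetic frames: the second is the first shifted by a known
# amount, standing in for camera motion between video frames.
rng = np.random.default_rng(1)
tex = (rng.random((240, 320)) * 255).astype(np.uint8)
frame1 = cv2.GaussianBlur(tex, (5, 5), 0)        # give corners some structure
frame2 = np.roll(frame1, (3, 5), axis=(0, 1))    # move 5 px right, 3 px down

# Step 1: detect interest points (corners) in the first frame.
pts1 = cv2.goodFeaturesToTrack(frame1, maxCorners=200,
                               qualityLevel=0.01, minDistance=7)

# Step 2: track those same points into the second frame (optical flow).
pts2, status, _ = cv2.calcOpticalFlowPyrLK(frame1, frame2, pts1, None)

# Step 3: the median point motion approximates the image motion; a real
# VINS fuses this with IMU data to recover the drone's own motion.
flow = (pts2 - pts1)[status.ravel() == 1]
print(f"estimated shift: dx={np.median(flow[:, 0, 0]):.1f}, "
      f"dy={np.median(flow[:, 0, 1]):.1f} px")   # expect ~dx=5, dy=3
```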
Completely autonomous drones would require a combination of sensors, navigational capabilities, and communications links that push beyond current technology. They may also require new battery recharging systems—on the ground and in flight. Experts believe truly independent UAVs will take to the skies within the next few years, as further advances in computing hardware and software take place. Yet, in some cases, keeping humans in the flight loop may be desirable. This would likely include dangerous situations such as transporting a bomb, sending a drone into an unknown space such as a subterranean environment, or managing swarms of drones in highly cluttered airspaces.
Then there’s the need to create fail-safe systems to prevent UAV crashes. One solution, Amoroso says, is installing anomaly detection systems that alert a human to intervene when the drone can’t navigate or operate normally. Another approach would be to place emergency beacons in commercial drones; if the UAV bumps into an object, it generates an alert or notification. Still another remedy, Curtis says, is programming malfunctioning drones to head to a safe space or simply to land until they can receive further instructions. Regardless of the specific approach, Carnegie Mellon’s Michael says that any procedure leading to a human taking control of the system must be very well thought out. “Relying on a human to suddenly make an instantaneous decision could lead to potentially unsafe results,” he cautions.
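These ideas can be combined in a small supervisory state machine: fly normally, alert a human when an anomaly appears, and head for a safe landing if nobody responds in time. A hypothetical sketch in which the states, scores, and timeouts are all invented:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()
    ALERT = auto()        # anomaly detected, human operator notified
    SAFE_LAND = auto()    # head to a safe spot and await instructions

def next_mode(mode, anomaly_score, human_responded,
              threshold=0.7, alert_timeout_s=10, time_in_alert_s=0):
    """Toy supervisory logic; all names and thresholds are invented."""
    if mode is Mode.NORMAL and anomaly_score > threshold:
        return Mode.ALERT                 # beacon fires, operator is paged
    if mode is Mode.ALERT:
        if human_responded:
            return Mode.NORMAL            # deliberate, not instantaneous, handover
        if time_in_alert_s > alert_timeout_s:
            return Mode.SAFE_LAND         # no answer: land and wait
    return mode

mode = Mode.NORMAL
mode = next_mode(mode, anomaly_score=0.9, human_responded=False)  # -> ALERT
mode = next_mode(mode, 0.9, False, time_in_alert_s=12)            # -> SAFE_LAND
print(mode)
```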
Yet the field is advancing, and even taking new directions. At the University of California, Riverside, researchers have experimented with combining cellular signals and Wi-Fi to augment or replace satellite signals. At the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory, researchers are using virtual reality to train drones, and are building more robust algorithms by running virtual drones through simulations. Another team at the university has produced a mapping system called NanoMap that uses a depth-sensing system to stitch together ongoing measurements of the drone’s immediate surroundings. This allows a single UAV, and theoretically a team of drones, to not only adapt motion and movement within the current field of view, but also anticipate how to move in hidden fields of view that it has already encountered.
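The core idea behind NanoMap, as described publicly, can be illustrated with a toy rolling history of pose-tagged depth snapshots. This is not MIT’s code, and it omits the pose-uncertainty modeling that makes the real system work:

```python
from collections import deque
import numpy as np

class RollingDepthHistory:
    """Toy stand-in for the NanoMap idea: keep the last N depth
    'snapshots', each tagged with the pose at capture time, so a query
    can fall back on older views when the current one doesn't cover the
    space in question. The real system also models pose uncertainty,
    which is omitted here."""

    def __init__(self, max_frames=10):
        self.frames = deque(maxlen=max_frames)   # old snapshots age out

    def add(self, pose_xy, points_xy):
        self.frames.append((np.asarray(pose_xy), np.asarray(points_xy)))

    def nearest_obstacle(self, query_xy):
        query = np.asarray(query_xy)
        best = np.inf
        for _, pts in self.frames:               # search new and old views
            if len(pts):
                best = min(best, np.linalg.norm(pts - query, axis=1).min())
        return best

hist = RollingDepthHistory()
hist.add(pose_xy=(0, 0), points_xy=[(2.0, 1.0)])  # obstacle seen earlier
hist.add(pose_xy=(1, 0), points_xy=[])            # now outside the field of view
print(hist.nearest_obstacle((2.0, 0.5)))          # 0.5: still remembered
```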
Meanwhile, the U.S. Defense Advanced Research Projects Agency (DARPA) is working on UAVs that require no GPS, but fly at speeds up to 45 mph (72 kph). The devices will use sophisticated onboard mapping technology to remember places and things they have encountered. According to DARPA, the system could be used on the battlefield, and to rescue victims of natural disasters.
Into the Air
Researchers continue to explore ways to take autonomous drones to a higher level. This undoubtedly will revolve around better and more responsive cameras, faster and better image processing, and ongoing improvements in AI. For instance, Scaramuzza is focused on developing event-driven cameras with bio-inspired vision sensors that see only the motion in a scene. These smart pixels would reduce the processing load on the drone and allow it to focus on only the most important motion and activity. Such a camera would deliver high dynamic range at low power, even in low-light conditions, while greatly reducing motion blur and latency. “I foresee that drones will become smarter and smarter and more and more situationally aware,” he says.
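To make the contrast with frame-based cameras concrete: an event camera emits a sparse stream of per-pixel change events rather than full images, so a static scene produces almost no data to process. A toy sketch of consuming such a stream, using synthetic events rather than a real sensor driver:

```python
import numpy as np

# Synthetic event stream: (x, y, t_microseconds, polarity). An event
# camera emits these only where brightness changes, i.e., where there
# is motion; a static background generates no data at all.
events = np.array([
    (10, 12, 100, +1), (11, 12, 140, +1), (12, 12, 180, +1),  # moving edge
    (12, 13, 220, -1), (13, 13, 260, +1),
])

# Accumulate recent events into a sparse "motion image".
H, W = 32, 32
motion = np.zeros((H, W))
t_now, window_us = 300, 200
for x, y, t, polarity in events:
    if t_now - t <= window_us:               # keep only fresh events
        motion[int(y), int(x)] += polarity

active = np.argwhere(motion != 0)
print(f"{len(active)} active pixels out of {H * W}")  # a tiny fraction to process
```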
Blending and optimizing existing technologies, along with increased processing power, better batteries, and improved algorithms, will result in additional gains, Michael argues. Part of the solution might also include mesh communication networks that use the collective intelligence of the group to teach and update individual drones. This might best be described as real-time and collaborative machine learning. “The more the drones fly, the more experience they acquire. The more experience they acquire, the more they become high-performance machines. This makes them better equipped to navigate and mitigate challenging conditions,” he says.
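Michael’s description maps loosely onto what the machine learning community calls federated averaging: each drone learns from its own flights, and the fleet periodically averages its model parameters over the mesh. A hedged toy sketch of one such round, with the model reduced to a bare parameter vector:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each drone carries the same model; here a "model" is just a parameter
# vector. Every drone nudges its own copy based on its flight experience.
fleet = [rng.normal(size=4) for _ in range(5)]    # 5 drones, 4 parameters

def local_update(params, experience_grad, lr=0.1):
    return params - lr * experience_grad          # one local learning step

def mesh_average(fleet_params):
    """One sharing round over the mesh: every drone inherits the average
    of what the whole group has learned (federated averaging)."""
    avg = np.mean(fleet_params, axis=0)
    return [avg.copy() for _ in fleet_params]

for _ in range(3):                                # fly, learn, share, repeat
    fleet = [local_update(p, rng.normal(size=4)) for p in fleet]
    fleet = mesh_average(fleet)

print("shared model after 3 rounds:", fleet[0].round(3))
```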
“We’re moving toward a level of sophistication where onboard sensing systems and machine learning will create an environment that makes it possible to step beyond basic navigation and create machines that use deliberate and intelligent decision-making. These systems, including groups of drones, will improve and get smarter over time,” Michael says. “We’re approaching an inflection point where drones will move past the novelty stage and become another capable system that can be used for a wide variety of purposes.”