
Software on Mars

With the AEGIS system, the Mars Exploration Rovers can autonomously select, capture, and analyze images using onboard logic.
  1. Introduction
  2. Autonomy Introduced
  3. Best Practices Confirmed
  4. Further Reading
  5. Author
  6. Figures
Spirit rover's arm and surroundings on Mars
NASA's ill-fated Spirit used its front hazard-avoidance camera to record this forward view of its arm and surroundings during the rover's 2,052nd day on Mars (Oct. 11, 2009).

Since January 2004, scientists have been following the activities of two robotic spacecraft on Mars with the objective of obtaining data from the red planet’s soil and rocks that could offer clues about the presence of water there. The missions, launched by the U.S. National Aeronautics and Space Administration (NASA), have also, by necessity, served as testbeds for computer vision and autonomous analysis capabilities.

Among the most innovative applications NASA computer scientists have contributed to the missions is Autonomous Exploration for Gathering Increased Science (AEGIS), which analyzes images gathered by the rovers’ navigation cameras to identify features of interest, typically rocks with certain preprogrammed characteristics, without needing synchronous communication with scientists on Earth. This capability saves significant time and bandwidth on the Deep Space Network. In fact, NASA considered AEGIS so innovative that it won the agency’s Software of the Year award for 2011.

The Mars Exploration Rovers (MERs), named Spirit and Opportunity, have served as interplanetary geologists; their onboard instrumentation includes panoramic cameras, numerous spectrometers, magnets, and microscopic imaging tools. The rovers, as their names imply, have also traversed significant distances for small unmanned vehicles; Spirit traveled more than 7.7 kilometers before going silent in March 2010, and Opportunity continues to travel, having logged almost 35 kilometers since it landed. Opportunity has made some significant scientific finds, including evidence that the impact that created the Endeavour crater released heated, underground water that deposited zinc in rock at the site. NASA counts the find among its important discoveries about wet environments on ancient Mars that may have been favorable for supporting microbial life. The Endeavour site is about 21 kilometers from the rover’s previous location, Victoria crater, a distance that took the rover almost three years to traverse.

This mobility, however, comes at a cost.

“Communications bandwidth has not grown as fast as rover traverse range,” NASA computer scientists explained in a paper presented at the 10th International Symposium on Artificial Intelligence, Robotics and Automation in Space in Sapporo, Japan, in 2010. “As this trend in increased mobility continues, the quantity of data that can be returned to Earth per meter traversed is reduced. Thus, much of the terrain the rover visits on a long traverse may never be observed or examined by scientists.”


Autonomy Introduced

To meet the scientific goals of the MER missions, engineers and scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, built a science platform called the Onboard Autonomous Science Investigation System (OASIS), which enables the rovers to autonomously perform image and data analysis, planning, execution, and interaction with robotic control without real-time human direction.

“We have a hard time getting all the data down quickly,” says Tara Estlin, a senior member of the JPL artificial intelligence group that developed OASIS. “Communication bandwidth is a restricted and precious resource and sometimes it can be several days, if not longer, for data from a certain area to come down. It could come down after the rover has left an area, maybe days later.”

AEGIS, which is a subset of OASIS, was uploaded to Opportunity in December 2009. AEGIS uses onboard data-analysis techniques to seek out scientist-defined high-quality targets with no human in the loop. Prior to AEGIS, images were transmitted from the rover to the operations team on Earth; scientists manually analyzed the images, selected geological targets for the rover’s remote-sensing instruments, and generated a command sequence to execute the new measurements.

The new approach, Estlin says, is a boon to the overall goals of the MER mission “because we can’t have the rover stay in every area long enough to look around and take images of every spot. So this gives Opportunity the ability to do reasoning onboard about what’s interesting.”

In a seven-step process, AEGIS locates likely targets of high scientific value (typically rocks) in a wide-field image, prioritizes them, and then further analyzes properties of the prioritized rocks, such as brightness, shape, and size, using the rover’s narrow field-of-view instruments. The target parameters are sent to the rover by the craft’s sequencing team at JPL, based on information supplied in advance by the MER science team.

“During a day when we’re going to be planning a drive, the scientists fill out the request for AEGIS to be run,” Estlin says. The scientists typically specify at what point in the drive it should run, what area around the rover it should examine, and what would make an interesting target.

“The way they do that is they have a number of different ways to specify a rock’s properties,” she says, “everything from the size of the rock to the shape, reflectance, or the brightness of the image, and they can choose one or two parameters to emphasize. That information goes up with the sequence along with everything else that is going to be done that day or the next several days, and everything is done automatically.”
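
To make the parameter-weighting idea concrete, the following is a minimal sketch in C (the language AEGIS was written in) of how such a weighted target ranking might look. The struct fields, weights, and values are hypothetical illustrations, not JPL’s actual parameters or code.

    #include <stdio.h>

    /* Hypothetical properties of one candidate rock, normalized to 0..1. */
    typedef struct {
        double size;        /* apparent size in the image    */
        double shape;       /* roundness measure             */
        double reflectance; /* mean brightness of the target */
    } Target;

    /* Hypothetical scientist-supplied emphasis weights, uplinked with
       the day's command sequence. */
    typedef struct {
        double w_size, w_shape, w_reflectance;
    } Weights;

    /* Score a candidate as a weighted sum of its properties. */
    static double score_target(const Target *t, const Weights *w)
    {
        return w->w_size * t->size
             + w->w_shape * t->shape
             + w->w_reflectance * t->reflectance;
    }

    int main(void)
    {
        /* Candidates the rock detector found in a wide-field image. */
        Target candidates[] = {
            { 0.8, 0.6, 0.3 },
            { 0.4, 0.9, 0.7 },
            { 0.5, 0.5, 0.9 },
        };
        /* Today the scientists emphasize large, dark rocks. */
        Weights w = { 1.0, 0.2, -0.5 };

        int n = (int)(sizeof candidates / sizeof candidates[0]);
        int best = 0;
        for (int i = 1; i < n; i++)
            if (score_target(&candidates[i], &w) >
                score_target(&candidates[best], &w))
                best = i;

        printf("best target: #%d (score %.2f)\n",
               best, score_target(&candidates[best], &w));
        return 0;
    }

The highest-ranked candidate would then be handed to the step that points the rover’s narrow field-of-view instruments.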

Thus far, Estlin says, AEGIS has been used to collect targeted, 13-color-filter panoramic camera images of a number of different terrain features, including rock outcrops, crater ejecta, boulders, and cobbles. These color images capture physical, mineralogic, and photometric properties of Mars surface materials and have contributed to determining the geologic and aqueous history of Mars.


OASIS enables the Mars rovers to autonomously perform image and data analysis, planning, execution, and interaction with robotic control without real-time human direction.


AEGIS is based on the principles of the Canny edge detector algorithm, which finds likely edges through a process of calculating intensity gradients in an image; essentially, a Canny-derived detector employs hysteresis thresholding, in which pixels whose gradient response exceeds a high threshold seed an edge that is then extended through neighboring pixels whose response clears a lower threshold.

The AEGIS rock-detection algorithm is called Rockster, and the JPL team conserved computational resources by employing Canny techniques such as image smoothing as a preprocessing step, which reduces the total number of edge elements detected.
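
As a toy illustration of the hysteresis step described above, the following C sketch applies two thresholds to a precomputed grid of gradient magnitudes: strong responses seed edges, and weaker neighboring responses extend them. The grid values and thresholds are invented, and a real Canny-style detector would first smooth the image and compute the gradients itself.

    #include <stdio.h>

    #define W 8
    #define H 6
    #define T_HIGH 100 /* strong response: seeds an edge                 */
    #define T_LOW   40 /* weak response: kept only if it touches an edge */

    /* Invented gradient magnitudes, as if computed from a smoothed image. */
    static int grad[H][W] = {
        {  5,  10,  12,   8,   6,   4,   3,  2 },
        {  6,  45, 120, 110,  50,   9,   5,  3 },
        {  4,  50, 130,  60,  55,  48,   7,  4 },
        {  3,  12,  42,  41,  44, 115,  46,  5 },
        {  2,   8,  10,  11,  39,  47,  43,  6 },
        {  1,   4,   6,   7,   9,  12,  10,  4 },
    };
    static int edge[H][W];

    /* Extend an edge through 8-connected neighbors above the low threshold. */
    static void grow(int y, int x)
    {
        if (y < 0 || y >= H || x < 0 || x >= W) return;
        if (edge[y][x] || grad[y][x] < T_LOW) return;
        edge[y][x] = 1;
        for (int dy = -1; dy <= 1; dy++)
            for (int dx = -1; dx <= 1; dx++)
                grow(y + dy, x + dx);
    }

    int main(void)
    {
        /* Strong pixels seed edges; hysteresis grows them into weak pixels. */
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                if (grad[y][x] >= T_HIGH)
                    grow(y, x);

        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++)
                putchar(edge[y][x] ? '#' : '.');
            putchar('\n');
        }
        return 0;
    }

Raising the low threshold, or smoothing the image more aggressively beforehand, prunes weak edge elements, which is the computational saving described above.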

Benjamin Bornstein, the project’s software development lead, says he has been very satisfied with the trade-offs the team accepted in order to both deliver useful data and conserve resources. The team reported only a small number of false positives, among them cable ties on the rover deck that had extended into the field of view, and some spurious detections during an experiment in which AEGIS was allowed to consider targets smaller than 25 pixels.

“We have the ability to adjust some of the parameters of the algorithm that indicate how aggressive it will be at trying to find rocks,” Bornstein says. “We’ve been very happy so far with our particular implementation and the choice of parameters we typically run.”


Best Practices Confirmed

While AEGIS’s function marks a pronounced shift in the way interplanetary science will be conducted, Estlin and Bornstein say there were no revolutionary methodologies or technologies used in its creation. In fact, they say, the limitations of space platforms actually compelled them to “think old.”

“Processors are typically several generations behind what you may find on your desktop,” Estlin says. “Things that might take a few seconds to run on your desktop could easily take minutes to hours on a flight computer.”

MER’s processor, for instance, is a 25-megahertz RAD6000, a radiation-hardened version of IBM’s RISC Single Chip CPU, paired with 128 megabytes of RAM and 256 megabytes of flash memory, “several orders of magnitude slower than what you might expect,” Estlin says.

AEGIS also has a four-megabyte cap on available RAM, and because it often processes images of more than one megabyte each, the developers had to employ various conservation techniques, such as bit packing and representing data as bit images. The AEGIS software was written in C, which Bornstein says proved ideal for the operations with which his team was charged.
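
As a small illustration of the kind of conservation technique the developers describe, here is a hypothetical C sketch of bit packing a binary edge mask at one bit per pixel rather than one byte; the dimensions and helper names are invented, not AEGIS’s actual representation.

    #include <stdint.h>
    #include <stdio.h>

    #define W 1024
    #define H 1024

    /* One bit per pixel: 128 KB for a 1,024 x 1,024 mask, versus 1 MB
       at one byte per pixel, a big difference under a 4 MB RAM cap. */
    static uint8_t bits[(W * H + 7) / 8];

    static void set_pixel(int y, int x, int on)
    {
        size_t i = (size_t)y * W + (size_t)x;
        if (on) bits[i >> 3] |=  (uint8_t)(1u << (i & 7));
        else    bits[i >> 3] &= (uint8_t)~(1u << (i & 7));
    }

    static int get_pixel(int y, int x)
    {
        size_t i = (size_t)y * W + (size_t)x;
        return (bits[i >> 3] >> (i & 7)) & 1;
    }

    int main(void)
    {
        set_pixel(10, 20, 1);
        printf("pixel (10,20) = %d\n", get_pixel(10, 20));
        printf("mask storage: %zu bytes for %d pixels\n",
               sizeof bits, W * H);
        return 0;
    }

Techniques like this trade a few extra shift-and-mask operations for an eightfold reduction in memory, a sensible bargain on a memory-starved flight computer.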

“With languages like Java, or especially C++,” Bornstein says, “abstractions can be convenient, but there are a lot of implicit operations that happen, such as when copy constructors are invoked, or destructors or assignment operators, or if you have any sort of operator overloading. Those implicit sort of function calls, unless you’re an absolute expert in the code base and know exactly how everything was designed, can actually create real problems when reasoning about a piece of code. Whether you’re looking at it with an automated tool or with human eyes, trying to determine exactly what a particular line of code is doing, we want to keep things as simple as possible.”

The AEGIS software also underwent painstaking testing, including 348 unit tests, automated regression testing, and extensive run-throughs in JPL’s onsite “Mars yard,” a simulated Martian landscape, on rover hardware. Bornstein says the code was also examined line-by-line by members of the AEGIS team, JPL machine vision and artificial intelligence experts who were not on the team, and JPL experts familiar with the other code onboard the MERs with which the AEGIS software would interface. However, there was no novel or unique testing regimen simply because the software was destined for use millions of miles from Earth.

“I wish we could say there was a nice little reinvention story here, but there’s no panacea,” Bornstein says. “You layer good practice on top of good practice, each layer adding insurance and catching problems along the way. We had very standard development practices, or at least what I would hope would be standard development practices.”


AEGIS is based on the principles of the Canny edge detector algorithm, which finds likely edges through a process of calculating intensity gradients in an image.


In fact, while it may seem counterintuitive that developing vision software for space applications can be simpler than for terrestrial platforms, JPL computer vision supervisor Larry Matthies says there are sound reasons for it.

“In many ways, the terrain is less complex,” Matthies says. “You have no vegetation or water; you have basically desert. You have some dust storms, but where we’re operating there is basically no weather. You have effectively no shadows, because the only thing casting shadows is the rover and we can arrange things so shadows aren’t a problem. And the rovers are moving pretty slowly, so even with very limited computation power, we can get by with simpler algorithms.”

Yet Matthies says the research on various components of the Mars missions has yielded benefits for Earth-bound projects, such as breakthroughs in stereo vision, visual odometry to help autonomous vehicles cope with slippage over uneven terrain, and hazard detection and obstacle avoidance during landing.

For example, he says, through 1990, stereo vision was considered computationally expensive and error prone. As a result, researchers focused on algorithms that computed 3D information only at high-contrast places in images, such as corners of objects or distinctive edges. “Unfortunately, this produced pretty sparse 3D information, and it wasn’t adequate for Mars,” Matthies says. In 1990 at JPL, he created a much faster algorithm that reliably produced 3D information at many more points in the image; it is still being used on the Curiosity rover.

While a graduate student at Carnegie Mellon University in 1986, Matthies also developed a new algorithm that is now being used to improve navigation. “It could estimate where a robot moved, much more accurately than previously, by using onboard stereo cameras to track distinctive points in the environment, which essentially served as local landmarks,” he says. “The heart of this innovation was a better understanding of statistical measurement errors. This class of algorithm is now called visual odometry.”
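
The following toy C sketch conveys the visual-odometry idea under a drastic simplification: tracked landmark points are assumed to shift opposite the robot’s motion, and a pure 2D translation is estimated as their mean displacement. Real systems, including those descended from Matthies’s work, estimate full 3D motion from stereo imagery and explicitly model measurement error; all values here are invented.

    #include <stdio.h>

    #define N 4

    int main(void)
    {
        /* Landmark positions in the previous and current frames (made up). */
        double prev[N][2] = { {1.0, 2.0}, {4.0, 1.5}, {3.0, 5.0}, {6.0, 4.0} };
        double curr[N][2] = { {0.6, 1.4}, {3.5, 0.9}, {2.6, 4.4}, {5.5, 3.5} };

        double dx = 0.0, dy = 0.0;
        for (int i = 0; i < N; i++) {
            /* Landmarks appear to move opposite to the robot's motion,
               so the per-landmark shift prev - curr votes for the motion. */
            dx += prev[i][0] - curr[i][0];
            dy += prev[i][1] - curr[i][1];
        }
        dx /= N; dy /= N; /* least-squares estimate for pure translation */

        printf("estimated robot motion: dx=%.2f dy=%.2f\n", dx, dy);
        return 0;
    }

Averaging over many tracked points is what suppresses individual measurement errors, the statistical insight Matthies describes.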

Estlin says AEGIS will be uploaded to the newest rover, Curiosity, which landed on Mars in early August 2012, during the new rover’s first year of operation. Although Curiosity’s capabilities exceed those of its predecessors, Bornstein says AEGIS will have to share the new hardware.

“There are a lot of other things running and consuming resources,” he says. “There will be an improvement, but maybe not as dramatic as we would like it to be.”


Further Reading

Canny, J.
A computational approach to edge detection, IEEE Transactions on Pattern Analysis and Machine Intelligence 8, 6, June 1986.

Castano, R., et al.
OASIS: Onboard autonomous science investigation system for opportunistic rover science, Journal of Field Robotics 24, 5, May 2007.

CMU Robotics Institute
AEGIS Automated Targeting for the Mars Exploration Rover Mission, http://www.youtube.com/watch?v=X9ortg6NTiU, Nov. 15, 2010.

Estlin, T.A., et al.
AEGIS automated targeting for the MER Opportunity rover, 10th International Symposium on Artificial Intelligence, Robotics, and Automation in Space, Sapporo, Japan, Aug. 29–Sept. 1, 2010.

Matthies, L., et al.
Computer vision on Mars, International Journal of Computer Vision 75, 1, Oct. 2007.


Figures

UF1 Figure. NASA’s ill-fated Spirit used its front hazard-avoidance camera to record this forward view of its arm and surroundings during the rover’s 2,052nd day on Mars (Oct. 11, 2009).

