
AI: NASA’s Curiosity rover can now choose its own laser targets on Mars


Who’s calling the shots now? After nearly four years on the job, NASA’s Curiosity rover is finally making certain scientific decisions on its own. The Martian explorer now picks some of the rock targets to blast with the laser on its ChemCam instrument.

A software upgrade known as AEGIS, short for Autonomous Exploration for Gathering Increased Science, allows the rover to make key decisions when the Martian day falls out of sync with the workday of Curiosity’s handlers at NASA’s Jet Propulsion Laboratory, delivering more data in less time. It’s the first time a robot has been able to choose such science targets autonomously on any planetary mission.

“Time on Mars is valuable and we get more data this way and we get the data much faster,” said AEGIS team member Raymond Francis, a scientific applications software engineer at JPL.


This is not AEGIS’s first rodeo: The software was also used by NASA’s earlier rover Opportunity for a slightly different purpose (picking which targets to photograph with a narrow-angle camera).

Other types of “artificial intelligence” have also been installed on various satellites and rovers for practical purposes. One type lets Curiosity navigate its own way across a rocky, debris-filled landscape rather than having to follow step-by-step directions from the ground. Another type is used by a variety of spacecraft to manage their onboard systems autonomously.

But Curiosity’s new abilities are, in at least one way, unprecedented.

“This is a new kind of thing in some ways because it’s science autonomy where it’s making decisions actually about science measurements, and not just about navigation or housekeeping,” Francis said.

This is a significant change, because the Martian day and the Earth day often drift out of sync, so Curiosity has to wait many hours for scientists and engineers to wake up, read its latest results and issue new instructions. That waiting eats up valuable time the rover could spend doing actual science.

The new software also lets the rover point the laser at very small targets that a human operator would struggle to hit accurately from the control room at JPL.

“AEGIS can be used to ensure that you hit that small feature, and that can save you a whole day of trying again,” Francis explained.


But is arming a laser-shooting space robot with artificial intelligence a good idea? Don’t worry, Francis said. Although this might qualify as AI, it is still comparatively rudimentary and extremely limited in scope: essentially, it is very good at identifying rocks based on visual cues, and not much more.
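For a rough sense of what “picking a target by visual cues” can mean, here is a minimal, purely illustrative sketch in Python. It is not NASA’s AEGIS code: the synthetic camera frame, the brightness threshold and the scoring weights are all invented for illustration. The idea is simply to group bright pixels into candidate “rocks,” score each candidate by size and brightness, and aim at the best one.

```python
# Toy sketch of visual-cue target selection, loosely inspired by the idea
# behind AEGIS. This is NOT NASA's code: the image, threshold and scoring
# weights below are all invented for illustration.

def find_bright_regions(image, threshold):
    """Group neighboring above-threshold pixels into candidate 'rocks'."""
    rows, cols = len(image), len(image[0])
    seen = set()
    regions = []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or image[r][c] < threshold:
                continue
            # Flood fill to collect one connected bright region.
            stack, pixels = [(r, c)], []
            seen.add((r, c))
            while stack:
                y, x = stack.pop()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and (ny, nx) not in seen
                            and image[ny][nx] >= threshold):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            regions.append(pixels)
    return regions


def score(region, image):
    """Bigger, brighter regions score higher -- a stand-in for real cues."""
    area = len(region)
    mean_brightness = sum(image[y][x] for y, x in region) / area
    return 2.0 * area + mean_brightness  # invented weights


def pick_target(image, threshold=150):
    """Return the centroid of the best-scoring candidate, or None."""
    regions = find_bright_regions(image, threshold)
    if not regions:
        return None
    best = max(regions, key=lambda reg: score(reg, image))
    cy = sum(y for y, _ in best) / len(best)
    cx = sum(x for _, x in best) / len(best)
    return (cy, cx)


# A tiny synthetic "camera frame": two bright blobs on a dark background.
frame = [
    [10,  10,  10, 10,  10, 10],
    [10, 200, 210, 10,  10, 10],
    [10, 205, 220, 10, 180, 10],
    [10,  10,  10, 10, 185, 10],
]
print(pick_target(frame))  # -> centroid of the larger, brighter blob
```

The real system works on actual camera images and lets scientists adjust the criteria used to rank candidate targets, but the overall shape of the problem is the same: find candidates, score them, aim at the winner.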

“That rover has a maximum speed of a few centimeters per second, and [ChemCam’s laser] has a maximum range of seven meters, and Earth is much farther than that away,” Francis said with a chuckle.

The same kind of artificial intelligence is proving useful on Earth as well as in space, as engineers try to build better rescue robots that can keep working even when communications go down and they can’t receive instructions from their human handlers. At the DARPA Robotics Challenge in Southern California last year, many teams programmed their robots with a degree of autonomy for certain tasks, because a robot that can make its own decisions is crucial in an emergency, when time is of the essence.

In some ways, Curiosity’s hardware is beginning to show its age — its wheels, for example, are pocked with holes from some rough rides across the Martian terrain. But even as it gets older, the new AEGIS upgrade shows that the rover can keep growing wiser.

amina.khan@latimes.com

