
Autonomous spacecraft? Baby steps

Robots exploring on their own and self-piloting spacecraft are a long way off, says NASA computer scientist


Robot and astronaut meeting in space illustration
February 20, 2020

Robotic rovers trundle across the Martian highlands, eyeing the terrain and stopping to scoop up promising samples. A spacecraft powers across the cosmos while the crew sleeps. On the moon, a habitat’s brain automatically notices CO2 levels are off and adjusts them.

It’s the stuff of science fiction. And it’s going to stay that way for a while, said a NASA computer scientist working on putting artificial intelligence into space.

NASA has taken a few tiny steps in deploying AI, but robots exploring on their own and self-piloting spacecraft are a long way off, said Jeremy Frank, group lead of the planning and scheduling group in the Intelligent Systems Division at NASA Ames Research Center. He spoke to an Arizona State University audience Tuesday in a talk sponsored by the School of Computing, Informatics, and Decision Systems Engineering AI Group, in the Ira A. Fulton Schools of Engineering.

A few small subsystems have been tested in flight, including water-purity testing machines, a warning system on the Orion crew capsule and a system that manages the laptops aboard the International Space Station, which are critical pieces of equipment.

(Fun fact: 180,000 pieces of information come down from the ISS every day, not counting payload data, according to Frank.)

Frank works on the development of automated planning and scheduling systems for use in space mission operations; the integration of technologies for planning, plan execution, and fault detection for space applications; and the development of technology to enable astronauts to autonomously operate spacecraft.

The Artemis program — getting humans to the moon by 2024 — is a stepping stone to Mars, and it’s going to use many of the same technologies.

That technology will require human interfaces and integration with flight software.

“If we’re going to make our spacecraft autonomous, that’s what we’re going to have to have,” Frank said.

He predicted that will involve a combination of high performance computing and machine learning. “We’re not going to have our spacecraft learn” in flight, he said. (There’s also no way a flagship mission is going to go without ground communication, he added.)

How far has NASA come? 

A warning system was tested on the first flight of the Orion crew capsule in 2014. A test project called the Advanced Caution and Warning System monitored the health of Orion’s critical vehicle systems using live data transmitted to the ground during the flight test.

The system was designed to monitor the mission from launch through splashdown, displaying information about failures and their effects. The team demonstrated future Mission Control Center and onboard displays that maximized awareness of what was happening during a failure. The system determined the cause of each failure and identified the components it affected, providing a comprehensive view of the spacecraft’s health. It also assisted operators with “what-if” queries that identify the next-worst failures, helping operations teams prepare for the most critical system issues.
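To make that idea concrete, here is a purely illustrative sketch, not NASA’s actual ACAWS software: it uses invented telemetry limits and an invented component dependency table to flag out-of-limits readings and to answer a simple “what-if” query about which components a given failure would affect.

```python
# Illustrative only: a toy caution-and-warning monitor, not NASA's ACAWS.
# Telemetry limits and the component dependency table are made up for this example.

TELEMETRY_LIMITS = {
    "cabin_co2_ppm": (0, 5000),
    "bus_voltage_v": (26.0, 32.0),
    "pump_temp_c": (-10.0, 60.0),
}

# Which components depend on which (hypothetical).
DEPENDENTS = {
    "power_bus": ["coolant_pump", "cabin_fan"],
    "coolant_pump": ["avionics_cooling"],
}

def check_limits(sample: dict) -> list[str]:
    """Return a warning for any telemetry value outside its limits."""
    warnings = []
    for channel, value in sample.items():
        lo, hi = TELEMETRY_LIMITS[channel]
        if not (lo <= value <= hi):
            warnings.append(f"{channel}={value} outside [{lo}, {hi}]")
    return warnings

def what_if(failed: str) -> list[str]:
    """List every component affected, directly or indirectly, if `failed` goes down."""
    affected, stack = [], [failed]
    while stack:
        component = stack.pop()
        for dependent in DEPENDENTS.get(component, []):
            if dependent not in affected:
                affected.append(dependent)
                stack.append(dependent)
    return affected

if __name__ == "__main__":
    print(check_limits({"cabin_co2_ppm": 6200, "bus_voltage_v": 28.1, "pump_temp_c": 41.0}))
    print(what_if("power_bus"))  # -> ['coolant_pump', 'cabin_fan', 'avionics_cooling']
```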

Future human space missions will put crews far from Earth. The one-way light-time delay to the moon is 1.2 seconds, enough to make continuous control from Earth difficult to impossible. The same delay to Mars ranges from 3 minutes to 22 minutes, depending on the planets’ relative positions, making real-time communication with the ground far harder still.
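Those figures are easy to check: the one-way delay is just distance divided by the speed of light. The short calculation below uses approximate distances (about 384,400 km to the moon on average, and roughly 55 million to 400 million km to Mars, depending on orbital geometry); at the moon’s average distance the delay works out to about 1.3 seconds.

```python
# One-way light-time delay: distance / speed of light.
# Distances are approximate; the Mars figures vary with orbital geometry.
SPEED_OF_LIGHT_KM_S = 299_792

DISTANCES_KM = {
    "Moon (average)": 384_400,
    "Mars (closest)": 54_600_000,
    "Mars (farthest)": 401_000_000,
}

for body, distance_km in DISTANCES_KM.items():
    delay_s = distance_km / SPEED_OF_LIGHT_KM_S
    print(f"{body}: {delay_s:.1f} s ({delay_s / 60:.1f} min one way)")
# Moon: ~1.3 s; Mars: roughly 3 to 22 minutes one way, matching the figures above.
```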

“These missions will require changing the capabilities of spacecraft, the roles and responsibilities of ground and crew, and the ways that ground and crew interact during the mission,” a NASA press release said.

“These conversations put into perspective how much work we have to do,” Frank said.

Top illustration: Alex Davis, Media Relations and Strategic Communications
