
Amazon Web Services demonstrated data processing in orbit in 2022, using a payload on D-Orbit’s ION satellite carrier, shown in this artist’s conception. (D-Orbit Illustration)

Artificial intelligence and machine learning are turning into requirements for space operations, and Amazon Web Services is optimizing its products to reflect that view, according to the former Air Force major general who’s now in charge of AWS’ aerospace initiatives.

“AI, ML, generative AI have become table stakes for our future on-orbit systems and capabilities,” Clint Crosier, director of aerospace and satellite solutions at AWS, said today during Booz Allen Hamilton’s annual Space + AI Summit. “We have reached the limit of human capacity to digest petabytes and petabytes of data in real time and make any sort of intelligent decisions about them. We’ve culminated, so we must further embrace AI, ML and generative AI capabilities for the future.”

Crosier and other speakers at the summit, conducted at the headquarters of the Air & Space Forces Association in Virginia, pointed to the rapidly rising number of satellites in low Earth orbit as a major factor behind the need for more sophisticated AI tools. Over the past decade, that number has risen from about 1,300 to more than 10,000.

Merely keeping track of all those satellites is a challenging task — and it’s just as challenging to send all that data down to Earth for processing.

Enhancing the onboard capabilities of the satellites themselves — in effect, moving edge computing into orbit — is one of the strategies favored by AWS. In 2022, AWS and its commercial partners successfully tested a system that processed satellite image data in space. “We reduced overall bandwidth requirements by 42% while achieving 100% mission accomplishment,” Crosier said.
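The software behind the 2022 AWS/D-Orbit experiment isn't described in detail here, but the bandwidth savings come from a familiar pattern: score imagery on board and downlink only what is useful. The sketch below is a minimal, hypothetical illustration of that idea; the tile format, the crude "cloud" score standing in for an onboard ML model, and the threshold are all assumptions, not the actual payload software.

```python
# Hypothetical sketch of on-orbit edge filtering: score image tiles on board
# and queue only the useful ones for downlink. Names, thresholds, and the
# stand-in "model" are illustrative, not the actual AWS/D-Orbit payload code.
from dataclasses import dataclass
from typing import Iterable, List

import numpy as np


@dataclass
class Tile:
    tile_id: str
    pixels: np.ndarray  # e.g. a 256x256 image chip


def cloud_fraction(tile: Tile) -> float:
    """Stand-in for an onboard ML model: estimate how much of the tile is cloud."""
    # Bright pixels as a crude proxy for cloud cover (illustrative only).
    return float((tile.pixels > 200).mean())


def select_for_downlink(tiles: Iterable[Tile], max_cloud: float = 0.4) -> List[str]:
    """Return the IDs of tiles worth sending to the ground station."""
    return [t.tile_id for t in tiles if cloud_fraction(t) <= max_cloud]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tiles = [Tile(f"tile-{i}", rng.integers(0, 256, size=(256, 256))) for i in range(10)]
    keep = select_for_downlink(tiles)
    print(f"Downlinking {len(keep)} of {len(tiles)} tiles")
```

Dropping rejected tiles on board, rather than on the ground, is what translates into the kind of bandwidth reduction Crosier describes.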

Clint Crosier is director of aerospace and satellite solutions at Amazon Web Services. (Amazon Photo)

Crosier talked up the idea of turning cutting-edge innovations into commercial off-the-shelf components, or COTS. He said AWS is looking at ways to optimize its hardware to cope with the challenges of the space environment, including limited power availability and heightened exposure to radiation.

“We’re going to have to develop some purpose-built things optimized for allowing us to do advanced AI, ML and generative AI on orbit that may not exist today, but are going to be COTS tomorrow,” he said.

AWS is also working with NASA on a variety of projects aimed at harnessing the power of AI for space operations.

“NASA has already started porting many of their technical manuals into AWS’ generative AI capabilities, such that you can do a RAG chatbot right now in certain parts of NASA and say, ‘Give me all the specs on a human lander capability, and modify, you know, X or Y by mass or payload,’” Crosier said. “And the system will come back and provide you all that in recommendations.”
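The "RAG chatbot" Crosier describes follows the standard retrieval-augmented-generation pattern: embed manual passages, retrieve the ones most relevant to a question, and hand them to a generative model as context. The sketch below shows that pattern in miniature; the embedding stub, the placeholder passages, and the final LLM call are assumptions for illustration, not NASA's or AWS' actual system.

```python
# Minimal sketch of the retrieval-augmented-generation (RAG) pattern: embed
# manual passages, retrieve the most relevant ones for a question, and build
# a prompt for a generative model. Embedding and passages are stubs.
from typing import List, Tuple

import numpy as np

MANUAL_PASSAGES = [
    "Lander descent stage dry mass: ...",
    "Payload envelope and mass limits: ...",
    "Ascent propellant budget: ...",
]


def embed(text: str) -> np.ndarray:
    """Stand-in embedding; a real system would call an embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(384)
    return vec / np.linalg.norm(vec)


def retrieve(question: str, passages: List[str], k: int = 2) -> List[str]:
    """Return the k passages most similar to the question by cosine similarity."""
    q = embed(question)
    scored: List[Tuple[float, str]] = [(float(q @ embed(p)), p) for p in passages]
    scored.sort(reverse=True)
    return [p for _, p in scored[:k]]


def build_prompt(question: str, passages: List[str]) -> str:
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    question = "What are the mass limits for the human lander payload?"
    prompt = build_prompt(question, retrieve(question, MANUAL_PASSAGES))
    print(prompt)  # In a real system, this prompt would go to a hosted LLM.
```

Because the model answers only from retrieved manual text, engineers can trace its recommendations back to the source documents rather than trusting free-form generation.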

That capability could be a lifesaver for future missions to Mars, where communication challenges could make it difficult for astronauts to get real-time assistance from engineers back on Earth.

Crosier referred to the classic tale of Apollo 13 in 1970, when Mission Control scrambled to help the crew cope after an oxygen-tank explosion in space.


“Think about porting that into the future,” he said. “‘Houston, I have a problem’ becomes ‘Houston, I have a solution,’ because you’ve got this autonomous capability on the surface of Mars. Here’s all the in-situ resources I have. Here’s the storage and compute capability I have. Now go generate me three courses of action to solve the problem that I have. And gen-AI systems will bring back courses of action that will solve whatever challenges you’re facing.”

For years, NASA has been working on a similar project to harness AI for on-the-scene medical diagnosis in space.

“Sometimes somebody gets sick, for example, and depending on the orbit, it could take up to 40 minutes between you sending a signal from Mars until you get it back from Earth,” said Omar Hatamleh, who’s the chief AI officer at NASA’s Goddard Space Flight Center. “So we’re creating something called ‘Doctor in a Box’ as well. Imagine, if you have medical issues, you can have interactions with these, and these systems are trained specifically on medical domains.”
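For scale, a rough light-time calculation shows why the round-trip delay can stretch into tens of minutes. The distances below are approximate round numbers, not mission-specific figures.

```python
# Back-of-the-envelope check on the round-trip signal delay Hatamleh mentions.
# Earth-Mars distance varies from roughly 55 to 400 million km (approximate).
SPEED_OF_LIGHT_KM_S = 299_792

for label, distance_km in [("closest approach", 55e6), ("near maximum", 400e6)]:
    one_way_min = distance_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"{label}: one-way {one_way_min:.1f} min, round trip {2 * one_way_min:.1f} min")
```

At the far end of that range, a question sent to Earth and its answer can take the better part of an hour, which is why an onboard diagnostic system matters.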

Hatamleh said other types of AI systems could be trained as robotic construction workers — to build habitats on Mars before the humans arrive, or to extract the raw materials necessary to support them while they’re there.

He acknowledged that giving AI agents a bigger role in space exploration could raise challenges worthy of a science-fiction tale. For example, suppose a robot goes out on an expedition with two human astronauts, and both of the humans are injured in an accident. “Which one does the humanoid robot choose to come back to the base?” Hatamleh asked.

He noted that science-fiction writer Isaac Asimov came up with what he called the Three Laws of Robotics, the first of which declared that “a robot may not injure a human being or, through inaction, allow a human being to come to harm.”

“But what if we have on-the-edge systems? Humanoid robots will be surgeons on the surface of a distant planet. … The fact that it’s doing an incision on a person — that’s harming a person, and that goes completely against the laws of Asimov,” Hatamleh said. “So, even the most fundamental, basic laws that we abided by for a long time need to be re-evaluated, reassessed for the next evolution of these technological advances.”
