This morning, Artemis 1 was supposed to launch on a mission to the moon, but the flight was postponed due to engine issues. If and when the massive Space Launch System does get off the ground, the uncrewed test mission will push the Artemis program forward for NASA, which aims to eventually return astronauts to the moon. Orion, the spacecraft on top of the SLS being used for the test, will have no humans aboard, but three companies—Lockheed Martin, Cisco, and Amazon—have teamed up to assemble a tech package that could give NASA additional info on what’s happening onboard.
The collaboration, code-named Callisto, has been four years in the making and was announced to the public earlier this year. The demo will test whether Alexa can be used to control certain spacecraft functions, like the lights, or fetch telemetry data hands-free, and whether Webex can establish a secure and stable connection for video-conferencing and more.
Here’s a peek at how the companies got their Star Trek-inspired tech space-ready.
A beta test in space
After Lockheed Martin pitched the idea to Cisco and the Alexa team in September 2018, both made a game plan and kicked off official work in 2019. From then on, the two teams came up with designs, worked through kinks in the hardware and software, and put on the finishing touches.
“The thing that I thought was interesting when I started working with Lockheed Martin was their approach to this project was very customer-focused,” says Justin Nikolaus, lead voice UX designer for Alexa. “Astronauts’ time in space is very expensive, it’s very scripted, and they wanted to make them as efficient as possible. They’re looking for different ways for astronauts to be able to interact with the vehicle. Voice is one of those mechanisms.”
The goal from the beginning was to make the lives of end users—the astronauts—easier, safer, and more enjoyable. Nikolaus refers to this current attempt as “probably a beta test of sorts.”
But adapting Earth tech for space came with unique challenges.
“At every major milestone, we started having more hardware. I know modeling, I know machine learning, but space is a brand new frontier for me,” says Clement Chung, the applied science manager at Alexa. “We could plan for and guess what it looks like, but as we get different hardwares and the spaceship to start testing, a lot of our assumptions go out the door.”
The acoustics of the space capsule, the spacecraft’s closed environment, and what astronauts would want to say, or how they would want to interact with Alexa, were all factors to consider in the initial design. Additionally, the team had to integrate Alexa and the Amazon cloud with NASA’s Deep Space Network (DSN) and add local processing capabilities to reduce latency.
To start, the team worked to adjust Alexa’s abilities and tweak it for a different audience in a different setting.
Using space sounds provided by Lockheed Martin, and taking into account the materials used onboard, the team retrained its models, since the signal-to-noise ratio onboard is different. Cisco, for its part, will use its background noise-canceling software to work around the same challenge.
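Neither company has detailed the retraining pipeline, but a standard way to adapt a speech model to a new acoustic environment is to mix recorded background noise into clean training audio at a target signal-to-noise ratio. Here is a minimal sketch of that augmentation step in Python with NumPy; the function and the per-clip SNR target are illustrative assumptions, not details from the Callisto project.

```python
import numpy as np

def mix_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Mix clean speech with background noise at a target SNR in decibels.

    Illustrative only: the real Callisto training pipeline is not public.
    """
    # Loop the noise clip if it's shorter than the speech, then trim to length.
    if len(noise) < len(speech):
        noise = np.tile(noise, int(np.ceil(len(speech) / len(noise))))
    noise = noise[: len(speech)]

    # Scale the noise so that 10 * log10(P_speech / P_noise) equals snr_db.
    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    scale = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    return speech + scale * noise

# Example: augment one training clip with capsule noise at a harsh 5 dB SNR.
rng = np.random.default_rng(0)
speech = rng.standard_normal(16_000)        # stand-in for 1 s of 16 kHz speech
capsule_noise = rng.standard_normal(8_000)  # stand-in for recorded cabin noise
augmented = mix_at_snr(speech, capsule_noise, snr_db=5.0)
```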
Alexa, the voice assistant, also had to absorb a lot of new knowledge. “Just from a content perspective, I interviewed former astronauts, flight controllers to understand what astronauts want to ask and how to deliver that information,” Nikolaus says. For instance, Alexa has access to 100,000 telemetry endpoints on Orion, from the science data it’s gathering on its mission, to the temperature and functions on different parts of the spacecraft, to where it is positioned and headed in its flight.
“I had to turn that technical data into an easily digestible sentence for Alexa to speak back to the crew members,” Nikolaus says. “There’s a lot of nuances for this environment and this specific user that we had to learn and research and make appropriate for the flight that’s going to happen.”
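Amazon hasn’t said exactly how those sentences are produced, but one common approach is template-based generation: each telemetry endpoint maps to a phrase with the live value slotted in. A hypothetical sketch, with invented endpoint names and readings (Orion’s real telemetry labels aren’t public):

```python
# Hypothetical telemetry readings; Orion's real endpoint names are not public.
telemetry = {
    "cabin_temperature_c": 22.4,
    "distance_from_earth_km": 312_450,
    "battery_charge_pct": 87,
}

# Template-based natural-language generation: one spoken sentence per endpoint.
templates = {
    "cabin_temperature_c": "The cabin temperature is {:.1f} degrees Celsius.",
    "distance_from_earth_km": "Orion is {:,.0f} kilometers from Earth.",
    "battery_charge_pct": "The batteries are at {:d} percent charge.",
}

def speak(endpoint: str) -> str:
    """Turn a raw telemetry value into a sentence Alexa could read aloud."""
    return templates[endpoint].format(telemetry[endpoint])

print(speak("distance_from_earth_km"))
# -> "Orion is 312,450 kilometers from Earth."
```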
Above the clouds, and beyond the cloud
Beyond this, the team had to work out the bigger infrastructure questions. The device had to be robust, since a repair team can’t easily service it in flight. “When you go out to space you need to consider radiation, you need to consider shock, vibrations, temperature control, the components that are in there,” says Philippe Lantin, principal solutions architect for Alexa voice service. “You really can’t have batteries, for example. I thought this was a great learning experience for us to design things in a certain way that may be more resilient.”
Once the device was constructed, they had to sort out how best to divide Alexa’s functions between the cloud and an onboard computer. “We had to figure out how we get the voice of the VIP that’s on the ground to the craft. So we had to create that technology from scratch,” says Lantin. A big focus for the project was on making a self-contained version of Alexa that doesn’t rely on the cloud. “There’s a very reduced bandwidth that we’re allowed to use for this project. The total amount of time between when you speak and when you hear a response could be 13 seconds.”
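Part of that 13-second budget is simply physics: at lunar distances, radio signals take seconds to make the round trip before any processing happens. A quick back-of-the-envelope calculation, using the average Earth-moon distance:

```python
# Back-of-the-envelope signal delay at lunar distance (values approximate).
SPEED_OF_LIGHT_KM_S = 299_792   # km/s
MOON_DISTANCE_KM = 384_400      # average Earth-moon distance, km

one_way_s = MOON_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_s
print(f"Round-trip signal time: {round_trip_s:.2f} s")  # ~2.56 s

# The rest of the quoted 13-second budget would presumably go to DSN
# scheduling, ground-side processing, and returning the spoken response,
# which is why a self-contained onboard Alexa matters for quick commands.
```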
Adjusting to the available bandwidth and latency was also a problem Cisco worked to overcome. In addition to video-conferencing, Cisco has cameras set up as its eyes around the spacecraft. “Right now we’re firing up megabits worth of video, and that’s just not a reality with deep space exploration where we rely on NASA’s deep space network,” says Jono Luk, VP of product management at Cisco.
Amazon engineers determined that Alexa’s usual cloud-based approach, in which the device reaches out to Amazon servers to generate a response, would not be practical in space. So they expanded on a feature called local voice control, found on certain Echo devices, which handles tasks like fetching the time and date, and smart home commands like turning lights on and off and changing their color.
While people on the ground are used to asking Alexa to turn on the lights, Alexa in space will do something similar: it can control the connected devices onboard Orion. There is a set of lights Alexa will be able to control, and Webex cameras will be used to confirm whether Alexa is actually turning them on and off.
But Alexa will not be entirely on its own in space. The last question the team is testing is whether Alexa can send certain requests back down to a secured cloud on Earth to be processed. That would come into play if astronauts wanted to ask for sports scores or keep up with whatever’s happening on Earth. “There’s some things that you just can’t know ahead of time,” says Nikolaus.
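The companies haven’t published the routing logic, but the behavior described, answering known onboard commands locally while deferring open-ended questions to the ground, maps onto a simple dispatch pattern. A hypothetical sketch; the intent names and the ground-link function are invented for illustration:

```python
# Hypothetical hybrid dispatcher: known intents are answered onboard;
# anything else is sent over the high-latency link to the ground cloud.
LOCAL_INTENTS = {
    "lights_on":  lambda: "Turning on the cabin lights.",
    "lights_off": lambda: "Turning off the cabin lights.",
    "get_time":   lambda: "The mission elapsed time is two days, four hours.",
}

def send_to_ground_cloud(utterance: str) -> str:
    """Placeholder for the round trip through the Deep Space Network;
    in reality this could take on the order of 13 seconds."""
    return f"(after a long pause) Here's what I found for: {utterance}"

def handle(intent: str, utterance: str) -> str:
    if intent in LOCAL_INTENTS:
        return LOCAL_INTENTS[intent]()       # fast path, no radio link needed
    return send_to_ground_cloud(utterance)   # slow path for open-ended asks

print(handle("lights_on", "turn on the lights"))
print(handle("web_query", "what's the score of the game?"))
```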
If the tech demo goes well on the uncrewed mission, both companies can imagine future applications where this tech could help human crews onboard, even if just with small things to start, like setting timers, schedules, and reminders.
Cisco is already testing features like letting ground teams annotate pictures to give an astronaut instructions for how to proceed with an experiment in space. Even more ambitiously, Luk imagines integrating more immersive experiences into future tests, such as augmented reality or holograms.