Alexa’s voice-command software is getting tweaked for spaceflight, thanks to a joint passion project of Amazon, Cisco, and Lockheed Martin.
Dubbed Callisto, the Lockheed-built device is designed to help improve life in space. The goal is for future astronauts to one day speak commands to onboard computers and video conference with folks on Earth from deep space. It combines flight data from NASA’s upcoming Artemis I uncrewed launch with Cisco’s Webex video conferencing system and Amazon’s AI-driven virtual assistant and intercom technology. Engineers from these companies will test the first stage of this technology when the mission lifts off August 29 from Kennedy Space Center for a 42-day voyage around the moon and back.
“All of this is aimed at improving the astronauts’ quality of life and making them more productive,” says Brian Jones, Lockheed Martin’s chief engineer for Callisto. “They’re probably some of the most scheduled people on Earth–or in this case, beyond Earth. They have to schedule their spare time, so anything that makes them more efficient [helps].”
Those with Alexa in their homes can follow the mission by saying, “Alexa, take me to the moon,” and then asking about the flight, such as the spacecraft’s cabin temperature or how far it is from the moon. Meanwhile, Callisto engineers are still pinching themselves that a fun side project–one developed during their spare time at work–is actually going to space.
“It’s a fantasy project for me,” says Clement Chung, the applied science manager for Amazon Alexa AI. “I love Star Trek and this is exactly what you would hear Captain Kirk speaking to the computer. When I heard about it, I was like, ‘Hey, yes! Sign me up for this!’”
Challenges of space
The three-year-old Callisto project began when NASA was scouting for novel industry-funded technology ideas with future applications whose onboard demonstrations could engage the public. The concept of a voice-activated computer interface intrigued Jones enough to gather a handful of engineers from the three companies to see if it would work. Since then, hundreds of people have contributed to the project.
Callisto uses the same technology that lets Alexa respond despite losing Wi-Fi connectivity at home or cell reception while driving. But space offers a whole other set of challenges, from cabin acoustics and vibrations to reduced bandwidth and up to a 12-second delay relaying information from the moon to Earth and back.
For immediate flight data, Alexa would need to interface directly with the computer of the Orion spacecraft, which will eventually carry astronauts. But because NASA did not want Callisto sending commands to the vehicle on this flight, the engineers had to translate the raw flight-monitoring data from Orion’s software architecture into understandable information and displays, shown alongside the Alexa and Webex software on an iPad. That also required prioritizing which of Orion’s 120,000 flight-data points–e.g., spacecraft orientation, water supply levels, cabin temperature–were most helpful to the crew.
The engineers surveyed astronauts about which voice commands they would actually want. “Nobody ever wants to fire the thrusters using a voice assistant. That’s a mission-critical activity,” says Jones, dispelling unnerving comparisons to HAL, 2001: A Space Odyssey’s rogue computer.
Often, astronaut requests were relatively mundane. “There have been times they’re taking notes and the pen or pad will just float away and they won’t know,” says Alexa senior UX designer Justin Nikolaus. “So it’s having a hands-free voice experience to take notes or set a timer whenever they need it, and fading away when they don’t.”
Less pressing communication, like teleconferencing with loved ones or getting sports scores, would relay through the Deep Space Network (DSN), an international array of giant radio antennas that transmit data between Mission Control and spacecraft traveling to the moon and other planets. For that, engineers had to figure out a way to encode the audio and video so the data could traverse the DSN’s narrow bandwidth and Callisto could still play it back at high fidelity. That’s a tall order considering an uplink on par with a dial-up modem (roughly 2 Kbps) and a downlink equivalent to first-generation broadband (270 Kbps). For comparison, the International Space Station, just 250 miles from Earth, has a 600 Mbps connection.
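To get a feel for how narrow that pipe is, here is a back-of-envelope sketch. The bit rates are the figures quoted above; the 1 MB clip size is an illustrative assumption (a heavily compressed short video), and the math ignores latency and protocol overhead.

```python
# Back-of-envelope transfer times over the quoted link speeds.
# Rates come from the article; the clip size is an illustrative assumption.

UPLINK_BPS = 2_000         # ~2 Kbps DSN uplink (dial-up territory)
DOWNLINK_BPS = 270_000     # ~270 Kbps DSN downlink (first-gen broadband)
ISS_BPS = 600_000_000      # 600 Mbps ISS connection, for comparison

def transfer_seconds(size_bytes: int, rate_bps: int) -> float:
    """Ideal transfer time, ignoring latency and protocol overhead."""
    return size_bytes * 8 / rate_bps

clip = 1_000_000  # hypothetical 1 MB compressed clip
print(f"DSN downlink: {transfer_seconds(clip, DOWNLINK_BPS):.0f} s")  # ~30 s
print(f"DSN uplink:   {transfer_seconds(clip, UPLINK_BPS):.0f} s")    # ~4000 s, over an hour
print(f"ISS link:     {transfer_seconds(clip, ISS_BPS):.3f} s")       # ~0.013 s
```

The same megabyte that the ISS link moves in a hundredth of a second takes the DSN downlink half a minute, and the uplink over an hour, which is why aggressive audio and video encoding was essential.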
“That was a balancing act for us to get both of those into that same small pipe and meet everybody’s quality requirements,” says Jones.
What is Artemis?
The upcoming test flight is the first of NASA’s Artemis missions, done in partnership with international space agencies and private industry, which aim to return humans to the moon (including the first woman and first person of color) as early as 2025. Farther-reaching efforts involve building a lunar-orbiting space station and a surface base camp to further scientific study, economic opportunity, and STEM careers, and to prepare for an eventual trip to Mars.
Artemis I will test-drive the massive Space Launch System (SLS)–the world’s tallest and most powerful rocket at 30 stories and 8.8 million pounds of thrust–and the Orion capsule, whose journey folks can follow on Twitter. Orion will separate from the SLS and travel 280,000 miles from Earth, farther than any spacecraft designed for humans, for a total journey of 1.3 million miles. It will orbit the moon, coming within 62 miles of its surface and extending 38,000 miles past its far side. During October’s reentry, Orion will hit the Earth’s atmosphere at 25,000 mph and heat to 5,000 degrees Fahrenheit–faster and hotter than other returning spacecraft–before the atmosphere, maneuvering jets, and 11 parachutes slow it to a 20-mph splashdown off the coast of San Diego.
Callisto (named for a nymph who served the Greek goddess Artemis) is one of 10 onboard science payloads, which will record images and data such as lunar ice and hydrogen levels and magnetic field strength in space. Suited mannequin torsos with simulated tissue will gauge g-force and radiation impact on the human body, a top challenge for deep space explorers.
The onboard experiment
Callisto will also include two monitoring cameras and lights so the ground crew can view the inflight test. Engineers will engage the payload from an operations suite at the Johnson Space Center Mission Control in Houston that’s equipped with servers, a massive video wall, and microphones. Those will enable them to see and hear the apparatus in action as their faces appear on the iPad and Alexa’s familiar feminine voice responds to their commands. Although data will travel to and from the craft through the DSN, a cloud-connected ground device will also enable external guests and STEM classrooms to connect with Callisto.
If all goes well, the engineers hope to continue tailoring Callisto’s spaceflight capabilities and accuracy on subsequent crewed missions. Those improvements might find their way into established Alexa and Webex systems and new applications for use on ships and in extreme locations without broadband.
But first, they have to clear this initial demonstration. “They’ve been talking about having some watch parties at Cisco,” says Cisco engineer Nathan Buckles. “I tell them, ‘You can invite me, but you have to realize I’m gonna cry like a baby when this thing happens for the first time.’”