Mobile World Congress 2018: Hewlett Packard Enterprise on the International Space Station

NASA and HPE have partnered on a supercomputer that's gathering data for a potential Mars mission

NASA and Hewlett Packard Enterprise (HPE) are now six months into a one-year experiment on the International Space Station. There, two HPE servers comprising the “Spaceborne Computer” have been subjected to an unending battery of data and computational tasks, with none of the traditional space hardware “hardening” applied before launch. The immediate goal is to prove that such a system can handle immense stress through its software alone. The greater goal is to test and prepare equipment for a potential mission to Mars, itself a roughly year-long undertaking. HPE was on hand at Mobile World Congress to discuss the endeavor and outline its commercial impact, and did so inside a model spaceship.

“This is about enabling a really big supercomputer to run autonomously,” Ben Bennett, a high-performance computing engineer on the project, explains to us. “Failures will be dealt with. It will talk to the OS. Systems will continue to run,” he adds. All of this despite the fact that the machines have not been hardened. Bennett says, “Space hardware is ‘hardened’ and it’s ‘shielded’ and it costs a lot of money to do this, and it weighs a lot more than expected and ultimately it makes everything slower than modern technology.” So a lot hinges on these lighter machines. “The economics of space are really gruesome. It costs about a million dollars per pound of matter. You want to get as much performance per weight as possible.”
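
To make that trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. Only the roughly one-million-dollars-per-pound figure comes from Bennett; the server weights and performance numbers below are hypothetical, chosen purely to illustrate the performance-per-weight comparison he describes.

```python
# Rough launch-economics sketch. The $1M-per-pound figure is from the article;
# every other number here is a hypothetical placeholder for illustration.

COST_PER_POUND_USD = 1_000_000  # "about a million dollars per pound of matter"

def launch_cost(weight_lbs: float) -> float:
    """Approximate cost to put a payload of the given weight into orbit."""
    return weight_lbs * COST_PER_POUND_USD

def performance_per_pound(gflops: float, weight_lbs: float) -> float:
    """The figure of merit Bennett alludes to: compute delivered per pound launched."""
    return gflops / weight_lbs

# Hypothetical comparison: a lighter, unhardened commercial server versus a
# heavier, radiation-hardened build that delivers less compute.
commercial = {"weight_lbs": 40, "gflops": 1000}   # assumed numbers
hardened   = {"weight_lbs": 120, "gflops": 250}   # assumed numbers

for name, s in [("commercial", commercial), ("hardened", hardened)]:
    print(f"{name}: launch cost ~${launch_cost(s['weight_lbs']) / 1e6:.0f}M, "
          f"{performance_per_pound(s['gflops'], s['weight_lbs']):.1f} GFLOPS per pound")
```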

They are actually “two perfectly regular, off-the-shelf Xeon servers,” he continues. (The machines run Linux.) “There’s nothing special about them. There are 11 SSD storage spots in each. We put a piece of test equipment in the last SSD spot that’s a special passive detector, to bring back to Earth at the end and look at the radiation that’s struck it.” These two commercial machines were launched aboard a SpaceX Dragon mission back on 14 August. They occupy a double slot, or two NASA form-factor racks, giving them 500 watts of power and two inverters that feed redundant power supplies. There are also Ethernet switches and a cold water loop.

“We thrash the living daylights out of them with high-performance computing benchmarks,” Bennett says, diving into the tests. “We are running LINPACK, which is a matrix solver. We are running conjugate gradient solvers. We’ve got the NASA parallel benchmarks, which beat on processor and memory. We are doing big I/O tests to beat on the solid-state disks all the time. We are overloading them 24 by 7.” In lieu of hardware hardening, they call this software hardening. Machine learning is integral, as the servers themselves make adjustments to processor speed and memory refresh. They hope (and expect) that after a year, everything will continue to run.
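
For a flavor of the kind of kernel those benchmarks exercise, here is a minimal conjugate gradient solver in Python with NumPy. This is only a sketch of the numerical method Bennett names; the benchmarks actually running on the Spaceborne Computer are tuned HPC codes, not this toy version.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    n = b.shape[0]
    max_iter = max_iter or 10 * n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Stress-test style usage: a random symmetric positive-definite system.
n = 1000
M = np.random.rand(n, n)
A = M @ M.T + n * np.eye(n)   # symmetric positive-definite by construction
b = np.random.rand(n)
x = conjugate_gradient(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```

Solvers like this hammer the floating-point units and memory system in exactly the repetitive, sustained way needed to flush out hardware faults, which is why they double as stress tests.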

Here’s how it impacts a future Mars mission: “In the future you will need the fastest computing power possible inside the spaceship,” Bennett makes clear. “The farther you go from Earth, the harder it is to get information back and forth. It takes longer; it takes so much bandwidth.” This means complicated, corrective analysis will need to be done in a flash, on a low-weight machine. The Spaceborne Computer might just be it. Right now, it has achieved one teraFLOP, or over one trillion calculations per second. Considering it has circled the planet some 2,800 times in under 200 days, that’s quite a bit of math.
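
For scale, a quick Python sketch works through the figures quoted above. The orbit count and day count come straight from the article; treating the machine as sustaining a full teraFLOP around the clock is a simplifying assumption made only for illustration.

```python
# Quick arithmetic behind the figures quoted above.
TERAFLOP = 1e12            # one teraFLOP: about a trillion floating-point operations per second

orbits = 2_800             # orbits completed so far (from the article)
days = 200                 # "under 200 days" -- treated as an upper bound here

orbits_per_day = orbits / days                  # ~14 orbits per day
minutes_per_orbit = 24 * 60 / orbits_per_day    # ~103 minutes per orbit

# Assuming (hypothetically) a sustained teraFLOP over the whole period:
seconds = days * 24 * 3600
total_ops = TERAFLOP * seconds                  # ~1.7e19 floating-point operations

print(f"{orbits_per_day:.0f} orbits/day, ~{minutes_per_orbit:.0f} minutes per orbit")
print(f"~{total_ops:.1e} floating-point operations at a sustained teraFLOP")
```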

Second image by David Graver, all other images courtesy of NASA