Orbital plans 2027 test for AI data centres in orbit
Tue, 14th Apr 2026
Orbital has set 2027 for the first test mission of its planned AI data centres in low Earth orbit, backed by funding from a16z Speedrun.
The Los Angeles-based start-up is building satellite-based computing infrastructure that would use solar power for electricity and radiate heat into space for cooling. It is also opening an R&D site in Los Angeles, called Factory-1, as it prepares for the mission, known as Orbital-1.
The plan targets a constraint many technology groups now face as demand for AI computing rises: access to electricity and cooling for data centres on Earth. Orbital argues that operating servers in orbit could ease pressure on terrestrial grids by drawing energy from solar arrays attached to satellites.
Its first satellite is due to launch on a SpaceX Falcon 9. The initial aim is to test whether graphics processing units can operate continuously in orbit, assess radiation hardening and, after validation, support commercial AI inference workloads.
Orbital was founded by Euwyn Poon, who previously founded micromobility company Spin, later acquired by Ford. He has described energy supply as a central obstacle to further expansion in AI computing.
Rather than trying to move all forms of AI processing into space, Orbital is concentrating on inference. That is the stage at which trained models respond to user prompts or other requests, and its workload can be spread across many separate nodes more easily than training, which relies on tightly linked systems to build large models.
Each satellite in its planned constellation would contain a cluster of NVIDIA-powered servers. Orbital says a sun-synchronous orbit would provide near-constant solar exposure, avoiding interruptions from weather and reducing reliance on grid power.
Funding push
Orbital did not disclose the size of the investment from a16z Speedrun. Even so, the backing gives the company support from an investor known for early-stage bets on technology founders tackling difficult infrastructure problems.
Andrew Chen, General Partner at a16z Speedrun, described the investment as support for a technically ambitious project. "Speedrun backs founders to explore ambitious ideas - the harder the problem, the better," he said. "Orbital is taking on AI's biggest constraint with a bold and radical idea."
Orbital is also filing with the US Federal Communications Commission for approval to deploy a constellation of satellites dedicated to AI computing infrastructure. That will be a necessary step if it is to move beyond a single demonstration mission and establish a broader network in orbit.
Space compute
Interest in alternative data centre designs has grown as AI groups seek new sources of power, lower operating costs and better ways to manage heat. Most of that effort has focused on terrestrial solutions, such as dedicated power generation, locating facilities near energy-rich regions or redesigning cooling systems. Orbital is taking a more unusual route by shifting part of the infrastructure off the planet.
The technical challenge is considerable. Hardware in space must withstand radiation, launch stress and remote operation while maintaining stable performance over time. Communications latency, launch costs and satellite maintenance also remain practical issues for any commercial service based in orbit.
Orbital argues that not all AI workloads face the same constraints. Large-scale model training typically depends on dense clusters of GPUs communicating with very low latency, making it difficult to distribute those systems across many satellites. Inference, by contrast, can be divided into discrete jobs handled independently, making a constellation model more plausible.
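The distinction can be sketched with a toy dispatcher (the names and structure here are illustrative, not a description of Orbital's actual system): because each inference request runs to completion on a single node with no mid-job communication, requests can simply be farmed out to whichever satellite node is free.

```python
from concurrent.futures import ThreadPoolExecutor

def run_inference(node: str, prompt: str) -> str:
    # Stand-in for a trained model answering one request on one node.
    return f"{node}: answer to {prompt!r}"

def dispatch(prompts: list[str], nodes: list[str]) -> list[str]:
    # Round-robin each prompt to a node; jobs complete independently,
    # with no node-to-node synchronisation required while they run.
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        futures = [
            pool.submit(run_inference, nodes[i % len(nodes)], p)
            for i, p in enumerate(prompts)
        ]
        return [f.result() for f in futures]

results = dispatch(["q1", "q2", "q3"], ["sat-a", "sat-b"])
```

Training offers no such decomposition: every step updates a shared model, so the nodes must exchange data constantly, which is why low-latency dense clusters are hard to replace with a constellation.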
Poon said the limits on AI expansion are already visible in energy supply. "AI progress is being constrained by the grid," he said. "Data center economics are dominated by electricity and cooling, and both are getting harder. In orbit, solar power is continuous and cooling is fundamentally different. Orbital is building compute infrastructure that removes the energy ceiling and scales with AI's potential."
He reiterated the point when discussing the broader rationale for the company's strategy. "The energy ceiling on AI isn't theoretical, it's a real constraint that will impede the advancement of intelligence," Poon said. "This is the solution."