The race to build data centres in space
In January 2026, SpaceX filed with US regulators for approval to launch up to 1 million satellites designed to operate as solar-powered AI data centres in orbit. A few months later, Reuters reported that SpaceX itself had warned prospective investors that these space-based AI data centre ambitions rely on unproven technologies and may never become commercially viable.
Orbital data centres are clearly no longer science fiction, but they are still a speculative bet rather than an inevitable future. That bet is being driven by a very real problem on Earth.
McKinsey estimates that by 2030, data centres will require $6.7 trillion in capital expenditure to keep pace with demand for compute power, with most of that spending tied directly to AI workloads. Global data centre demand could almost triple by 2030, and about 70% of that demand is expected to come from AI.
The scale of the terrestrial build-out is already enormous. Time reported that the world had roughly 8,000 data centres in 2021 and about 12,000 five years later. The US remains the clear leader, with more than 5,400 facilities, while Germany and the UK are among Europe’s largest markets. The world’s largest data centre, in China, currently covers 10.7 million sq ft, and there are plans for a site almost double that size in the US. On top of their physical footprint, these sites consume huge amounts of electricity and water: some of the largest facilities can use up to 5 million gallons of water per day for cooling.
But despite the cost and environmental impact, it’s energy that is becoming the biggest constraint. A World Economic Forum article citing McKinsey noted that US data centres could account for up to 12% of total electricity use by 2030. In the UK, grid access has become a bottleneck in its own right: government and parliamentary sources now refer to waits of up to 15 years for some projects to secure a connection.
That is why some companies are looking upward. The idea behind orbital data centres is simple enough: if power, land, cooling, and local grid congestion are constraining terrestrial build-outs, then space offers a radically different infrastructure model. Research and Markets, cited by Time, projects that off-planet storage and processing could become a market worth roughly $39.9 billion by 2035. That does not mean space-based compute will win, but it does explain why venture-backed companies and major tech groups are taking the idea seriously.
Who is actually building toward it?
SpaceX is the loudest name in the conversation. Its FCC filing described a plan to place AI compute in orbit using near-constant solar power, with the economics depending heavily on Starship reaching a much higher launch cadence and much lower cost base.
The Y Combinator- and NVIDIA-backed unicorn Starcloud is further along in physical execution. The US-based startup launched a prototype satellite in November 2025, and its long-term vision is far more ambitious: by the early 2030s, it imagines a four-kilometre by four-kilometre orbital structure made up of hundreds of connected satellites and large solar arrays. According to Starcloud’s CEO, one such facility could replace around 50 Earth-based data centres. That is still an aspiration, not an outcome, but it shows the scale of the thinking.
Google is also exploring the idea, though in a more research-led way. In a 2025 research blog, it laid out a “space-based, scalable AI infrastructure system design” and argued that launch economics may improve enough by the mid-2030s to make orbital compute cost-comparable with terrestrial alternatives on a per-kilowatt basis.
Why space looks attractive
The strongest argument for data centres in space is energy. Solar arrays can operate almost continuously in a ‘sun-synchronous orbit’ where there is near-constant sun exposure. The World Economic Forum, citing a Starcloud white paper, says orbital solar arrays could achieve a capacity factor of more than 95%, compared with a median of around 24% for terrestrial US solar farms.
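To make those capacity factors concrete, here is a minimal sketch of what they mean in annual energy terms. The 95% and 24% figures come from the sources cited above; the 1 MW array size is an assumption chosen purely for illustration.

```python
# Illustrative only: convert the quoted capacity factors into annual
# energy delivered by a 1 MW solar array. The 95% (orbital) and 24%
# (median US terrestrial) figures are from the article; the 1 MW
# nameplate capacity is an assumption for the example.
HOURS_PER_YEAR = 8760

def annual_energy_mwh(capacity_mw: float, capacity_factor: float) -> float:
    """Energy delivered per year given nameplate capacity and capacity factor."""
    return capacity_mw * capacity_factor * HOURS_PER_YEAR

orbital = annual_energy_mwh(1.0, 0.95)      # sun-synchronous orbit, ~95%
terrestrial = annual_energy_mwh(1.0, 0.24)  # median US solar farm, ~24%

print(f"Orbital:     {orbital:,.0f} MWh/year")      # ~8,322 MWh
print(f"Terrestrial: {terrestrial:,.0f} MWh/year")  # ~2,102 MWh
print(f"Ratio:       {orbital / terrestrial:.1f}x") # ~4.0x
```

On these assumptions, the same nameplate capacity delivers roughly four times as much energy per year in orbit, which is the core of the energy argument.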
Cooling is the second big draw. The same World Economic Forum analysis notes that the vacuum of space acts like an enormous cold heatsink, with an effective ambient temperature of around -270°C. In theory, that allows passive radiative cooling, reducing both energy demand and freshwater use. For an industry under scrutiny for power use and water consumption, that is a compelling proposition.
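The physics behind passive radiative cooling can be sketched with the Stefan-Boltzmann law. The roughly 3 K (-270°C) background temperature comes from the analysis cited above; the emissivity and radiator temperature below are illustrative assumptions, and real designs must also reject absorbed sunlight and Earth-shine, so this is a ceiling rather than a working number.

```python
# Back-of-envelope sketch of radiative heat rejection in vacuum using
# the Stefan-Boltzmann law. The ~3 K (-270 C) deep-space background is
# from the article; emissivity and radiator temperature are assumptions.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_radiated_w_per_m2(t_radiator_k: float, t_env_k: float = 3.0,
                          emissivity: float = 0.9) -> float:
    """Net radiative heat rejection per square metre of radiator surface."""
    return emissivity * SIGMA * (t_radiator_k**4 - t_env_k**4)

# A radiator running at ~300 K (around room temperature):
q = net_radiated_w_per_m2(300.0)
print(f"~{q:.0f} W rejected per m^2 of radiator")  # ~413 W/m^2
```

Rejecting megawatts of waste heat this way therefore requires thousands of square metres of radiator, which is one reason the proposed orbital structures are so large.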
There are also operational arguments. Space-based systems could eventually avoid terrestrial grid limits, process satellite and Earth-observation data in orbit before downlinking it, and offer a form of geographic resilience against potential future attacks on infrastructure that is harder to achieve in one fixed location on Earth. Google’s analysis adds one more important point: if launch costs fall below $200 per kilogram by the mid-2030s, orbital compute starts to look much more economically plausible.
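A rough sketch shows why the launch-cost threshold matters so much. The $200/kg figure is from Google's analysis cited above; the kilograms-of-satellite-per-kilowatt figure below is a hypothetical assumption for the example, not a published spec.

```python
# Purely illustrative: how launch cost per kilogram translates into
# launch cost per kilowatt of orbital capacity. The $200/kg threshold
# is from Google's analysis; the 10 kg/kW mass budget is a hypothetical
# assumption for this sketch.
def launch_cost_per_kw(cost_per_kg: float, kg_per_kw: float) -> float:
    """Launch cost attributable to each kilowatt of delivered capacity."""
    return cost_per_kg * kg_per_kw

ASSUMED_KG_PER_KW = 10.0  # hypothetical satellite mass per kW of compute + solar

for cost_per_kg in (1500, 500, 200):  # $/kg, falling over time
    per_kw = launch_cost_per_kw(cost_per_kg, ASSUMED_KG_PER_KW)
    print(f"${cost_per_kg}/kg -> ${per_kw:,.0f} launch cost per kW")
```

Because launch cost scales linearly with mass, every factor-of-two drop in $/kg halves the launch bill per kilowatt, which is why the whole business case hinges on cheaper rockets.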
Why this is still a very risky bet
The biggest obstacle remains launch cost. Google’s model depends on dramatic cost declines that have not yet materialised. SpaceX’s own strategy depends heavily on Starship delivering far cheaper and far more frequent launches than are achievable today. Starcloud, in turn, inherits both bets. In other words, the economics only work if another currently unproven piece of infrastructure works first.
There is also an environmental and governance argument against moving too quickly. Some argue that putting compute in orbit does not eliminate AI’s resource problem. It may simply shift it into a harder-to-regulate environment. Launch emissions, orbital congestion, and the limited lifespan of hardware all complicate the sustainability case. Orbital debris is already a serious issue: Space.com reports that nearly 130 million pieces of junk are currently circling the planet. Putting large compute constellations into the same crowded sun-synchronous bands of orbit raises obvious collision and resilience concerns.
That is all before you even get to maintenance. Terrestrial data centres are constantly upgraded, repaired, and retrofitted. Orbital hardware is far harder to service and, in many cases, may be effectively disposable after a few years. For any operator, that changes the asset model completely. This is one reason the sector still looks more like an experiment in future infrastructure than a direct replacement for ground-based facilities.
A serious idea, but not a settled one
The case for orbital data centres is strong enough that it will keep attracting capital, talent, and attention. Earth-based infrastructure is running into real constraints around power, water, land, and grid access, while AI demand continues to rise sharply. In that context, space-based compute has moved from being a fringe concept to a serious strategic option.
But it is still exactly that: an option. The companies pursuing it are making an aggressive infrastructure bet on cheaper launches, safer orbital operations, and demand that justifies building a second layer of compute above the planet. Some of those bets may pay off, but others may not. For now, the race to build data centres in space says less about certainty and more about the pressure AI has put on the systems we already have on the ground.
At Harmonic, we work with high-growth tech and AI companies as they build the finance and operational leadership needed to support this kind of scale. If you would like to discuss more you can reach Harman Dhillon at [email protected]