
It’s been more than 50 years since humankind last set foot on the Moon, and the idea of ferrying humans to Mars still seems out of reach. Yet listen to Silicon Valley executives and they’ll tell you their aspirations go beyond landing on far-flung planets and celestial bodies. Their latest frontier? They want to build data centres there to power the AI revolution.
Amazon’s Jeff Bezos has said that he believes “colossal data centres” will be built in space within the next decade or two — and he favours a space-based approach because it can take advantage of endless solar power, 24 hours a day, seven days a week. “There are no clouds and no rain, no weather,” he said.

Elon Musk’s company SpaceX has grand plans for space data centres
He’s far from alone. Elon Musk plans to deploy data centres in space through his SpaceX company. He also said that space could supply 100TWh (terawatt-hours) of electricity a year, or around one-third of all the UK’s annual demand, through “a lunar base producing solar-powered AI satellites locally”.
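Musk’s comparison is easy to sanity-check with rough numbers. A minimal sketch, assuming UK annual electricity demand of roughly 300TWh (an illustrative ballpark, not a figure from anyone quoted here):

```python
# Sanity check of the "one-third of the UK's annual demand" comparison.
claimed_supply_twh = 100       # Musk's claimed space-based supply, TWh/year
uk_annual_demand_twh = 300     # assumed UK demand, TWh/year (illustrative ballpark)

share = claimed_supply_twh / uk_annual_demand_twh
print(f"{share:.0%} of assumed UK annual electricity demand")  # → 33%
```

On that assumption, 100TWh does come out at about a third of UK demand, so the claim is at least internally consistent.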
And it isn’t just the tech bros: the European Union advocated for data centres in space as early as 2022.
Right now, “data centre in space” is as much a branding exercise as an engineering plan. Most concepts imagine a cluster of satellites or a modular station in high orbit, packed with chips and solar panels, crunching numbers far from prying eyes.
Others assume you would simply harvest power in space and beam it back to Earth, keeping the servers grounded. Each approach raises different headaches, from launch costs and orbital debris to who is legally responsible if something goes wrong above another country’s territory.
Satisfying power-hungry AI
And yet, it is important to solve the puzzle of meeting the massive power demand from the AI revolution. Energy demand from data centres is set to more than double, reaching up to 945TWh by 2030, according to the International Energy Agency.
Building the infrastructure to meet that demand on Earth will be tricky, and comes hand in hand with troubling human and environmental pressures. According to Structure Research’s 2025 ESG report, water consumption in data centres has risen by 9.6 per cent over five years, mainly driven by AI-intensive liquid cooling.

CGI image of Fermi’s “Project Matador” energy and data centre development in Texas, back on Earth
“In theory, space offers near-limitless solar power and a cooling environment free from Earth’s water and land constraints,” says Stuart Farmer, co-founder and head of sales at Mercury Power, which supports the data centre industry. But that’s in theory. In practice, “we’re still a long way from making it practical or sustainable”, he explains.
The focus on out-of-this-world innovation may well be misguided, reckons Farmer, and could scupper developments closer to home. “There’s a lot of room to innovate here on Earth,” he says. That could include improving power efficiency, rethinking cooling systems and reducing the carbon footprint of existing data centres.
“That’s where we can make a real impact now,” he adds.
Operators are experimenting with putting big server farms next to wind and solar arrays, or in cold climates where outside air does most of the cooling. Some are piping waste heat into district heating networks or swimming pools, turning a liability into a selling point. Others are rewriting software so that the most power-hungry AI training jobs run when electricity is cheapest and cleanest, rather than simply when tech companies want to develop their newest models.
“Now” doesn’t really mean now when it comes to data centres in space. “There are so many complications in there, I don’t think we’re anywhere near this becoming a reality this decade,” says Matt Hawkins, founder of CUDO Compute, whose company provides the core infrastructure behind GPUs, the computer chips that power AI models. Look just a little longer term, however, and there are real possibilities. “I think in the 2030s, definitely this can happen,” he adds.
Bringing that to reality would take a lot of work, and systems would need to be fine-tuned to cope with the distances involved. “Most AI work is latency sensitive,” says Camden Woollven, an AI subject matter expert at GRC Solutions. Latency here means the time between a user doing something and the technology responding. It falls as the distance data must travel between devices and data centres shrinks; it balloons once you consider the journey from Earth to the nearest celestial body.

“If the model has to wait an extra second for a round trip, the whole thing drags,” says Woollven. “Even the cleanest satellite link adds a delay you cannot ignore, never mind the bandwidth you would need to push training data up there in the first place. So the idea that we will shift mainstream AI workloads off-planet any time soon is wishful at best.”
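The physics behind Woollven’s point is simple: even at the speed of light, the round trip puts a hard floor under latency. A rough sketch, using assumed distances for common orbits (illustrative figures, not ones from anyone quoted here):

```python
# Back-of-the-envelope minimum signal delays for off-planet data centres.
# Distances are illustrative assumptions; real links add routing and processing time.
C = 299_792_458  # speed of light in a vacuum, m/s

ORBITS_KM = {
    "low Earth orbit (~550 km)": 550,
    "geostationary orbit": 35_786,
    "the Moon": 384_400,
}

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip light-travel delay in milliseconds."""
    return 2 * distance_km * 1_000 / C * 1_000

for name, km in ORBITS_KM.items():
    print(f"{name}: {round_trip_ms(km):.1f} ms minimum round trip")
```

Low Earth orbit adds only a few milliseconds, but a lunar data centre would impose a round trip of over two and a half seconds before any processing even begins — exactly the “extra second for a round trip” problem Woollven describes, several times over.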
Farmer points out that latency is just the start of the list of issues that would need to be resolved in order to make space-based data centres feasible. “Launching infrastructure into orbit has an enormous carbon cost, and once it’s up there, systems must survive radiation, micro-meteorites and thermal extremes,” he says. “We still don’t have clear answers on how the design, power storage, maintenance or lifecycle impact would be managed in this environment.”
Nevertheless, the idea that this is being considered at all speaks to the shortage of supply here on Earth — something that needs to be corrected quickly if the AI revolution is to continue apace.

