Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition: Data centers in space are feasible, but not ready for launch…Accenture links promotions to AI logins…AI pioneer Fei-Fei Li’s startup World Labs raises $1 billion…Nvidia’s deal with Meta signals a new era in computing power.
The AI industry is on a power trip—literally—and it’s getting desperate. Data centers already account for roughly 4% of U.S. electricity use, a share expected to more than double by 2030 as running and training AI models increasingly require gigawatts of power. Analysts project global data-center power demand could rise by as much as 165% by the end of the decade, even as new generation and transmission infrastructure lags years behind need. In response, hyperscalers are scrambling—cutting deals to build their own gas plants, exploring small nuclear reactors, and searching for power wherever they can find it.
Against that backdrop, it’s not surprising that some of the industry’s biggest players are starting to look to outer space for a solution.
In a feature story published this morning, I dig into how—even as tech companies are on track to spend more than $5 trillion globally on Earth-based AI data centers by the end of the decade—Elon Musk is arguing the future of AI computing power lies in space, powered by solar energy. Musk has suggested that the economics and engineering could align within just a few years, even predicting that more AI computing capacity could be in orbit than on Earth within five.
The idea of orbital data centers itself isn’t new. As far back as 2015, Fortune was already asking the question: What if we put servers in space?
What’s changed is the urgency. Today’s power crunch has pushed the concept back into serious conversation, with startups like Starcloud getting attention and Big Tech leaders like former Google CEO Eric Schmidt, Alphabet CEO Sundar Pichai, and Amazon’s Jeff Bezos all turning their attention to the possibilities of launching data centers into orbit.
However, while Musk and other bulls argue that space-based AI computing could become cost-effective relatively quickly, many experts say anything approaching meaningful scale remains decades away. Constraints around power generation, heat dissipation, launch logistics, and cost still make it impractical—and for now, the overwhelming share of AI investment continues to flow into terrestrial infrastructure. Small-scale pilots of orbital computing may be feasible in the next few years, they argue, but space remains a poor substitute for Earth-based data centers for the foreseeable future.
It’s not hard to understand the appeal, though: In talking with sources for this story, I found that the idea of data centers in space is no longer science fiction—the physics mostly check out. “We know how to launch rockets; we know how to put spacecraft into orbit; and we know how to build solar arrays to generate power,” Jeff Thornburg, a SpaceX veteran who led development of the company’s Raptor engine, told me. “And companies like SpaceX are showing we can mass-produce space vehicles at lower cost.”
The problem is that everything else, from building massive solar arrays to lowering launch costs, moves far more slowly than today’s AI hype cycle. Still, Thornburg said that in the long run, the energy pressures driving interest in orbital data centers are unlikely to disappear. “Engineers will find ways to make this work,” he said. “Long term, it’s just a matter of how long is it going to take us.”
With that, here’s more AI news.
Sharon Goldman [email protected] @sharongoldman