Google’s wild plan to put AI data centers in space


According to Ars Technica, Google has confirmed it’s working on Project Suncatcher, a moonshot initiative to deploy AI data centers in space using orbiting TPUs. The company plans to launch its first pair of prototype satellites with TPUs by early 2027 and is banking on launch costs dropping to as low as $200 per kilogram by the mid-2030s. Google’s research shows solar panels in orbit could be up to eight times more efficient than Earth-based systems, addressing the massive energy demands of AI compute. The satellites would operate in a dawn-dusk sun-synchronous low-Earth orbit to maximize sunlight exposure, and early testing has demonstrated bidirectional communication speeds up to 1.6 Tbps. Google is currently radiation-testing its latest v6e Cloud TPU (Trillium) using 67 MeV proton beams, finding the hardware can handle nearly 2 krad before data corruption occurs.


The physics problem nobody’s talking about

Here’s the thing about putting data centers in space: physics doesn’t care how cool your idea is. Google’s own research acknowledges that received power decreases with the square of distance, meaning these satellites would need to maintain formations within a kilometer of each other. That’s way tighter than anything we’ve ever deployed at scale. Starlink satellites typically operate hundreds of kilometers apart, not hundreds of meters.
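The inverse-square falloff is easy to make concrete. A minimal sketch (the 100 m reference distance is an illustrative assumption, not a figure from Google’s research):

```python
# Inverse-square law behind the tight formation requirement:
# received power scales as 1/d^2, so doubling the separation
# quarters the link power. Reference distance is hypothetical.

def received_power_ratio(d_meters: float, d_ref_meters: float = 100.0) -> float:
    """Power at distance d, relative to power at the reference distance."""
    return (d_ref_meters / d_meters) ** 2

# Drifting from 100 m out to 1 km cuts received power by a factor of 100.
print(received_power_ratio(1000.0))  # ~0.01
```

This is why a satellite that drifts even modestly out of formation sees its link budget collapse far faster than intuition suggests.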

And let’s talk about that “modest station-keeping” they mention. Maintaining precise formations in orbit isn’t like parking cars – you’re dealing with orbital mechanics, atmospheric drag, and the reality that every correction burn uses precious fuel. What happens when one satellite drifts out of position and can’t maintain that terabit-speed link? Suddenly your distributed AI compute cluster becomes… less distributed.
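The fuel cost of that station-keeping can be sketched with the Tsiolkovsky rocket equation. The delta-v budget and thruster specific impulse below are illustrative assumptions for a small LEO satellite with an ion thruster, not numbers from the article:

```python
import math

# Rough station-keeping propellant sketch via the rocket equation.
# delta-v and Isp are assumed values, purely for illustration.

def propellant_fraction(delta_v: float, isp_s: float, g0: float = 9.80665) -> float:
    """Fraction of wet mass that must be propellant for a given delta-v."""
    return 1.0 - math.exp(-delta_v / (isp_s * g0))

# Assume ~25 m/s per year of drag makeup over 5 years, ion thruster Isp ~1600 s.
frac = propellant_fraction(delta_v=125.0, isp_s=1600.0)
print(f"{frac:.3%}")  # well under 1% of wet mass
```

The fraction itself is small with electric propulsion; the harder problem is that every correction takes time and thrust, and a terabit optical link doesn’t wait for a satellite to crawl back into position.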

When your TPUs get cosmic sunburn

Google says their TPUs need to survive at least five years in space, which works out to about 750 rad of radiation exposure. Their testing shows they can handle nearly 2 krad before data corruption. That sounds impressive until you remember space radiation isn’t consistent – solar flares can deliver years’ worth of radiation in hours.
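The back-of-envelope arithmetic on that margin, using the figures from Google’s own testing:

```python
# Dose margin implied by the numbers above.
mission_dose_rad = 750.0   # five-year survival requirement
threshold_rad = 2000.0     # "nearly 2 krad" before data corruption

annual_dose = mission_dose_rad / 5          # nominal yearly dose
margin = threshold_rad / mission_dose_rad   # headroom over the mission dose

print(annual_dose, round(margin, 2))  # 150.0 2.67
```

A ~2.7x margin sounds comfortable against a steady 150 rad/year, but a single large solar particle event can blow through years of nominal dose at once, which is exactly the scenario averages hide.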

They’re testing with proton beams, but space throws everything at you: cosmic rays, heavy ions, electrons. And their approach of using off-the-shelf terrestrial hardware? That’s either incredibly bold or borderline reckless. Sure, the Mars helicopter used consumer chips, but it also had the entire NASA engineering team babysitting it. And Google is planning to deploy thousands of these things?

The launch cost fairy tale

They’re banking on launch costs dropping to $200/kg by the mid-2030s. That’s the kind of optimistic projection that makes venture capitalists swoon, but SpaceX’s current Falcon 9 costs are around $2,700/kg for dedicated launches. Even Starship would need to achieve unprecedented reusability and flight rates to hit those numbers.
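Just to put the gap in plain arithmetic (the per-kg figures are from the article; the one-tonne satellite mass is a hypothetical illustration):

```python
# How far launch costs must fall to hit the projection.
current_per_kg = 2700.0   # approximate Falcon 9 dedicated-launch cost
target_per_kg = 200.0     # Google's mid-2030s assumption
sat_mass_kg = 1000.0      # assumed one-tonne satellite, illustrative only

print(current_per_kg / target_per_kg)                   # 13.5x reduction required
print((current_per_kg - target_per_kg) * sat_mass_kg)   # $2.5M gap per satellite
```

A 13.5x cost reduction isn’t impossible, but it’s the kind of number that has to be earned flight by flight, not assumed into a business plan.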

And here’s what nobody mentions: when your satellite fails after three years instead of five, you’re not just losing the hardware – you’re paying to launch the replacement too. Terrestrial data centers might be “dirty, noisy, and ravenous for power” as Google says, but at least you can drive a truck to them when something breaks.
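The early-failure penalty is simple amortization. All dollar figures here are hypothetical, chosen only to show the shape of the problem:

```python
# Cost per year of service when a satellite dies early.
# unit_cost is an assumed combined hardware + launch cost, illustrative only.
unit_cost = 5_000_000.0

print(unit_cost / 5)  # planned 5-year life: $1.0M per service-year
print(unit_cost / 3)  # failure at year 3: ~$1.67M per service-year
```

Every year shaved off the lifetime raises the effective cost of compute, and in orbit there's no partial fix: the whole unit, plus its launch, gets replaced.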

Why Google’s even considering this

Look, the energy math is compelling. Eight times more solar efficiency? Constant sunlight? That solves real problems for an industry that’s hitting power constraints everywhere. And communities are increasingly rejecting new data centers over water usage and noise concerns.

But this feels like Google doing what Google does best: thinking so far outside the box that they forget the box exists. The technical challenges are immense, the economics are speculative, and the timeline feels… optimistic. Still, you can see why they’re exploring it in their research division. If anyone has the resources to try this crazy stuff, it’s them.

Basically, Project Suncatcher is either the future of AI infrastructure or the most expensive science experiment ever conducted. Given Google’s track record with moonshots, it could go either way. But one thing’s for sure – if they pull this off, they’ll have solved AI’s energy problem by literally leaving Earth’s problems behind.
