Google’s custom-designed AI chips, Tensor Processing Units (TPUs), are the engine behind its AI success on Earth. Now the company is planning to send them into orbit as the “brains” of its “Project Suncatcher” space-based datacenters.
The company’s new “moonshot” research envisions “compact constellations of solar-powered satellites, carrying Google TPUs.” These chips are optimized for the specific kinds of calculations needed for training and running AI models, making them ideal for this new generation of orbital computing.
This move signifies Google’s intent to create a dedicated, high-performance AI network in space, not just a general-purpose computing platform. By using its own proprietary hardware, Google can optimize the entire system, from the solar panels to the processing units, for maximum efficiency.
This strategy contrasts with its competitors’ approaches. Nvidia, primarily a chip seller, is partnering with Starcloud to launch its AI chips into space. Google, by contrast, is acting as its own chip provider, systems integrator, and (eventually) service operator, much as it does on Earth with Google Cloud.
The 2027 prototype launch will be a critical test for these TPUs in a new, harsh environment. Engineers will need to work out how to cool the high-performance chips in a vacuum and shield them from radiation, all while the satellites stay interconnected by laser-based optical links.
