Nvidia Wants to Build AI Data Centers in Space: Space-1 Vera Rubin Unveiled at GTC
AI infrastructure leaves Earth
At Nvidia's GTC 2026 conference, CEO Jensen Huang announced something that previously existed only in science fiction: Vera Rubin Space-1, a computing module designed for orbital data centers. Offering up to 25 times more AI compute than the H100 chip, Space-1 is intended to take AI inference into orbit around Earth.
Technical challenges in space
Huang acknowledged the challenges openly: in vacuum there is no convective or conductive path to the environment – waste heat can only be rejected by radiation. Nvidia is working with partners to solve the thermal management problem.
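The radiation-only constraint can be quantified with the Stefan-Boltzmann law: a radiator in vacuum rejects net power P = εσA(T_rad⁴ − T_sink⁴). A minimal sketch of radiator sizing follows; the power draw, emissivity, and temperatures are illustrative assumptions, not Nvidia figures:

```python
# Rough radiator sizing for an orbital compute module (illustrative numbers only).
# Stefan-Boltzmann: net radiated power P = emissivity * sigma * A * (T_rad^4 - T_sink^4)

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_watts, emissivity=0.9, t_radiator_k=320.0, t_sink_k=200.0):
    """Radiator area needed to reject `heat_watts` by thermal radiation alone."""
    net_flux = emissivity * SIGMA * (t_radiator_k**4 - t_sink_k**4)  # W per m^2
    return heat_watts / net_flux

# Hypothetical 100 kW module: on the order of a couple hundred square meters
area = radiator_area_m2(100_000)
print(f"{area:.0f} m^2 of radiator")  # prints "221 m^2 of radiator"
```

Even under these generous assumptions, megawatt-class AI clusters would need radiator surfaces measured in thousands of square meters, which is why Huang singled out thermal management as the hard problem.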
Why space?
Orbital data centers offer unique advantages:
- Latency: Processing close to satellites and space-based sensors
- Independence: No national jurisdiction over infrastructure
- Scaling: Unlimited expansion without geographical constraints on the ground
- Defense sector: Direct support for space-based intelligence and surveillance
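The latency advantage comes down to light travel time: processing next to the sensor avoids the round trip through a ground station. A back-of-the-envelope estimate, with assumed distances (the ~550 km LEO altitude and 100 km inter-satellite range are illustrative, not figures from the announcement):

```python
# Lower-bound propagation latency at the speed of light (ignores queuing,
# processing, and the fact that real paths are longer than straight lines).

C_KM_PER_S = 299_792.458  # speed of light in vacuum

def one_way_latency_ms(distance_km):
    """Minimum one-way propagation delay over a straight-line path."""
    return distance_km / C_KM_PER_S * 1000

# Hypothetical LEO satellite at ~550 km altitude, ground station directly below:
ground_rtt_ms = 2 * one_way_latency_ms(550)   # ~3.7 ms minimum round trip
# vs. an orbital compute node ~100 km away in the same constellation:
orbital_rtt_ms = 2 * one_way_latency_ms(100)  # ~0.7 ms minimum round trip
```

The physical floor is small in both cases; the larger real-world gains come from skipping downlink scheduling and contention, which this sketch deliberately ignores.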
From GPU maker to AI infrastructure company
Space-1 illustrates Nvidia's transformation – from selling GPUs to defining the future of AI infrastructure across Earth and space. Jensen Huang stated at GTC: "We're working with our partners on a new computer that will go out to space and start data centers out there."
Implications for CIOs
For now this is mostly strategic signaling, but it shows the direction AI infrastructure is heading. Future cloud platforms could include space-based nodes. CIOs should monitor developments – particularly in defense, aerospace, and latency-sensitive industries.
📬 Did you like this?
AI news for leaders. Curated by a CIO who builds it himself. Daily in your inbox.