Artificial intelligence isn't just changing how we work and live—it's about to fundamentally reshape the global energy landscape. According to a recent report from Zerohedge, OpenAI executives estimate that deploying just 1 gigawatt of AI computing capacity costs around $50 billion. At the same time, Morgan Stanley is forecasting at least 65 GW of new data center demand by 2028. To put that in perspective, we're talking about more than half of California's peak energy consumption.
The warning is stark: without massive infrastructure investment, the U.S. grid might not be able to handle what's coming. The alternatives? Rolling blackouts, electricity prices jumping to three or four times current levels, or nuclear reactors scattered across the landscape.
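As a quick sanity check on those headline figures, here is a minimal Python sketch that simply multiplies them together. It assumes the roughly $50 billion-per-gigawatt estimate scales linearly to Morgan Stanley's 65 GW forecast, which is a simplification rather than a claim made by either source.

```python
# Back-of-envelope: what the two headline figures imply when combined.
# Assumes the ~$50 billion per gigawatt cost scales linearly to the full
# 65 GW forecast (an illustrative simplification, not a sourced figure).

COST_PER_GW_USD = 50e9        # ~$50 billion per GW of AI compute capacity
FORECAST_DEMAND_GW = 65       # Morgan Stanley's 2028 data center forecast

implied_capex = COST_PER_GW_USD * FORECAST_DEMAND_GW
print(f"Implied buildout cost: ${implied_capex / 1e12:.2f} trillion")
# Implied buildout cost: $3.25 trillion
```

Even under that crude assumption, the implied spend lands in the trillions of dollars, which is why the infrastructure question dominates the conversation.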
Nvidia's Escalating Power Requirements
Nvidia's GPU roadmap, as charted by BofA Global Research, makes the scale of the challenge clear. Each new chip generation dramatically increases how much power a single rack of servers needs. The H100 and H200 chips from the Hopper generation (2022–2024) drew about 700 watts per chip, with racks consuming roughly 35 kW.
The newer GB100 and GB200 Blackwell chips (2024–2025) push that up to 1,350 watts per chip and 162 kW per rack. But that's just the beginning. The upcoming Rubin chips in 2026 are expected to hit 1,800 watts per chip and 300 kW per rack. By 2027, Rubin Ultra could demand 3,600 watts per chip—driving individual racks to 600 kW, or more than half a megawatt. What once required 30–40 kilowatts per rack could soon need ten times that amount. When you multiply that across thousands of racks in massive data centers, the electrical strain becomes almost unimaginable.
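To make "multiply that across thousands of racks" concrete, the short sketch below scales the per-rack figures above to a single hypothetical facility. The 5,000-rack size is an assumption chosen purely for illustration; real data centers vary widely.

```python
# Per-rack power figures from the BofA-sourced roadmap above, scaled to a
# hypothetical 5,000-rack facility. The rack count is illustrative only.

rack_power_kw = {
    "Hopper (H100/H200, 2022-2024)": 35,
    "Blackwell (GB100/GB200, 2024-2025)": 162,
    "Rubin (2026)": 300,
    "Rubin Ultra (2027)": 600,
}

NUM_RACKS = 5_000  # assumed facility size, for illustration

for generation, kw in rack_power_kw.items():
    total_mw = kw * NUM_RACKS / 1_000  # convert kW to MW
    print(f"{generation}: {kw} kW/rack -> {total_mw:,.0f} MW across {NUM_RACKS:,} racks")

# Hopper (H100/H200, 2022-2024): 35 kW/rack -> 175 MW across 5,000 racks
# Rubin Ultra (2027): 600 kW/rack -> 3,000 MW across 5,000 racks
```

On these assumptions, a single Rubin Ultra-era facility of that size would draw on the order of 3 GW, roughly the output of a large nuclear plant, compared with well under 200 MW for the same rack count in the Hopper era.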
What It Takes to Keep Up
If Morgan Stanley's numbers hold, the U.S. needs to build out 65 GW of AI computing capacity in just a few years. That's not just about buying more chips—it means building new substations, upgrading transmission lines, and expanding grid capacity at a pace we've rarely seen. It also means a rapid expansion of renewable energy paired with large-scale battery storage to smooth out supply.
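To get a feel for what "large-scale" storage means here, the sketch below sizes the battery energy needed for a slice of that forecast load to ride through overnight hours on stored solar power. Both the load share and the duration are assumptions made for illustration, not figures from the article.

```python
# Illustrative storage sizing: battery energy needed for part of the
# forecast AI load to bridge overnight hours. The 25% share and 12-hour
# duration are assumptions chosen for illustration only.

FORECAST_DEMAND_GW = 65
SOLAR_BACKED_SHARE = 0.25   # assume a quarter of the load leans on solar + storage
OVERNIGHT_HOURS = 12        # assume storage must bridge a 12-hour night

storage_needed_gwh = FORECAST_DEMAND_GW * SOLAR_BACKED_SHARE * OVERNIGHT_HOURS
print(f"Battery energy required: {storage_needed_gwh:,.0f} GWh")
# Battery energy required: 195 GWh
```

Even covering a quarter of the load for a single night, under these assumed numbers, calls for storage measured in the hundreds of gigawatt-hours.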
Many experts think nuclear power—whether through small modular reactors or new large-scale plants—will be essential for providing stable, round-the-clock electricity. And on the efficiency side, companies will need to innovate with better cooling systems and smarter chip designs that squeeze more performance out of every watt. Without all of these pieces coming together, AI's growth trajectory is going to slam into a hard ceiling.
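On the nuclear point, a similarly rough count: if, hypothetically, the entire 65 GW forecast were served by small modular reactors of about 300 MW each (an assumed output, not a figure from the article), the sketch below shows how many units that would take.

```python
# Rough reactor count if the full forecast load were served by SMRs.
# The ~300 MW per-unit output is an assumption for illustration.

import math

FORECAST_DEMAND_GW = 65
SMR_OUTPUT_GW = 0.3  # assumed ~300 MW electrical output per reactor

reactors_needed = math.ceil(FORECAST_DEMAND_GW / SMR_OUTPUT_GW)
print(f"Reactors needed to cover 65 GW: ~{reactors_needed}")
# Reactors needed to cover 65 GW: ~217
```

No one expects nuclear to shoulder the entire load, but the figure gives a sense of why every option, from renewables and storage to efficiency gains, ends up on the table.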
Why This Matters
Nvidia's GPUs are the engines behind today's most impressive AI breakthroughs, but they also represent the physical cost of that progress: massive amounts of electricity. The more powerful the chip, the more infrastructure it demands. This means the future of AI isn't purely a question of computing power or software innovation—it's fundamentally an energy problem. The pace of AI development could be determined not by what engineers can build, but by what the power grid can deliver.