● Microsoft CEO Satya Nadella recently made a surprising admission: the company has Nvidia GPUs already installed in its data centers that it can't turn on because there simply isn't enough power to run them. This revelation, highlighted in an analysis by Shay Boloor, exposes a fundamental challenge facing the AI industry today.
● The bottleneck has shifted. It's no longer about getting enough chips—it's about securing enough electricity and data center infrastructure to actually use them. As Boloor points out, "If compute is easy to buy but power is hard to get, the leverage moves to whoever controls energy and infrastructure."
● Each new hyperscale data center from Microsoft, Google, Amazon, Meta, and Oracle needs hundreds of megawatts of stable power. But bringing that capacity online takes years, creating a timing problem: new GPU generations arrive so fast that older hardware loses value before it can generate meaningful returns.
● That mismatch means AI growth now depends on how quickly companies can secure energy contracts and power up capacity, not just on how many chips they can buy. Companies that built energy infrastructure early have gained a major strategic advantage.
● Nadella's comments confirm what investors are starting to understand—the next era of AI won't be built on silicon alone. It will run on electricity.
Usman Salis