Have you ever stopped and wondered if the lights are about to go out—not just in your house, but everywhere?
I did, and it really sank in for me when I saw the chart below.
Between 2024 and 2030, electricity demand from data centers in the U.S. is set to explode by 400 terawatt-hours (that’s 400 trillion watt-hours!!!), growing at roughly 23% every year. To put this into perspective, that’s like powering 36 million homes nonstop for a year. Or, if you really want to get a feel for it: that’s about 10% of the entire United States’ annual electricity generation.
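If you want to sanity-check those comparisons, the arithmetic fits in a few lines. Here’s a quick back-of-envelope in Python, where the per-home and total-generation figures are rough assumptions I’m plugging in for illustration, not official statistics:

```python
# Back-of-envelope check on the numbers above; the per-home and
# total-generation figures are rough assumptions, not official stats.
DEMAND_GROWTH_TWH = 400            # projected data center growth, 2024-2030
KWH_PER_HOME_PER_YEAR = 11_000     # rough average U.S. household usage
US_GENERATION_TWH = 4_200          # rough total annual U.S. generation

growth_kwh = DEMAND_GROWTH_TWH * 1e9            # 1 TWh = 1 billion kWh
homes_powered = growth_kwh / KWH_PER_HOME_PER_YEAR
grid_share = DEMAND_GROWTH_TWH / US_GENERATION_TWH

print(f"Homes powered for a year: {homes_powered / 1e6:.0f} million")  # ~36
print(f"Share of U.S. generation: {grid_share:.0%}")                   # ~10%
```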
We’re not just talking about a little extra juice here; we’re looking at an alarming trend that’s growing faster than almost anyone expected.
But here’s the pain point that got me interested in this trend in the first place: our grid wasn’t built for this kind of accelerated growth.
Look around: we’ve got AI helping us with our daily lives, self-driving cars zipping around, and an internet that never sleeps, all powered by thousands of warehouses stuffed with servers demanding this energy.
And it’s not slowing down—it’s accelerating so fast I think we might be hitting a wall.
So can we actually keep up? Will we have enough energy to keep the whole grid from crashing?
I had to dig deeper, because the stakes are getting pretty high right now.
Let me take you through what I found…
How it Started
Let’s zoom out for a second.
Global energy demand has been climbing for decades—more people, more gadgets, more everything.
Over the past 200 years, energy demand has gone from a trickle to a tsunami, especially since the 1950s.
Then the 1990s hit, and the internet flipped the script. Suddenly, we could connect instantly, binge cat videos, and Google just about anything.
You see, every click, every stream, every “Hey Siri”, or every ChatGPT prompt comes from massive data centers—think of these as giant warehouses packed with servers that never shut off.
Back in the early days, companies like Google, Amazon (AWS), and Microsoft scrambled to build these server farms as we all jumped online.
I remember uploading my first photo to the cloud, thinking, “Cool, it’s just there now.” But it’s not “just there”—it’s sitting in a giant facility somewhere, and those facilities have ballooned into energy hogs.
Today, they handle everything from your Netflix queue to your Zoom calls, and they’re not small anymore: they’re sprawling industrial facilities, run by huge technology companies, demanding more energy every year.
This hunger isn’t tapering off; it’s speeding up, and there are three areas driving it straight into overdrive.
The Three Hogs
Cloud Computing.
Instead of purchasing and maintaining your own physical servers, you use the infrastructure of providers like AWS, Microsoft Azure, or Google Cloud.
Streaming music, backing up your phone, video-calling your mom: all of it runs 24/7 on data centers that need constant power to store your data, process requests, cool the hardware, and keep everything online.
The demand for cloud services is exploding. Businesses are ditching local servers for the flexibility of the cloud, and consumers are streaming more content than ever.
The global cloud computing market is projected to jump from $445 billion in 2020 to over $1 trillion by 2028.
Each new data center adds to the energy load, and with trends like edge computing (bringing data processing closer to users), smaller facilities are popping up everywhere, multiplying the impact.
Data centers already account for 1-2% of global electricity use, and some experts predict this could climb to 8% by 2030 if growth continues unchecked.
Crypto Mining.
The Bitcoin network alone is a huge energy consumer.
Bitcoin miners use specialized hardware, these days mostly custom chips called ASICs rather than GPUs, to brute-force a cryptographic puzzle that validates transactions on the blockchain. The first miner to crack the puzzle earns a reward, but this competition burns through electricity. A single Bitcoin transaction can use as much energy as an average U.S. household does in a month.
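To make that “puzzle” concrete, here’s a toy proof-of-work loop in Python. It’s a minimal sketch of the mechanism, not real mining software; the block data and difficulty below are made up for illustration:

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Toy proof-of-work: find a nonce whose SHA-256 hash falls below a
    target. Real Bitcoin mining is the same brute-force idea, just with
    double SHA-256, astronomically higher difficulty, and warehouses of
    ASICs making quadrillions of guesses per second."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1  # every failed guess is compute (and electricity) spent

nonce, digest = mine("some batched transactions", difficulty_bits=20)
print(f"nonce={nonce:,}, hash={digest[:16]}...")  # ~1 million guesses on average
```

The energy story falls straight out of the design: the network adjusts difficulty so blocks stay scarce, so adding more miners just means more guesses per block, and every guess costs electricity.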
The global Bitcoin network chews through 175 terawatt-hours a year—enough to rival entire countries like Egypt or South Africa.
I used to think crypto was just digital money, but it’s a juggernaut when it comes to energy demand, and with the U.S. (under Trump) aiming to be the “crypto capital of the world,” it’s only getting hungrier.
The crypto boom has turned mining into a global industry.
Places like Texas are becoming mining hubs thanks to cheap electricity, but this strains local grids and raises environmental concerns. And with new coins and blockchain projects emerging, the collective energy footprint keeps growing.
By the way, this isn’t only a Bitcoin story; other proof-of-work coins add to the footprint too (though, to be fair, Ethereum cut its energy use by over 99% when it switched to proof-of-stake in 2022).
Artificial Intelligence.
The Big Kahuna.
Training billion-parameter models—like ChatGPT or Google’s Gemini—takes hordes of GPUs.
Building an AI model involves “training” it on huge datasets—think millions of images, texts, or sensor readings. This process, often using deep learning, requires powerful computers to crunch data for days or weeks.
A single AI query can burn 10 times the energy of a Google search.
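For a sense of scale on the training side, here’s a rough back-of-envelope. Every number in it is an assumption picked for illustration (cluster size, GPU power draw, run length, facility overhead), not a figure for any actual model:

```python
# Hypothetical large training run; all numbers are illustrative assumptions.
NUM_GPUS = 10_000       # cluster size
WATTS_PER_GPU = 700     # a high-end accelerator under full load
TRAINING_DAYS = 30      # duration of the run
OVERHEAD = 1.2          # extra facility power for cooling and power delivery

hours = TRAINING_DAYS * 24
energy_mwh = NUM_GPUS * WATTS_PER_GPU * hours * OVERHEAD / 1e6  # Wh -> MWh

print(f"One training run: {energy_mwh:,.0f} MWh")                    # ~6,000
print(f"U.S. homes for a year: {energy_mwh * 1_000 / 11_000:,.0f}")  # ~550
```

Under those made-up but plausible numbers, a single run lands around 6,000 MWh, roughly what 550 U.S. homes use in a year, and labs run many such experiments back to back.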
From social media algorithms to factory automation, AI’s applications are multiplying. Each new model pushes the boundaries, requiring more data and more power.
Companies are racing to build bigger, smarter AIs, and the energy cost scales up fast.
AI Hyperscalers: The Giants Breaking the Grid
But let’s keep going on the AI thread and zero in on AI Hyperscalers, because I think it’s the most interesting and notable of the three.
Google, Oracle, Amazon, Microsoft, Meta: these tech titans aren’t just running clouds anymore; they’re building AI empires. Their new data centers are designed to crank through insanely large models on thousands of NVIDIA GPUs.
The graph below shows the crazy amount of money these hyperscalers are spending on infrastructure. You can see a noticeable jump in 2022 when Generative AI became widely available.
One hyperscale facility can consume over 100 megawatts, enough to light up a small city. With real-time automation and modular designs, they’re scaling fast and trying to squeeze efficiency out of every watt (see the quick sketch below for how that efficiency gets measured).
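That efficiency push is usually measured with a metric called PUE (Power Usage Effectiveness): total facility power divided by the power that actually reaches the IT equipment, where 1.0 would be perfect. A quick sketch, with the 40% overhead figure chosen purely for illustration:

```python
def pue(total_facility_mw: float, it_equipment_mw: float) -> float:
    """Power Usage Effectiveness: 1.0 means every watt reaches the
    servers; typical modern facilities land roughly in the 1.1-1.6 range."""
    return total_facility_mw / it_equipment_mw

# Illustrative: if cooling and power conversion eat 40% of a 100 MW
# facility, only 60 MW does actual computing.
print(f"PUE: {pue(100, 60):.2f}")  # 1.67
```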
But here’s the kicker: they’re slowly pushing our grids to their limits.
In places like Northern Virginia, a data center hotspot, companies are building out new facilities as fast as possible to keep up with demand.
I used to think the grid was this invincible thing, but these hyperscalers are proving it might be fallible.
So, what do we do?
How can we find ways to meet this energy demand in a sustainable way?
N.u.c.l.e.a.r.?
Currently, the grid is mostly powered by fossil fuels—coal, gas—with some renewables like wind and solar pitching in. But fossil fuels spew carbon, and renewables flake out when the sun’s down or the wind stops.
We know that data centers need power 24/7, and cooling alone can eat up to 40% of their energy. So we’re kinda stuck with the energy sources we’re comfortable using, except there’s one option left:
Nuclear?
Yeah, nuclear. I’ll admit, I used to freak out at the word—Chernobyl, Fukushima, huge disaster vibes. But hear me out: today’s nuclear isn’t your old-school giant meltdown machine.
It’s back because it’s actually a carbon-free powerhouse. It runs nonstop, doesn’t care about weather, and packs more energy density than anything else.
The U.S. Department of Energy says nuclear plants churn at maximum power 92% of the time; renewables can’t touch that.
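That 92% figure is what the industry calls a capacity factor: the energy a plant actually delivers divided by what it would deliver running flat-out all year. A quick comparison, where the wind and solar factors are rough typical values I’m assuming, not DOE numbers:

```python
HOURS_PER_YEAR = 8_760

def annual_output_gwh(capacity_gw: float, capacity_factor: float) -> float:
    """Energy delivered per year = nameplate capacity x hours x capacity factor."""
    return capacity_gw * HOURS_PER_YEAR * capacity_factor

# Assumed, rough capacity factors for comparison.
for source, cf in [("nuclear", 0.92), ("wind", 0.35), ("solar", 0.25)]:
    print(f"1 GW of {source}: {annual_output_gwh(1, cf):,.0f} GWh/year")
```

Same nameplate gigawatt, wildly different energy actually delivered, which is exactly why facilities that need power 24/7 find nuclear so attractive.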
And it looks like big tech is already working on it: Microsoft’s locked in a 20-year deal with Constellation Energy for 800 megawatts of clean nuclear by 2028. And Google’s teaming up with startups like Kairos Power for small modular reactors (SMRs).
These aren’t the hulking old reactors. SMRs are smaller, safer, built in factories, and shipped like IKEA kits, with passive safety systems that can shut the reactor down on their own. You should check this article out for more info on SMRs, or watch the video below:
And then there’s also fusion. Still a pipe dream (for now), but companies like Helion Energy are chasing it. FYI, Helion is backed by OpenAI’s Sam Altman. I think you can see where this is going.
Here’s a quick intro video on nuclear fusion:
AI Could Help
But here’s a twist: AI might be able to help fix this energy crisis.
MIT’s using AI to design chips and algorithms that slash energy use. It’s optimizing data center cooling and even helping dream up next-gen reactors.
There’s a strong case that AI itself can contribute to solving these problems, and I don’t think it should be ruled out.
We’ve already seen companies like NVIDIA use AI to design more energy-efficient chips and build software tools that squeeze more performance out of every watt.
The Future
The truth is, global energy demand’s only going up and it’s not slowing down.
I don’t think we’ll run out of juice tomorrow, but we’re teetering on the edge right now.
I’m just going to say it: nuclear’s my bet. It’s reliable, low-carbon, and ready to scale. Sure, it doesn’t have the best history, but which energy source does? At the end of the day, I think we’ve got to quit bickering and start talking about it, because the clock’s ticking.
Here’s where I land: this energy crunch isn’t a death sentence—it’s a spark.
Constraints breed innovation.
AI’s hunger is forcing us to rethink everything, from reactors to grids to how we live.
The future isn’t just about keeping the lights on anymore; it’s about leaning into the problems we face and using technology to come up with sustainable solutions that balance the environment with this energy demand.
Connect with Me
Are you new to the newsletter? Subscribe Here
Learn more about me on my website
Check out my YouTube channel (and subscribe!)
If you’re a founder, apply here (Metagrove Ventures) for startup funding or contact me directly at barry@metagrove.vc
Thanks for reading.
If you like the content, feel free to share, comment, like and subscribe.
Barry.