Why does AI use so much electricity? We touched on this in part one of this series, but let’s dig a little deeper. Today’s AI systems, and the even more power-hungry artificial general intelligence (AGI) applications expected in the future, are energy hogs.
AGI energy is the electricity required to train large models, run them at scale, and produce the chips and data centers that make it all possible. By ‘large models,’ we mean AI systems with billions of parameters trained on massive datasets that enable richer results but use more servers and more electricity.
Headlines focus on AI breakthroughs—new chatbots, copilots (AI assistants embedded in everyday tools that help draft, code, or analyze), and multimodal models (systems that can understand or produce text, images, audio, or video)—but behind the scenes, the hunger for power is accelerating.
It is mind-boggling to realize that training a single frontier AI model—one of today’s most capable, state-of-the-art systems trained at massive scale—can consume thousands to tens of thousands of megawatt-hours, comparable to the annual electricity use of hundreds to many thousands of homes, depending on the setup. And that’s just the beginning. Once a model is trained, it has to be deployed, maintained, and scaled to billions of users. Every one of those steps adds another layer of power demand.
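To make that comparison concrete, here is a minimal back-of-envelope sketch in Python. Both input numbers are illustrative assumptions, not measurements of any specific model: a hypothetical 10,000 MWh training run, and roughly 10,500 kWh per year for an average U.S. household.

```python
# Back-of-envelope: how many homes' worth of annual electricity
# might one large training run consume? All inputs are illustrative.

TRAINING_RUN_MWH = 10_000   # assumed training energy, MWh (hypothetical)
HOME_ANNUAL_KWH = 10_500    # rough U.S. average household use per year

training_kwh = TRAINING_RUN_MWH * 1_000          # convert MWh -> kWh
homes_equivalent = training_kwh / HOME_ANNUAL_KWH

print(f"About {homes_equivalent:.0f} homes' annual electricity use")
```

With these placeholder inputs, a single run lands near a thousand homes’ worth of annual use; swap in your own estimates to see how quickly the figure scales.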
This is why voices like Eric Schmidt warn of a 90-gigawatt gap in U.S. capacity, while Elon Musk calls electricity generation “the fundamental limitation on AI.” Without new power, AGI may advance more slowly than the hype suggests.
Power-hungry model training
At the front of the line in AGI energy consumption is model training. Think of hundreds of billions of tunable settings adjusted across rooms full of servers powered by thousands of AI chips, with training runs stretching into weeks or months. Each new product generation often increases the energy footprint, though efficiency gains can blunt the growth. Training is spiky: massive bursts of demand for a single project, but those bursts are among the hungriest workloads on the planet.
What is inference?
Inference is when a trained model is used to produce an answer—writing a reply, summarizing a document, labeling an image, or turning speech into text. Training teaches the model; inference is the moment it does the work.
Inference at scale is the same process multiplied across millions or billions of requests. Each individual step is small, but together they create a steady, around-the-clock draw in data centers. Multiply one quick query by billions of users and you get continuous power use—and as copilots spread through office software, customer service, and personal assistants, the result is a persistent baseline load across work hours and time zones.
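A quick sketch shows how tiny requests add up to a continuous draw. Both figures below are assumptions for illustration (no official per-query energy numbers are published for most services): around 0.3 watt-hours per request and one billion requests per day.

```python
# Back-of-envelope: the steady power draw of inference at scale.
# All inputs are illustrative assumptions, not measured values.

WH_PER_QUERY = 0.3                 # assumed energy per request, watt-hours
QUERIES_PER_DAY = 1_000_000_000    # assumed daily request volume

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000   # Wh -> kWh per day
avg_power_mw = daily_kwh / 24 / 1_000                # kWh/day -> average MW

print(f"Average continuous draw: {avg_power_mw:.1f} MW")
```

Even at a fraction of a watt-hour per query, a billion daily requests works out to a double-digit megawatt load running around the clock, which is why inference becomes a baseline rather than a spike.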
The physical side of AI energy use
AGI energy isn’t just about powering AI systems to answer requests. It’s also about producing the underlying infrastructure for AI services. Fabricating graphics cards and custom AI chips is energy-intensive, from semiconductor fabs to packaging. Building an industrial-scale data center involves steel, concrete, and silicon, each with its own energy cost. Cooling those facilities adds another significant expense.
In other words, the demand story isn’t only digital. It’s physical, industrial, and global.
Which AI products drive the most power demand?
Think in layers—and think in terms of capability. By capability, we mean usefulness you can feel—answers that hold up, drafts that help, code that runs, images that fit, and actions the system can take on your behalf. Products that turn electricity into capability include:
- Office helpers for teams — Think Word, Excel, email, coding help, and customer support—now with an AI “brain” that responds instantly and frequently throughout the day. These “helpers” create a persistent baseline load across data centers.
- Everyday helpers for people — These are chat, voice, and camera apps that give you quick answers, reminders, and step-by-step help. The sheer volume of tiny, fast requests at global scale adds up to significant continuous consumption.
- Lab and shop tools for product research and development — These tools discover drugs, test new materials, and run large simulations—driving long training and batch compute jobs.
- Robots and machines that do real work — Factory cells, delivery bots, cars, and field gear that see, plan, and act in the real world. Unlike R&D tools, these systems add real-time perception and planning workloads and often compute continuously while operating.
- Search and media engines — Systems that answer questions and create text, images, audio, and video. They draw big bursts of power during peak periods, such as news cycles and product launches, plus a steady baseline the rest of the time.
The new hierarchy of energy demand
If we were to rank AI’s appetite for electricity, it would look like this:
- Teaching the system (training) — These are the big one-time classes where models learn. They use huge spikes of power while the training runs.
- Answering all day (inference at scale) — This is everything that happens after training: billions of daily questions and tasks. The result is steady energy demand that keeps growing.
- Practicing jobs in a sandbox (planning and simulation) — These include running “what if” drills to design products, routes, and schedules. These tasks add another layer of load to AI-assisted work.
- Making and cooling the gear (manufacturing and operations) — These include building microchips and running data centers. This creates a steady 24/7 draw for data centers and near-continuous operations at fabrication factories.
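The four layers above can be folded into a single rough load estimate. Every number in this sketch is a placeholder for a hypothetical deployment, not a figure from the article; the overhead multiplier (often called PUE, power usage effectiveness) covers cooling and facility operations, while chip-fabrication energy is left out because it happens upstream of the data center.

```python
# Toy model: combine the hierarchy's layers into one average-power figure.
# All inputs are illustrative assumptions for a hypothetical deployment.

training_mwh_per_year = 30_000   # a few big training runs, amortized over the year
inference_avg_mw = 12.5          # steady "answering all day" load
simulation_avg_mw = 3.0          # planning and "what if" sandbox workloads
pue = 1.3                        # cooling/operations overhead multiplier

hours_per_year = 24 * 365
training_avg_mw = training_mwh_per_year / hours_per_year  # spiky load, averaged

it_load_mw = training_avg_mw + inference_avg_mw + simulation_avg_mw
total_mw = it_load_mw * pue      # add cooling and facility overhead

print(f"IT load: {it_load_mw:.1f} MW; with overhead: {total_mw:.1f} MW")
```

Note how the training spikes, once averaged across a year, can be smaller than the always-on inference baseline; the steady layers, not the headline-grabbing bursts, dominate this toy estimate.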
Why these factors matter to AGI energy
AGI energy is not theory. It is the fuel bill for tomorrow’s intelligence. Breakthroughs will keep coming, and each one brings a bigger electric tab—first kilowatts, then megawatts, and eventually gigawatts.
The bottom line is this: power planning is now central to the AI story and (coming soon) the AGI story. Without enough electricity, AI is a race car with nowhere to drive. With it, the road opens, and progress is limited mostly by imagination.
However, AGI energy is not an all-or-nothing equation. There is a lot of in-between, ranging from quite plausible outcomes to out-and-out impossible AI fantasies.
From already under-construction AI projects to pie-in-the-sky fantasies, we’ll examine the full spectrum of future AI in part three of this series.