Part Three: Fuel and emissions
Now that we have an estimate of the total energy required to run an AI model to produce text, images, and videos, we can work out what that means in terms of emissions that cause climate change.
First, a data center humming away isn’t necessarily a bad thing. If all data centers were hooked up to solar panels and ran only when the sun was shining, the world would be talking a lot less about AI’s energy consumption. That’s not the case. Most electrical grids around the world are still heavily reliant on fossil fuels. So electricity use comes with a climate toll attached.
“AI data centers need constant power, 24-7, 365 days a year,” says Rahul Mewawalla, the CEO of Mawson Infrastructure Group, which builds and maintains high-energy data centers that support AI.
That means data centers can’t rely on intermittent technologies like wind and solar power, and on average, they tend to use dirtier electricity. One preprint study from Harvard’s T.H. Chan School of Public Health found that the carbon intensity of electricity used by data centers was 48% higher than the US average. Part of the reason is that data centers currently happen to be clustered in places that have dirtier grids on average, like the coal-heavy grid in the mid-Atlantic region that includes Virginia, West Virginia, and Pennsylvania. They also run constantly, including when cleaner sources may not be available.
Tech companies like Meta, Amazon, and Google have responded to this fossil fuel issue by announcing goals to use more nuclear power. Those three have joined a pledge to triple the world’s nuclear capacity by 2050. But today, nuclear energy accounts for only 20% of the US electricity supply and powers just a fraction of AI data centers’ operations. Natural gas, for example, accounts for more than half of the electricity generated in Virginia, which has more data centers than any other US state. What’s more, new nuclear operations will take years, perhaps decades, to materialize.
In 2024, fossil fuels including natural gas and coal made up just under 60% of electricity supply in the US. Nuclear accounted for about 20%, and a mix of renewables accounted for most of the remaining 20%.

Gaps in power supply, combined with the rush to build data centers to power AI, often lead to shortsighted energy plans. In April, satellite imagery revealed that Elon Musk’s X supercomputing center near Memphis was using dozens of methane gas generators to supplement grid power. The Southern Environmental Law Center alleges that the generators have not been approved by energy regulators and that they violate the Clean Air Act.
The key metric used to quantify the emissions from these data centers is called the carbon intensity: how many grams of carbon dioxide emissions are produced for each kilowatt-hour of electricity consumed. Nailing down the carbon intensity of a given grid requires understanding the emissions produced by each individual power plant in operation, along with the amount of energy each is contributing to the grid at any given time. Utilities, government agencies, and researchers use estimates of average emissions, as well as real-time measurements, to track pollution from power plants.
This intensity varies widely across regions. The US grid is fragmented, and the mixes of coal, gas, renewables, or nuclear vary widely. California’s grid is far cleaner than West Virginia’s, for example.
Time of day matters too. For instance, data from April 2024 shows that California’s grid can swing from under 70 grams per kilowatt-hour in the afternoon when there’s a lot of solar power available to over 300 grams per kilowatt-hour in the middle of the night.
This variability means that the same activity may have very different climate impacts, depending on your location and the time you make a request. Take that charity marathon runner, for example. The text, image, and video responses they requested add up to 2.9 kilowatt-hours of electricity. In California, generating that amount of electricity would produce about 650 grams of carbon dioxide pollution on average. But generating that electricity in West Virginia might inflate the total to more than 1,150 grams.
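To make the arithmetic explicit: emissions are just kilowatt-hours multiplied by the grid’s carbon intensity. Here’s a minimal sketch in Python, assuming average intensities of roughly 224 grams per kilowatt-hour for California and 397 for West Virginia—figures implied by the totals above, not official grid data:

```python
# Back-of-envelope: emissions (g CO2) = energy used (kWh) x grid carbon intensity (g CO2/kWh).
# The intensities below are rough averages implied by the article's totals,
# not official grid figures.
ENERGY_KWH = 2.9  # the marathon runner's combined text, image, and video requests

GRID_INTENSITY_G_PER_KWH = {
    "California": 224,     # assumed average; the real grid swings from <70 to >300
    "West Virginia": 397,  # assumed average for a coal-heavy grid
}

for region, intensity in GRID_INTENSITY_G_PER_KWH.items():
    grams_co2 = ENERGY_KWH * intensity
    print(f"{region}: ~{grams_co2:,.0f} g CO2 for {ENERGY_KWH} kWh")

# Prints roughly 650 g for California and 1,151 g for West Virginia,
# matching the figures above.
```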
AI around the corner
What we’ve seen so far is that the energy required to respond to a query can be relatively small, but it can vary a lot, depending on the type of query and the model being used. The emissions associated with that given amount of electricity will also depend on where and when a query is handled. But what does this all add up to?
ChatGPT is now estimated to be the fifth-most visited website in the world, just after Instagram and ahead of X. In December, OpenAI said that ChatGPT receives 1 billion messages every day, and after the company launched a new image generator in March, it said that people were using it to generate 78 million images per day, from Studio Ghibli–style portraits to pictures of themselves as Barbie dolls.
Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be.
One can do some very rough math to estimate the energy impact. In February the AI research firm Epoch AI published an estimate of how much energy is used for a single ChatGPT query—an estimate that, as discussed, makes lots of assumptions that can’t be verified. Still, its researchers calculated about 0.3 watt-hours, or 1,080 joules, per message. That falls between our estimates for the smallest and largest Meta Llama models (and experts we consulted say that if anything, the real number is likely higher, not lower).
One billion of these messages every day for a year would add up to over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add in images, and imagine that generating each one requires as much energy as our high-quality image models do, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. And this is on top of the energy demands of OpenAI’s other products, like video generators, and those of all the other AI companies and startups.
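Here’s a sketch of that back-of-the-envelope extrapolation, assuming Epoch AI’s 0.3 watt-hours per message, roughly 1.2 watt-hours per image (a figure implied by the 35-gigawatt-hour total, not a measured one), and an average US home consuming about 10,500 kilowatt-hours per year:

```python
# Rough extrapolation of annual electricity use from per-query estimates.
# Per-image energy and household usage are assumptions: ~1.2 Wh/image is implied
# by the 35 GWh total above, and ~10,500 kWh/year is a typical US household figure.
WH_PER_MESSAGE = 0.3                # Epoch AI's estimate (0.3 Wh = 1,080 joules)
MESSAGES_PER_DAY = 1e9              # OpenAI's reported daily ChatGPT message volume
WH_PER_IMAGE = 1.2                  # assumed, for a high-quality image model
IMAGES_PER_DAY = 78e6               # OpenAI's reported daily image generations
KWH_PER_US_HOME_PER_YEAR = 10_500   # assumed average US household consumption

def annual_gwh(wh_per_item: float, items_per_day: float) -> float:
    """Convert a per-item watt-hour cost into gigawatt-hours per year."""
    return wh_per_item * items_per_day * 365 / 1e9  # Wh -> GWh

messages_gwh = annual_gwh(WH_PER_MESSAGE, MESSAGES_PER_DAY)  # ~109.5 GWh
images_gwh = annual_gwh(WH_PER_IMAGE, IMAGES_PER_DAY)        # ~34 GWh

for label, gwh in [("Messages", messages_gwh), ("Images", images_gwh)]:
    homes = gwh * 1e6 / KWH_PER_US_HOME_PER_YEAR  # 1 GWh = 1e6 kWh
    print(f"{label}: {gwh:,.1f} GWh/year, ~{homes:,.0f} US homes")

# Messages come out to ~109.5 GWh (~10,400 homes); images to ~34 GWh (~3,300 homes).
```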
But here’s the problem: These estimates don’t capture the near future of how we’ll use AI. In that future, we won’t simply ping AI models with a question or two throughout the day, or have them generate a photo. Instead, leading labs are racing toward a world where AI “agents” perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for two hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called “reasoning models,” which work through problems logically but have been found to require 43 times more energy for simple questions, and to “deep research” models that spend hours creating reports for us. We will have AI models that are “personalized” by training on our data and preferences.
This future is around the corner: OpenAI will reportedly offer agents for $20,000 per month and will use reasoning capabilities in all of its models moving forward, and DeepSeek catapulted “chain of thought” reasoning into the mainstream with a model that often generates nine pages of text for each response. AI models are being added to everything from customer service phone lines to doctor’s offices, rapidly increasing AI’s share of national energy consumption.
Every researcher we spoke to said that we cannot understand the energy demands of this future by simply extrapolating from the energy used in AI queries today. And indeed, the moves by leading AI companies to fire up nuclear power plants and create data centers of unprecedented scale suggest that their vision for the future would consume far more energy than even a large number of these individual queries.
“The precious few numbers that we have may shed a tiny sliver of light on where we stand right now, but all bets are off in the coming years,” says Luccioni. “Generative AI tools are getting practically shoved down our throats and it’s getting harder and harder to opt out, or to make informed choices when it comes to energy and climate.”
To understand how much power this AI revolution will need, and where it will come from, we have to read between the lines.