What about the environmental impacts of AI?
Like any industrial activity, the creation and use of AI have an environmental cost. The two concerns that come up most often are greenhouse gas (GHG) emissions and water usage.1 As of early 2025, these impacts are small at the individual level and moderate at the global level, though the latter is expected to increase in the future.
Calculating the impact
The majority of the environmental impact of AI comes from the datacenters used for training and inference. We can approximate the impact of AI by determining the total impact of all datacenters, then multiplying by the fraction used for AI.2
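As a rough illustration of this two-factor method, here is a back-of-envelope sketch in Python. Both inputs are estimates quoted later in this section, not authoritative figures:

```python
# Back-of-envelope sketch of the "total datacenter impact x AI fraction" method.
# Both inputs below are rough estimates cited later in this article.

DATACENTER_SHARE_OF_GLOBAL_GHG = 0.01  # ~1% of global GHG emissions (2021 estimate)
AI_SHARE_OF_DATACENTER_USE = 0.025     # ~2.5% of datacenter energy went to AI in 2023

ai_share_of_global_ghg = DATACENTER_SHARE_OF_GLOBAL_GHG * AI_SHARE_OF_DATACENTER_USE
print(f"AI share of global GHG emissions: ~{ai_share_of_global_ghg:.4%}")
# -> ~0.0250%, i.e. a few hundredths of a percent of global emissions
```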
GHG emissions from datacenters
Datacenters use electricity both to run computers and to cool them. Some of that electricity comes from non-renewable sources with high carbon footprints. How much GHG is emitted in generating that electricity?
Sources disagree about the current impact of datacenters on GHG emissions. One source estimates that in 2023 datacenters were using 4% of US electricity3 and were responsible for 2% of US GHG emissions. Another source claims that in 2021, datacenters were responsible for about 1% of global GHG emissions. For comparison, aviation produces 4% of global GHGs. The International Energy Agency reviewed the numbers for datacenters and AI and was largely unconcerned.
Water usage from datacenters
Datacenters can use a lot of water for their operations, mostly for cooling through water evaporation.4
The worldwide “water draw” of datacenters (including both cooling and electricity generation) is around 500 billion liters per year, about 0.02% of global freshwater withdrawal. Globally this is a small fraction, but an individual datacenter is still a large user that can strain the water supply of neighboring towns.
Fraction of datacenter activity for AI
Most datacenter resource use does not go to AI. The share used for AI rather than other applications is hard to estimate, but it is growing: Alex de Vries estimates that in 2023, AI accounted for under 2.5% of total datacenter energy use. Another source estimates that it was 2% in Q1 2024, but that it could grow to 7% by 2025.
Impact of a single LLM query
Such global numbers are not particularly helpful for determining the individual impact of using AI. For instance, should you worry that using an LLM5 to help you write a single email will consume an unreasonable amount of power or water?
For popular models, training and inference use comparable amounts of compute, so inference alone gives the right order of magnitude for the total resource cost (the true total is roughly twice the inference figure). Earlier models6 were reported to consume about 4 Wh per page of text produced. A more recent analysis of GPT-4, which powers most ChatGPT queries, found that a typical query uses about 0.3 Wh, one order of magnitude less.7 This energy use translates to about 2 ml of water and 0.2 grams of CO2-equivalent per query.
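To see how an energy figure translates into water and CO2, here is a minimal sketch. The two conversion factors are assumptions chosen to be consistent with the numbers above; real values vary by datacenter and by the local electricity grid:

```python
# Converting a per-query energy estimate into water and CO2 figures.
# The conversion factors are assumed values, not measurements.

ENERGY_PER_QUERY_WH = 0.3  # recent estimate for a typical GPT-4 query
WATER_L_PER_KWH = 6.7      # assumed water intensity (cooling + electricity generation)
CO2E_G_PER_KWH = 650       # assumed grid carbon intensity, g CO2-equivalent per kWh

energy_kwh = ENERGY_PER_QUERY_WH / 1000
water_ml = energy_kwh * WATER_L_PER_KWH * 1000
co2e_g = energy_kwh * CO2E_G_PER_KWH

print(f"water per query: ~{water_ml:.1f} ml")  # ~2.0 ml
print(f"CO2e per query:  ~{co2e_g:.2f} g")     # ~0.20 g
```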
Another way to see that the impacts can't be huge is that LLM providers must pay for the electricity and water they use, and they pass these costs on to their customers. If price per token is a good indicator of energy use, which it seems to be, the sharp drop in price per token is consistent with more recent analyses pointing to lower inference costs. Since generating a page of text costs users much less than 1 cent, the total monetary cost of both the water and the energy for such a request cannot exceed 1 cent, and is in fact much smaller for most models.
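A sketch of this upper-bound argument, with an assumed, deliberately generous page price and an assumed electricity price:

```python
# If a provider charges less than a cent for a page of text, the electricity
# behind it can't cost more than that (assuming the provider isn't selling
# at a loss). Both prices below are assumptions for illustration.

PRICE_PER_PAGE_USD = 0.01        # generous: most models charge far less per page
ELECTRICITY_USD_PER_KWH = 0.10   # assumed industrial electricity price

max_energy_kwh = PRICE_PER_PAGE_USD / ELECTRICITY_USD_PER_KWH
print(f"energy upper bound: {max_energy_kwh * 1000:.0f} Wh per page")  # 100 Wh

# Even this loose bound is only a few hundred times the ~0.3 Wh estimate,
# and actual per-page prices are typically 10-100x lower than 1 cent.
```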
Individual impact
Is that a lot? Not really.8 A typical GPT-4 query uses about one hundredth of the energy needed to boil a cup of water, and causes GHG emissions comparable to driving a car 0.5 meters.
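The cup-of-water comparison is easy to verify from the specific heat of water; a quick check, assuming a 250 ml cup heated from room temperature:

```python
# Energy needed to boil a cup of water, from the specific heat of water.

MASS_G = 250                    # one cup of water, ~250 ml
DELTA_T_C = 80                  # from ~20 C room temperature to 100 C
SPECIFIC_HEAT_J_PER_G_C = 4.18  # specific heat of water

energy_j = MASS_G * SPECIFIC_HEAT_J_PER_G_C * DELTA_T_C
energy_wh = energy_j / 3600     # 1 Wh = 3600 J
print(f"boiling a cup: ~{energy_wh:.0f} Wh")                 # ~23 Wh
print(f"ratio to a 0.3 Wh query: ~1/{energy_wh / 0.3:.0f}")  # ~1/77, about a hundredth
```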
2 ml of water is about 150,000 times less than the water needed to grow a head of lettuce, and about ten million times less than the water needed to produce a kilogram of beef or chocolate. It is also dwarfed by the water footprint of everyday items, such as the 2700 L needed to produce a t-shirt.
For the average American, adding 10,000 extra ChatGPT queries per day (one query every 8 seconds) would increase their total carbon footprint by about 5% and their water usage by about 2%.
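A sanity check of the carbon side of this claim. The per-query figures come from the estimates above; the per-capita baseline of 16 tonnes CO2e per year is an assumed, roughly US-average value, and the water percentage similarly depends on which per-capita water baseline one uses:

```python
# Sanity check of the 10,000-queries-per-day scenario.
# Per-query figures are from the text; the carbon baseline is an assumption.

QUERIES_PER_DAY = 10_000
CO2E_G_PER_QUERY = 0.2
WATER_ML_PER_QUERY = 2
US_CARBON_T_PER_YEAR = 16  # assumed US per-capita average, tonnes CO2e per year

extra_co2_kg_per_day = QUERIES_PER_DAY * CO2E_G_PER_QUERY / 1000       # 2 kg/day
baseline_co2_kg_per_day = US_CARBON_T_PER_YEAR * 1000 / 365            # ~44 kg/day
extra_water_l_per_day = QUERIES_PER_DAY * WATER_ML_PER_QUERY / 1000    # 20 L/day

print(f"extra CO2e: {extra_co2_kg_per_day:.1f} kg/day "
      f"(~{extra_co2_kg_per_day / baseline_co2_kg_per_day:.0%} of the baseline)")
print(f"extra water: {extra_water_l_per_day:.0f} L/day")
```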
Future AI use
Both chip fabrication and datacenters require centralized installations that sometimes cause local stress on water supplies and energy grids. If the number and size of these installations grow as expected, affected localities might have to adapt their infrastructure or simply ban the building of such installations.
Furthermore, AI use is expected to grow, with large investments in AI as well as plans to expand the power grid. If AI becomes fully integrated into every aspect of our societies, then even if the impact of a single query is minimal, queries might add up to a substantial total. In any case, these investments suggest that the energy use of AI will increase substantially in the near future.9
Finally, 2024 saw the rise of reasoning models that use substantially more compute at inference time.10 These models currently serve only a small fraction of queries, but if that were to change, it might make sense to start looking at the impact of individual queries again.
AI has been moving very fast in the last few years, and we don’t know what the future will bring. New techniques could be developed, and if AGI is reached, the picture might change in unpredictable ways.
Footnotes
1. Other concerns include air pollution (which correlates with GHG emissions), land use, mining for materials, and electronic waste.
2. We don’t distinguish here between compute, electricity use, cooling, and material use, and assume AI has a similar profile to other datacenter uses.
3. The total yearly US electricity consumption has been hovering around 4 × 10^15 Wh (about 4,000 TWh) for over a decade.
4. Some water is recirculated for cooling; this water is subtracted from total water usage to calculate water draw.
5. We concentrate on LLMs here, as they are the most common use of modern AI.
6. Sasha Luccioni’s 2022 analysis of BLOOM established 3-4 Wh as a widely cited figure for the energy use of individual queries to LLMs such as ChatGPT, corresponding to about 20 ml of water per query. Note that the widely cited 500 ml of water per query was a misrepresentation of that data: the real value varied with the datacenter used, but was always much smaller than 500 ml.
7. This can seem surprising, since the size of models has generally been growing, but a combination of better training algorithms, more efficient chips, and inference efficiency improvements such as mixture-of-experts has been sufficient to push inference costs down.
8. Masley’s article uses the old 4 Wh energy estimate, so all of its estimates should be reduced by one order of magnitude.
9. The majority of the planned new power plants use low-carbon or renewable sources. In particular, this planned demand for energy has renewed interest in nuclear energy.
10. Examples include OpenAI’s o1 and o3, DeepSeek’s R1, and Google’s Gemini Thinking line of models.