What about the environmental impacts of AI?

Like any industrial activity, the creation and use of AI has an environmental cost. The two concerns that come up most often are greenhouse gas (GHG) emissions and water usage.1 As of 2025, these impacts are small at the individual level and moderate at the global level, but the global impacts are expected to increase in the future.

What counts as “energy used by AI”?

AI is a broad field that includes LLMs but also prediction and recommender systems, web search, ad targeting and computer vision.2 When talking about “energy use from AI,” sources may implicitly include all of these or exclude some, which makes it hard to compare numbers or understand which of these subcategories are more energy-intensive. Large companies that train AI generally publish their total energy use, but rarely provide a detailed breakdown of how much energy went to AI. Similarly, they do not distinguish between training the models and “inference”, i.e. running the models.3

Furthermore, different uses of generative AI vary greatly in how much energy they consume. For instance, image generation is about as energy intensive as text generation, but video generation4 uses substantially more energy than generating text.

All that is to say that there is no single number for “how much energy does AI use,” many of the numbers are not public, and the data we have is full of holes. We concentrate here on best-guess estimates of “what are the individual and global environmental impacts of using an LLM like ChatGPT to generate text?”5 since this is a question that has received a lot of media attention.

Calculating the impact

The vast majority of AIs are run in datacenters, which require electricity and water to operate. Most of the research on the environmental impacts of AI focuses on power use, which can also serve as a basis for estimating water use and GHG emissions.

Datacenters emit very few GHGs on their own; the majority of emissions come from generating the electricity they consume. While this emissions rate can vary substantially by country and even by state, we'll use here the American average of 0.4 g of CO2 per Wh.

Water usage includes direct cooling,6 where water is evaporated to cool the computers, but also the water consumed in generating the electricity. The US Department of Energy found that about 5 mL of water are evaporated per Wh of energy used by datacenters.

So we can estimate that using 1 Wh in a US datacenter will produce 0.4g of CO2 and draw 5 mL of water. With that in mind, we can look at the power use of LLMs and calculate the impact on emissions and water use.
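To make the arithmetic easy to follow, here is a minimal sketch of this conversion in Python. The two factors are the US averages quoted above; actual values vary by region and by datacenter:

```python
# Back-of-the-envelope conversion from datacenter energy use to
# GHG emissions and water draw, using the US averages cited above.

CO2_G_PER_WH = 0.4    # g of CO2 per Wh (US grid average)
WATER_ML_PER_WH = 5   # mL of water per Wh (US DOE estimate for datacenters)

def impact(energy_wh: float) -> tuple[float, float]:
    """Return (g of CO2, mL of water) for a given energy use in Wh."""
    return energy_wh * CO2_G_PER_WH, energy_wh * WATER_ML_PER_WH

co2_g, water_ml = impact(1.0)
print(f"1 Wh -> {co2_g} g CO2, {water_ml} mL water")  # 0.4 g, 5.0 mL
```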

Impact of individual LLM queries

Let’s start small and check the energy use of a single LLM query. For instance, should you worry that using ChatGPT to help you write a single email will have an unreasonable impact?

For popular models, running the model over its lifetime uses at least as much compute7 8 as training it, so we’ll ignore the training compute here and concentrate on inference. Estimates of how much energy is needed to run the models vary, but most are under 3 Wh per query, often by a lot. Still, we can use this figure as an upper bound here,9 10 which translates11 to about 20 ml of water use12 and 1 g of CO2-equivalent.13
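As a quick sanity check, here is that same arithmetic applied to the 3 Wh upper bound (a sketch; rounding these results gives approximately the ~1 g and ~20 ml figures used in the rest of this section):

```python
# Upper-bound impact of one LLM query, combining the 3 Wh energy
# estimate with the per-Wh factors from the previous section.

QUERY_WH = 3.0         # generous upper bound on energy per query
CO2_G_PER_WH = 0.4     # g of CO2 per Wh (US average)
WATER_ML_PER_WH = 5    # mL of water per Wh (US datacenter average)

co2_g = QUERY_WH * CO2_G_PER_WH        # ~1.2 g of CO2
water_ml = QUERY_WH * WATER_ML_PER_WH  # ~15 mL of water

print(f"one query: ~{co2_g:.1f} g CO2, ~{water_ml:.0f} mL water")
```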

Is that a lot? Not really. A typical request to GPT-4o (recent estimates put it around 0.3 Wh, well below our upper bound) uses about one hundredth of the energy needed to boil a cup of water, and emits a similar amount of GHGs to driving a car 0.5 meters.

We often underestimate how much water goes into everyday activities and products. The 20 ml of water used to power a single ChatGPT query is about 15,000 times less than what’s needed to grow a head of lettuce, and about one million times less than what’s needed to produce a kg of beef or chocolate. It is also dwarfed by the water that goes into everyday items, such as the 2,700 L needed to produce a t-shirt. The claim that ChatGPT uses 10x the energy of a traditional Google search is likely true but misleading, and both have a minuscule effect compared to other personal choices.14

For the average American, adding 1,000 extra queries per day to ChatGPT (about one query every 86 seconds) would increase their total carbon footprint and water usage by no more than 5% and 0.5% respectively. One would need to query ChatGPT every 10 seconds to match the power and water usage of watching TV15, which is itself orders of magnitude smaller than the effect of eating a beef hamburger.
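The arithmetic behind these percentages can be sketched as follows. The baseline footprints (~16 t of CO2e per year and ~7,500 L of water per day for an average American, the latter mostly embedded in food production) are commonly cited ballpark figures that we introduce here as assumptions, not numbers from the sources above:

```python
# Rough check of the "1,000 extra queries per day" scenario, using the
# 3 Wh upper bound per query. The baseline footprints below are ballpark
# assumptions for an average American, not figures from the main text.

QUERIES_PER_DAY = 1000
WH_PER_QUERY = 3.0                  # upper bound from the previous section
CO2_G_PER_WH, WATER_ML_PER_WH = 0.4, 5

extra_co2_kg = QUERIES_PER_DAY * WH_PER_QUERY * CO2_G_PER_WH / 1000      # ~1.2 kg/day
extra_water_l = QUERIES_PER_DAY * WH_PER_QUERY * WATER_ML_PER_WH / 1000  # ~15 L/day

BASELINE_CO2_KG_PER_DAY = 16_000 / 365  # ~16 t CO2e/year (assumption)
BASELINE_WATER_L_PER_DAY = 7_500        # mostly embedded in food (assumption)

print(f"carbon footprint: +{extra_co2_kg / BASELINE_CO2_KG_PER_DAY:.1%}")    # ~2.7%, under 5%
print(f"water footprint:  +{extra_water_l / BASELINE_WATER_L_PER_DAY:.1%}")  # ~0.2%, under 0.5%
```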

LLM use through APIs

Companies that offer LLMs through a web interface or app such as ChatGPT often also serve their products in less visible ways, through APIs offered to enterprise customers. These can power applications such as web searches, customer support bots, helper bots in software, and coding agents.16 We don’t know exactly how the number of such queries compares to ChatGPT queries17, but API sales accounted for about 20% of OpenAI's revenue as of 2024 and about 75% of Anthropic’s revenue in Q2 2025.

Published data on this subject is scarce, but as of 2025, API use appears to be comparable in scale to the more visible, ChatGPT-like queries. It’s worth noting that such API usage is increasing quite quickly.

Broader impact

Taking a broader view, what's the total environmental impact of AI today?18 To answer this, we must first look at the impact of datacenters in general.

The number of datacenters in the US19 has been increasing recently, in part due to increased use of AI. In 2023 it was estimated that datacenters were responsible for 4% of total electricity consumption20 21 and 2% of GHG emissions in the US. Regarding water, the Lawrence Berkeley National Laboratory calculates that datacenters in the US consumed about 800 billion liters in 2023, which is about 0.5% of total US water draw.

But even now, most of the resource use of datacenters does not go to AI. Estimates of what percentage of datacenters’ energy is used for AI vary a lot22, but it seems unlikely that it exceeded 10% in 2024. If the impact of AI is an order of magnitude less than the impact of all datacenters, that means it’s responsible for about 0.2% of US GHG emissions and 0.05% of US water draw.
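Under these assumptions, the scaling is a one-liner (a sketch; the 10% AI share is an upper-bound guess, as discussed above):

```python
# Scale the US datacenter totals by AI's estimated share of datacenter
# resource use. The 10% share is an upper-bound guess for 2024.

DATACENTER_SHARE_OF_US_GHG = 0.02     # datacenters: ~2% of US GHG emissions
DATACENTER_SHARE_OF_US_WATER = 0.005  # datacenters: ~0.5% of US water draw
AI_SHARE_OF_DATACENTERS = 0.10        # unlikely to have been exceeded in 2024

print(f"AI share of US GHG emissions: {DATACENTER_SHARE_OF_US_GHG * AI_SHARE_OF_DATACENTERS:.1%}")  # 0.2%
print(f"AI share of US water draw: {DATACENTER_SHARE_OF_US_WATER * AI_SHARE_OF_DATACENTERS:.2%}")   # 0.05%
```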

Again, is that a lot? It depends on what benefits this use of AI generates. For people who see AI as fully negative, any environmental impact at all is unacceptable. But most people who use AI do so because they get something out of it, and almost every aspect of using the internet uses AI in some way, so it’s hard to be online without using AI at least indirectly. Overall, its global impact is comparable to that of some aspects of modernity that we take for granted such as yard work, clothes dryers or Christmas lights.

Impact on communities

The environmental cost may be manageable at both the global and individual levels, but it can be severe at the local level, as both chip fabrication and datacenters require centralized facilities that sometimes put stress on local water supplies and energy grids. The number and size of these facilities is expected to grow, and impacted communities will have to either adapt their infrastructure or simply ban the building of such facilities.

Future AI use

As we have seen, the environmental impact of individual queries is small and the overall impacts are modest as of 2025, but AI usage is growing fast as it becomes more integrated into our everyday lives. 2024 saw the rise of reasoning models that use substantially more compute at inference time. Right now, these models are probably only used for a small fraction of queries, but if this were to change, it might make sense to revisit the impact of individual queries. Similarly, LLM use through APIs is rapidly increasing; if an LLM eventually gets queried every time we take an action online, and if AI agents become ubiquitous, the small impact of each query could add up to something substantial. The use of ChatGPT will continue to have a small impact, but there is broad agreement that the overall environmental impact of AI is poised to grow.

Companies that provide AI are also expecting usage to keep growing: investment in datacenters for AI is increasing, and there are plans to expand the power grid23 to supply these new datacenters,24 from which we can infer that the overall environmental impact of AI will increase.

We don’t know exactly what the future will bring. New techniques may be developed that increase or decrease energy use, and more fundamentally, if AI ends up transforming society, the picture might change in unpredictable ways.


  1. Other concerns include air pollution (which correlates with GHG emissions), land use, the impacts of mining for materials, and electronic waste. ↩︎

  2. We don’t know how much of AI’s energy use goes to LLMs; Andy Masley expects it to be under 3%, but there has been little research on this subject. ↩︎

  3. There are some efforts to make this data more transparent. ↩︎

  4. Perhaps surprisingly, image generation is not generally more energy intensive than text generation. ↩︎

  5. ChatGPT is a frontend to many models; we consider here the standard model, GPT-4o. The impact of reasoning models such as o3 is likely substantially higher, but is not well studied. ↩︎

  6. Some water is recirculated for cooling; this water is subtracted from total water usage to calculate water draw. ↩︎

  7. The training of frontier models only makes use of a small percentage of all the GPUs these companies have access to. This suggests that a substantial amount of compute may go to experiments that never make it into the published models, or that the extra compute is intended for some other purpose, e.g., preparation for training larger AIs in the future. ↩︎

  8. Epoch has found that training and inference compute are comparable, whereas others find that inference is 80-90% of compute. ↩︎

  9. The energy use per query seems to be going down. This can seem surprising since models have been growing, but a combination of better training algorithms, more efficient chips, model distillation, and inference efficiency improvements such as mixture-of-experts has been sufficient to push inference costs down. ↩︎

  10. Julien Delavande of Hugging Face built a tool to check the real-time energy use of models, using the open-weight model Qwen2.5-7B-Instruct. ↩︎

  11. To simplify, we only consider the power usage in the datacenter, ignoring the usage of the client device. ↩︎

  12. The widely-cited figure of 500 ml of water per query is a misrepresentation; the real value varies based on the datacenter used, but with these numbers it is always much smaller than 500 ml. ↩︎

  13. We can sanity-check these small numbers by observing that LLM providers must pay for the electricity and water they use, and they pass these costs on to the customers using their services. Since it costs users much less than 1 cent to generate a page of text, the total monetary cost of the water and energy for such a request cannot exceed 1 cent, and it is in fact much smaller for most models. Price per token seems to be a good indicator of energy use, and the sharp drop in cost per token is coherent with more recent analyses pointing to lower inference costs. ↩︎

  14. It’s a moot point anyway, since Google now serves AI overviews powered by Gemini for most searches. ↩︎

  15. Triple this rate if you are streaming 4k video. ↩︎

  16. Examples include Cursor, Windsurf, GitHub Copilot, and Claude Code. ↩︎

  17. We include here all other chatbots served through a dedicated interface, such as Claude, Gemini, Grok, etc. ↩︎

  18. We concentrate on US data, but efficiency may vary substantially throughout the world, with e.g. France producing 5-10x less carbon per Wh than the rest of the world. ↩︎

  19. The US hosts about half of the world’s datacenters (including hyperscalers), and these datacenters serve users worldwide, so this number over-represents the impact of American citizens. ↩︎

  20. The total yearly US electricity consumption has been hovering around 4×10^15 Wh (4,000 TWh) for over a decade. ↩︎

  21. This number is expected to double by 2028. ↩︎

  22. Alex de Vries estimated 2.5% in 2023, Uptime Institute estimated 2% in Q1 2024, and the Q4 2024 report by the LBNL cited above found ~25%. ↩︎

  23. Historically, the majority of datacenters were powered by renewable sources. However, renewable energy sources have long lead times, which has led some operators to switch to gas power plants, which can be brought online quickly, to accommodate the rapid building of new datacenters for AI. One review found that in the US, the carbon intensity of electricity used in datacenters was 48% higher than the national average. ↩︎

  24. Simultaneously, the increasing prevalence of smaller distilled models could drive the cost of inference down. ↩︎


