Generative AI is making headlines and changing fields from science to the arts. However, there is a growing concern about its environmental impact. Training and running models like ChatGPT uses a lot of electricity and water. Learning about these effects is important if we want AI to be more sustainable.
This post looks at the main environmental impacts of generative AI. We will discuss why it uses so many resources, including the energy needs of data centers, the water used for cooling, and the carbon footprint from making special hardware. By understanding these issues, we can start an important conversation about balancing new technology with caring for the environment.
The Thirst for Power: AI and Data Centers
At the heart of generative AI's environmental footprint are data centers: vast facilities that house the servers, storage drives, and networking equipment required to train and run complex models. Data centers have been around for decades, but generative AI has pushed their energy demands far higher. As Noman Bashir, a computing and climate researcher at MIT, explains, a generative AI training cluster "might consume seven or eight times more energy than a typical computing workload."
The increase in demand is dramatic. Researchers found that the power requirements of North American data centers nearly doubled in a single year, from 2,688 megawatts in late 2022 to 5,341 megawatts by the end of 2023, driven largely by AI. (Guidi et al., 2024) Worldwide, data centers consumed 460 terawatt-hours of electricity in 2022; if they were a country, they would rank as the 11th largest electricity consumer, between Saudi Arabia and France. That figure could more than double by 2026, reaching about 1,050 terawatt-hours. (Data Centers Burned More Power in 2022 Than 185 of the World's 195 Countries, 2024)
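The scale of these figures is easier to grasp with a quick back-of-envelope check. The sketch below reproduces the growth ratios implied by the numbers cited above; all figures come from the sources in this section, and nothing new is assumed:

```python
# Back-of-envelope check on the data center figures cited above.

# North American data center power requirements (megawatts)
na_2022_mw = 2_688
na_2023_mw = 5_341

# Global data center electricity use (terawatt-hours)
global_2022_twh = 460
global_2026_twh = 1_050  # projected

na_growth = na_2023_mw / na_2022_mw          # ~1.99x, roughly double in one year
global_growth = global_2026_twh / global_2022_twh  # ~2.28x, more than double

print(f"North America, 2022 -> 2023: {na_growth:.2f}x")
print(f"Global, 2022 -> 2026 (projected): {global_growth:.2f}x")
```

Even a year-over-year factor near two, compounded across regions and years, explains why grid planners are treating AI demand as a structural shift rather than a blip.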
The problem is that this rapid expansion often outpaces the development of renewable energy infrastructure. Bashir notes, "The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants."
Training and Inference: A Two-Part Problem
The energy consumption of generative AI can be divided into two primary phases: training and inference.
Training: Training a large language model (LLM) is extremely energy-intensive. A 2021 study from Google and UC Berkeley estimated that training GPT-3 used 1,287 megawatt-hours of electricity, enough to power about 120 average U.S. homes for a year, and produced around 552 tons of carbon dioxide. (Technology, 2025) Because the AI field moves quickly, new and larger models are released often, which means the energy invested in training earlier models is soon effectively written off.
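The training numbers above can be sanity-checked with simple arithmetic. This sketch assumes an average U.S. household uses roughly 10,700 kWh of electricity per year, a commonly published figure that is an assumption here, not taken from this post's sources:

```python
# Sanity check: how many average U.S. homes could GPT-3's training energy power
# for a year, and what grid carbon intensity do the cited figures imply?

training_energy_mwh = 1_287        # GPT-3 training energy, cited above
home_use_kwh_per_year = 10_700     # ASSUMED average U.S. household consumption

training_energy_kwh = training_energy_mwh * 1_000
homes_for_a_year = training_energy_kwh / home_use_kwh_per_year

emissions_tons = 552               # cited above
kg_co2_per_kwh = emissions_tons * 1_000 / training_energy_kwh

print(f"Homes powered for a year: {homes_for_a_year:.0f}")          # ~120
print(f"Implied grid intensity: {kg_co2_per_kwh:.2f} kg CO2/kWh")   # ~0.43
```

The two cited figures are mutually consistent: dividing the emissions by the energy gives an implied grid intensity in the range typical of fossil-heavy electricity mixes.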
Inference: The environmental impact continues after a model is trained. Each time someone uses an AI, such as asking ChatGPT to summarize an email, it uses energy. This is called inference. Researchers estimate that one ChatGPT query uses about five times more electricity than a regular Google search. (ChatGPT's power consumption: ten times more than Google's, 2024) While this may seem small for one person, the total from millions of users adds up quickly. As these models become more common in daily life, the energy used for inference may soon be even greater than for training.
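To see how per-query costs compound at scale, here is a rough sketch. Only the 5x multiplier comes from the text above; the baseline per-search energy figure and the daily query volume are illustrative assumptions, not figures from this post's sources:

```python
# Rough scale-up of inference energy. The per-search baseline and the query
# volume are HYPOTHETICAL placeholders chosen only to illustrate compounding.

google_search_wh = 0.3          # assumed energy per web search (Wh)
chatgpt_multiplier = 5          # cited above: ~5x a regular search
queries_per_day = 100_000_000   # hypothetical daily query volume

chatgpt_query_wh = google_search_wh * chatgpt_multiplier
daily_mwh = chatgpt_query_wh * queries_per_day / 1_000_000  # Wh -> MWh

print(f"Per query: {chatgpt_query_wh:.1f} Wh")
print(f"At {queries_per_day:,} queries/day: {daily_mwh:,.0f} MWh/day")
```

The point is not the exact totals, which depend heavily on the assumed inputs, but that a negligible per-query cost multiplied by hundreds of millions of daily queries becomes a material load.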
The Water Footprint of the Cloud
Beyond electricity, data centers have a significant water footprint. The powerful hardware used for AI generates immense heat, requiring constant cooling to operate efficiently. Most data centers rely on water-based cooling systems, where chilled water absorbs heat from the servers.
For every kilowatt-hour of energy a data center uses, it can require up to two liters of water for cooling. (Service, n.d.) This places a considerable strain on local water supplies, particularly in regions that may already be facing water scarcity. The term "cloud computing" can be misleading; these physical data centers have real-world consequences for local ecosystems and communities. As Noman Bashir points out, "Just because this is called 'cloud computing' doesn't mean the hardware lives in the cloud. Data centers are present in our physical world, and because of their water usage, they have direct and indirect implications for biodiversity."
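Combining the water figure here with the training-energy figure cited earlier gives a sense of scale. This sketch assumes the upper bound of two liters per kilowatt-hour stated above:

```python
# Upper-bound estimate of cooling water for a GPT-3-scale training run.

training_energy_kwh = 1_287 * 1_000   # 1,287 MWh, cited earlier in this post
liters_per_kwh = 2                    # upper bound cited above

water_liters = training_energy_kwh * liters_per_kwh
print(f"Cooling water: {water_liters:,} liters "
      f"(~{water_liters / 1_000_000:.1f} million)")
```

Over two and a half million liters for a single training run is the kind of draw that matters to a municipality in a water-stressed region.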
The Embodied Carbon of AI Hardware
Generative AI's environmental impact also includes the manufacture of its specialized hardware. GPUs (graphics processing units) are the powerful chips that handle AI's massively parallel workloads, and fabricating them is more energy-intensive than fabricating conventional CPUs. (Hu et al., 2025)
The carbon footprint grows even more because of how the raw materials for these parts are sourced. Mining for these materials can harm local environments and often uses toxic chemicals. (A social-environmental impact perspective of generative artificial intelligence, 2022, pp. 15512-15522)
Demand for this hardware is growing fast. In 2023, the top three makers shipped about 3.85 million GPUs to data centers, up from 2.67 million in 2022. (Super Micro shares surge as AI boom drives 100,000 quarterly GPU shipments, 2024) As AI keeps expanding, the carbon from making and moving this hardware will become a bigger part of its environmental impact.
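The shipment figures above imply a steep growth rate, which a one-line calculation confirms; both numbers come from the source cited in this paragraph:

```python
# Year-over-year growth in data center GPU shipments (units in millions).

shipped_2022_m = 2.67
shipped_2023_m = 3.85

growth_pct = (shipped_2023_m - shipped_2022_m) / shipped_2022_m * 100
print(f"Growth 2022 -> 2023: {growth_pct:.0f}%")   # ~44%
```

A roughly 44% annual jump in shipments means the embodied carbon of manufacturing compounds quickly, even before any of those chips draw their first watt.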
Frequently Asked Questions
Why is generative AI so much more energy-intensive than other computing?
Generative AI models, especially large language models, have billions of parameters that must be processed at the same time. This needs very powerful hardware, such as GPUs, working together, and uses much more power than regular computing tasks like browsing the web or running business software.
Is anyone working on making AI more sustainable?
Yes. Researchers and companies are working on ways to make AI more sustainable. They are designing AI models that use less energy, finding better ways to cool data centers, and using renewable energy to power them. This is a fast-growing area of research.
How can AI users reduce their environmental impact?
While much of the responsibility lies with the companies developing and deploying AI, users can be more mindful of their usage. Opting for less complex models for simple tasks, limiting unnecessary queries, and supporting companies that are transparent about their energy consumption and sustainability efforts can contribute to a more responsible AI ecosystem.
Building a Sustainable AI Future
Generative AI is advancing quickly, and that speed creates a challenge: the technology has enormous potential, but its current trajectory is environmentally unsustainable. Its heavy use of electricity and water, along with the embodied carbon of its hardware, demands urgent attention from developers, policymakers, and users.
To make progress, we need a broad approach. Tech companies should be more open about how much energy and water their models use. We also need to support research for better algorithms and hardware. As Elsa A. Olivetti, a professor at MIT, says, "We need a more contextual way of systematically and comprehensively understanding the implications of new developments in this space."
By recognizing these hidden costs and working together on solutions, we can guide generative AI to be both innovative and responsible. The aim is to use AI to solve big problems without causing new environmental issues.