Environmental Impact of AI
Prepared by Rick Craig, Director of Information Technology
College of Architecture, Arts, and Design | Virginia Tech
March 2026
This document is a resource for the AAD community in response to questions about the environmental cost of AI tools. The AAD AI Working Group is also looking at these questions as part of broader AI guidance for the college.
Why This Document Exists
After I shared information about VT’s free AI tools with the college, a colleague asked a fair question: what about the environmental impact? It deserved a real answer. So I pulled together what the research actually says — the good, the bad, and the parts that are still uncertain.
This is not a position paper. It summarizes what credible sources report about the energy, carbon, and water costs of AI so that faculty and staff can make informed decisions. Information is current as of March 2026 and will be updated as things change.
The Big Distinction: Training vs. Using
When people talk about AI’s energy footprint, they usually conflate two different things: training a model and using one.
Training is building the model. It involves running thousands of specialized processors (GPUs) for weeks or months on enormous datasets. The most transparent accounting we have comes from BLOOM, a 176-billion-parameter open-source model whose carbon footprint was documented by Sasha Luccioni and colleagues at Hugging Face (Luccioni et al., “Estimating the Carbon Footprint of BLOOM,” JMLR, 2023). BLOOM is the benchmark here because it was open-source with full disclosure — something no commercial frontier lab has matched.
For larger models like GPT-4, we are working with estimates. Third-party analyses range widely: Greenly modeled one scenario at roughly 7,100 metric tons of CO2 (Greenly, 2025), while other analyses using different assumptions put the figure as high as 15,000 metric tons. OpenAI has not disclosed actual training energy data, so these are modeled estimates built from publicly available information, and none of them are independently verified. Treat them as order-of-magnitude figures, not confirmed facts. For perspective, the EPA estimates a typical passenger vehicle emits about 4.6 metric tons of CO2 per year, so even the lower estimate equals the annual emissions of roughly 1,540 cars (EPA, “Greenhouse Gas Emissions from a Typical Passenger Vehicle,” 2024).
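For readers who want to check the car-equivalence arithmetic, here is the calculation spelled out. The inputs are the cited Greenly and EPA figures; the outputs inherit all the uncertainty of those estimates.

```python
# Back-of-envelope check of the car-equivalence figures above.
# Inputs are modeled estimates from the cited sources, not
# measured values.
gpt4_training_tonnes_low = 7_100    # Greenly's modeled scenario
gpt4_training_tonnes_high = 15_000  # higher third-party estimates
car_tonnes_per_year = 4.6           # EPA, typical passenger vehicle

low_cars = gpt4_training_tonnes_low / car_tonnes_per_year
high_cars = gpt4_training_tonnes_high / car_tonnes_per_year
print(f"{low_cars:.0f} to {high_cars:.0f} car-years of emissions")
# → 1543 to 3261 car-years of emissions
```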
Emma Strubell and colleagues at UMass Amherst were among the first to quantify this. Their 2019 paper found that training a single large NLP model with neural architecture search could emit as much CO2 as five cars over their entire lifetimes — roughly 284 tonnes (Strubell et al., “Energy and Policy Considerations for Deep Learning in NLP,” ACL 2019). Those were 2019-era models. Frontier models today are orders of magnitude larger.
Patterson et al. at Google argued in 2022 that training’s carbon footprint would “plateau, then shrink” as hardware, model architecture, and data center location improve. They showed the right combination could reduce carbon by 100-1000x (Patterson et al., “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink,” arXiv, 2022). That prediction has not materialized at the frontier — model scale has outpaced efficiency gains — but the point about geographic and architectural choices stands.
Using a model (called “inference”) is a much smaller draw per interaction. Google disclosed in August 2025 that a median Gemini prompt consumes 0.24 watt-hours of electricity with emissions of 0.03 grams of CO2 — the first time any major AI company released per-query energy data (MIT Technology Review, August 2025). For comparison, Google reported in 2009 that a search uses approximately 0.3 watt-hours, though current searches likely use less due to efficiency improvements (Google, 2009). Researchers estimated a ChatGPT query on GPT-4o at about 0.42 watt-hours for a short prompt (Jegham et al., “How Hungry is AI?”, arXiv, May 2025). Luccioni et al. measured BLOOM-176B at an average of 4 watt-hours per query across over 230,000 requests (ACM FAccT, 2024).
The takeaway: a single AI query uses energy comparable to or up to roughly 13 times that of a web search, depending on the model, the task, and the conversation length. Google’s own data shows a median Gemini prompt actually uses less energy than Google’s 2009 search figure (0.24 Wh vs. 0.3 Wh). Image and video generation sit at the high end; simple text questions sit closer to — or below — a regular search.
But here is the part that changes the conversation: training is a one-time cost per model, while inference happens billions of times a day across all users. The IEA notes that servers account for roughly 60% of data center electricity, and as AI adoption scales, inference is becoming the dominant share of AI-specific energy use (IEA, “Energy and AI,” 2025). Deloitte’s 2026 technology predictions project that inference will account for approximately two-thirds of AI compute by 2026 (Deloitte, 2026). The cumulative weight of all those individual queries adds up fast.
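To see how per-query costs compound at scale, here is a rough illustration. The query volume is a hypothetical round number chosen for scale, not a disclosed figure; the per-query energy values are the Gemini and GPT-4o estimates cited above.

```python
# Illustrative only: the daily query volume is an assumed round
# number, not a disclosed figure. Per-query energy uses Google's
# median Gemini value (0.24 Wh) and the higher GPT-4o estimate
# (0.42 Wh) from the sources cited above.
queries_per_day = 1e9  # assumed, for scale

for wh_per_query in (0.24, 0.42):
    kwh_per_day = queries_per_day * wh_per_query / 1000
    twh_per_year = kwh_per_day * 365 / 1e9
    print(f"{wh_per_query} Wh/query -> {twh_per_year:.2f} TWh/year")
```

Even at a billion queries a day, text inference on this scale lands around a tenth of a terawatt-hour per year, which is why the aggregate trajectory, not any single query, is the real concern.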
Data Centers: Energy, Carbon, and Water
AI runs in data centers, and data centers use a lot of electricity. The International Energy Agency estimated global data center electricity consumption at approximately 415 terawatt-hours in 2024 — about 1.5% of global electricity. Their base-case projection puts that at 945 TWh by 2030, or roughly 3% of global electricity (IEA, “Energy and AI,” 2025). Goldman Sachs projects data center power demand surging 165-175% by 2030 compared to 2023 levels, which they describe as “the equivalent of adding another top-10 power-consuming country” (Goldman Sachs Research, 2024).
In the US specifically, data centers consumed about 4% of the nation’s electricity in 2023. Goldman Sachs projects that more than doubling to over 8% by 2030. Lawrence Berkeley National Laboratory projects that AI could account for a significant and growing share of data center electricity by the end of the decade, with some projections suggesting AI workloads could reach 35-50% of data center power by 2030 (IEA, 2025; MIT Technology Review, 2025).
Where the power comes from matters
The carbon footprint of a data center depends heavily on the local electricity grid. Patterson et al. (2021) found that the same AI workload can produce 5 to 10 times more CO2 depending on geographic location, even within the same country and organization.
The major cloud providers have made clean energy commitments, but the picture is complicated:
Google reported in its 2025 Environmental Report that overall emissions are up 51% compared to its 2019 baseline. Data center electricity demand grew 27% in 2024 alone. However, Google contracted 8 GW of new clean energy in 2024 (a company record) and reports a 12% decrease in data center energy emissions through clean energy procurement. The caveat: this reduction relies partly on renewable energy certificate (REC) accounting.
Microsoft reported a 23.4% increase in overall emissions compared to its 2020 baseline. Its operational emissions (Scope 1 and 2) decreased 29.9%, but supply chain emissions (Scope 3) — which constitute 97% of its total footprint — increased 26%. Microsoft’s goal of being carbon negative by 2030 is not currently on track for total emissions (Microsoft, 2025 Environmental Sustainability Report).
Data centers require substantial water for cooling. Lawrence Berkeley National Laboratory estimated that US data centers consumed 17 billion gallons of water in 2023, with projections that this could increase significantly by 2028 (LBNL, 2024). The Brookings Institution reports that a typical data center uses 300,000 gallons per day; large facilities can use up to 5 million gallons per day, equivalent to the water use of a town of 50,000 residents (Brookings, 2025). Google’s 2025 Environmental Report disclosed that water consumption reached approximately 8.1 billion gallons in 2024, though water replenishment increased from 18% to 64% of consumption.
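A quick sanity check on the Brookings town comparison: spreading a large facility’s daily draw over 50,000 residents gives a per-person figure in line with typical US residential water use, which is why the comparison holds up.

```python
# Sanity check on the Brookings comparison above: a large
# facility's daily water draw spread over a town of 50,000.
facility_gallons_per_day = 5_000_000
residents = 50_000

per_person = facility_gallons_per_day / residents
print(per_person, "gallons per person per day")  # → 100.0
```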
Putting It in Context
These numbers are real. But they need context.
AI vs. web search: An AI query uses roughly 1-13x the energy of a web search, depending on the task. The median Gemini query (0.24 Wh) is actually comparable to a Google search (0.3 Wh). More complex generative tasks cost significantly more.
AI vs. cryptocurrency: Estimates of Bitcoin mining’s electricity use vary with methodology and time period: the Cambridge Centre for Alternative Finance’s 2025 Digital Mining Industry Report puts it at about 138 TWh annually, while third-party estimates using different assumptions span roughly 80 to 390 TWh. AI data center energy consumption is growing rapidly, and IEA projections suggest it could approach or exceed cryptocurrency mining within the next few years.
Overall share: Data centers currently consume about 1.5% of global electricity (IEA, 2024). Even under aggressive growth projections, that rises to about 3% by 2030. Data center emissions are projected to reach roughly 1% of global CO2 by 2030 under the IEA’s central scenario, or 1.4% in a faster-growth scenario (Carbon Brief, September 2025).
That is not nothing. But it is not the dominant driver of global emissions. For comparison, transportation accounts for roughly 28% of US greenhouse gas emissions, industry about 23%, and commercial/residential buildings about 13% (EPA, “Sources of Greenhouse Gas Emissions,” 2024). Data centers at 1.5% globally are a real and growing contributor, but the concern is the trajectory, not today’s number.
There is also a “net impact” argument. The IEA’s 2025 report and Google’s 2025 Environmental Report both note that AI could reduce emissions in other sectors — optimizing power grids, accelerating materials science for better batteries, improving weather prediction and climate modeling. Whether those savings offset AI’s own footprint is an open question, but it is part of the picture.
The Growth Problem
This is the part that does not have an easy answer.
In economics, Jevons paradox describes how efficiency improvements in using a resource can lead to increased total consumption, because the improvements make the resource cheaper and more accessible. AI appears to be following this pattern.
When DeepSeek demonstrated dramatically lower training costs in early 2025, Microsoft CEO Satya Nadella explicitly cited Jevons paradox: “As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can’t get enough of” (NPR, February 2025).
The data supports this. Google’s emissions are up 51% since 2019. Microsoft’s are up 23.4% since 2020. These increases are happening despite significant efficiency improvements in both hardware and software.
Epoch AI reports that training compute for frontier models has grown at 4.5x per year since 2010, and the power required to train the largest models is growing by more than 2x per year, on trend to reach multiple gigawatts by 2030.
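To make the power trend concrete, here is what “more than 2x per year” compounds to. The 2024 starting power is an assumed round number for illustration, not an Epoch AI figure; the point is the shape of the curve, not the exact endpoint.

```python
# What "more than 2x per year" compounds to. The 2024 starting
# power is a hypothetical round number, not an Epoch AI figure.
power_mw = 30.0   # assumed frontier training run, 2024
growth = 2.0      # Epoch AI's reported per-year growth trend

for year in range(2025, 2031):
    power_mw *= growth

print(f"~{power_mw / 1000:.1f} GW by 2030")  # → ~1.9 GW
```

Six doublings turn tens of megawatts into roughly two gigawatts, consistent with Epoch AI’s “multiple gigawatts by 2030” trajectory.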
What Responsible Usage Looks Like
None of this means we should stop using AI tools. It means we should use them thoughtfully.
A caveat: the aggregate environmental trajectory of AI is driven by corporate decisions about infrastructure buildout, not by whether individual users pick one model over another. The suggestions below are worth following, but they operate at the margins. The real levers are in the hands of the companies building the data centers and the policies that govern them. That said, informed use is better than uninformed use, and understanding the landscape helps us ask better questions as an institution.
Choose the right tool for the task. Luccioni et al. (2024) found that task-specific models use dramatically less energy than general-purpose generative models. If a simpler tool does the job, use the simpler tool.
Be aware of task intensity. Text classification and summarization are far less energy-intensive than image or video generation. A quick question to Gemini costs a fraction of what generating a set of images does.
Consider model size. The spread between the most efficient and least efficient models for a given task ranges from 5x to 50x, depending on the task type (Luccioni et al., 2024).
Push for transparency. Google’s August 2025 disclosure of per-query energy data was a first. The more users and institutions ask for this information, the more likely providers are to publish it.
The Honest Summary
AI tools have a real environmental footprint. Training frontier models is energy-intensive. Inference at scale is growing fast. Data centers use significant electricity and water, and the major providers are not yet meeting their own sustainability commitments.
At the same time, the per-query cost of using AI is comparable to or modestly higher than activities we already do billions of times a day, like web searches and streaming video. The concern is less about any one person’s use and more about the aggregate trajectory — scale of adoption multiplied by an efficiency paradox that drives more consumption, not less.
Virginia Tech’s decision to provide these tools carries institutional environmental responsibility alongside the educational benefits. That tension is worth holding honestly rather than resolving prematurely in either direction. The AAD AI Working Group is looking at these questions as part of a broader conversation about AI in our college. This document is meant to give everyone a factual foundation for that conversation.
Works Cited
- Brookings Institution. “AI, Data Centers, and Water.” 2025.
- Carbon Brief. “AI: Five Charts That Put Data Centre Energy Use and Emissions into Context.” September 2025.
- Deloitte. “Technology, Media, and Telecommunications Predictions 2026.”
- Epoch AI. “Training Compute of Frontier AI Models Grows by 4.5x per Year.”
- Goldman Sachs Research. “AI to Drive 165% Increase in Data Center Power Demand by 2030.” 2024.
- Google. “Google 2025 Environmental Report.”
- Greenly. “The Environmental Impact of Artificial Intelligence.” Updated February 2025.
- International Energy Agency (IEA). “Energy and AI: Energy Demand from AI.” 2025.
- Jegham, I. et al. “How Hungry is AI?” arXiv, May 2025.
- Luccioni, A.S., Viguier, S., and Ligozat, A.-L. “Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model.” JMLR, Volume 24, 2023.
- Luccioni, A.S., Jernite, Y., and Strubell, E. “Power Hungry Processing: Watts Driving the Cost of AI Deployment?” ACM FAccT, 2024.
- Microsoft. “2025 Environmental Sustainability Report.”
- MIT Technology Review. “In a First, Google Has Released Data on How Much Energy an AI Prompt Uses.” August 2025.
- NPR. “AI, DeepSeek, Economics, Jevons Paradox.” February 2025.
- Patterson, D. et al. “Carbon Emissions and Large Neural Network Training.” arXiv, 2021.
- Patterson, D. et al. “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink.” arXiv, 2022.
- Schwartz, R., Dodge, J., Smith, N.A., and Etzioni, O. “Green AI.” Communications of the ACM, Volume 63, pp. 54-63, 2020.
- Strubell, E., Ganesh, A., and McCallum, A. “Energy and Policy Considerations for Deep Learning in NLP.” ACL 2019.