
Communications of the ACM

Editor's letter

GenAI: Giga$$$, TeraWatt-Hours, and GigaTons of CO2

Andrew A. Chien, past Editor-in-Chief of CACM

For more than a decade, we have speculated about the impact of artificial intelligence (AI)/machine learning (ML) on the environmental sustainability of computing (see the ACM TechBrief2). It has become clear that AI's carbon emissions (scope 2), lifecycle carbon (scope 3), and other negative environmental impacts are growing explosively. Generative AI capabilities and applications, exemplified and popularized by ChatGPT, DALL-E 2, Stable Diffusion, and Copilot, are the drivers. The evidence:

Giga$$$s of increased spending on AI computing equipment is driving a dramatic buildout of infrastructure: AI computing silicon and datacenters.

Nvidia. From May 2022 to April 2023 (12 months), Nvidia's datacenter group sold $15.5B of GPUs. During its May 24, 2023 earnings call, the firm doubled its guidance for the next quarter, raising the quarterly datacenter GPU sales forecast from $4B to $8B (an annual rate of $32B).3 Dramatic growth in GPU demand for generative AI was the cited reason.

Amazon, Microsoft, and Google. The growth of GPUs for AI comes on top of 12%-per-year cloud growth and is reflected in large capital investments in datacenters (CPUs, networks, cooling, power, buildings, and so on). The three largest hyperscalers (Amazon, Microsoft, and Google) have reported datacenter CapEx rising from $78B in 2022 to $120B in 2023, a 54% increase.8

Evidence of excess demand. Endemic reports of "GPU shortages" indicate the cloud cannot satisfy the compute demands of well-funded AI developers in both new ventures and existing companies.

Terawatt-Hours (TWh) of increased datacenter power consumption will result from the growing investment in AI hardware. The most accurate retrospective estimates of datacenter power consumption are based on worldwide computing equipment sales: processors, networking, and storage. These backward-looking studies identified the rapid growth of hyperscalers as the key driver of datacenter power consumption (30% per year).5 We apply a similar methodology to these massive GPU investments.

Estimating the annual power consumption of $32B of GPUs involves GPU prices, operational parameters, and associated server and datacenter power. Using A100 prices ($10k) and a 300W average incremental load (GPU at 50% of TDP, server balance of 50%, and PUE of 1.0): 3.2M GPUs → 960MW → 8.4 TWh/year.
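The arithmetic above can be sketched as a short calculation. The inputs (the $10k A100 price, 300W average incremental load, PUE of 1.0, and the $32B annualized sales rate) are the figures stated in this letter; the constant names are illustrative.

```python
# Back-of-envelope: annual energy of $32B/year of datacenter GPU sales,
# using the figures from the text: $10k per A100-class GPU, 300 W average
# incremental load per GPU (GPU at 50% of TDP plus server balance), PUE 1.0.
ANNUAL_SPEND_USD = 32e9      # annualized GPU sales rate
GPU_PRICE_USD = 10_000       # approximate A100 price
AVG_WATTS_PER_GPU = 300      # average incremental load per GPU
HOURS_PER_YEAR = 8760

gpus = ANNUAL_SPEND_USD / GPU_PRICE_USD              # 3.2 million GPUs
power_mw = gpus * AVG_WATTS_PER_GPU / 1e6            # 960 MW
energy_twh = power_mw * 1e6 * HOURS_PER_YEAR / 1e12  # ~8.4 TWh/year

print(f"{gpus/1e6:.1f}M GPUs -> {power_mw:.0f} MW -> {energy_twh:.1f} TWh/yr")
```

Note that the estimate scales linearly in each input, so it is easy to redo under different price or load assumptions.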

To put this into perspective, 8.4 TWh/year was 54% of Google's total datacenter power in 2020. Nvidia's sales-rate increase alone corresponds to 27% of Google's total 2020 fleet. A more complete comparison is difficult as several hyperscalers have stopped disclosing total power consumption due to growing public outcry.7 8.4 TWh corresponds to 0.25% of the USA's annual electric power consumption.

Megatons of CO2 emissions per year will result from the growth in datacenter power consumption and perhaps threaten the reliability of the power grid. Datacenter carbon emissions should be computed based on the average grid emissions at the time power is consumed, or worse, since some studies suggest that datacenter loads harm grid decarbonization.4 Using 2021 U.S. average grid emissions, 8.4 TWh is 3.25 megatons of CO2, the equivalent of roughly five million U.S. cross-country passenger flights. In high-penetration areas, cloud datacenters already exceed 20% of grid power; their further growth is being limited by grid reliability risks.
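The energy-to-carbon conversion is a single multiplication. The grid intensity of roughly 0.39 kg CO2 per kWh used below is an assumed approximate 2021 U.S. average; actual intensity varies widely by region and hour, which is why the text argues for time-of-consumption accounting.

```python
# Convert the 8.4 TWh/year estimate into annual CO2 emissions.
# Assumption: ~0.39 kg CO2 per kWh, an approximate 2021 U.S. grid
# average (regional and hourly intensity differs substantially).
ENERGY_TWH = 8.4
KG_CO2_PER_KWH = 0.39

kwh = ENERGY_TWH * 1e9                      # 1 TWh = 1e9 kWh
megatons_co2 = kwh * KG_CO2_PER_KWH / 1e9   # kg -> megatons (1 Mt = 1e9 kg)

print(f"{megatons_co2:.2f} Mt CO2/year")    # ~3.3 Mt/year
```

At this scale the result is millions of metric tons (megatons) per year, three orders of magnitude below a gigaton.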



This analysis is based on economic data, and the conclusion of major growth was validated by industry leaders at Google's recent "Future of Data Center Workshop" (June 1, 2023), where I served as a panelist. The growth analyzed here covers just one year. With the boom in generative AI, the situation is now far different from that described in recent reports.6

The growing technical power and economic drivers make it improbable that the AI juggernaut can be stopped. This means the sustainability of computing/AI must be a concern for all computing researchers.1



1. Fostering Responsible Computing Research: Foundations and Practices. The National Academies Press, Washington, DC (2022).

2. Knowles, B. ACM Technology Policy Council TechBrief: Computing and Climate Change. ACM (2021).

3. Leswing, K. Nvidia shares spike 26% on huge forecast beat driven by A.I. chip demand. CNBC (May 24, 2023).

4. Lin, L., Zavala, V., and Chien, A.A. Evaluating coupling models for cloud datacenters and power grids. In Proceedings of the 12th ACM Intern. Conf. on Future Energy Systems (2021).

5. Masanet, E. et al. Recalibrating global data center energy-use estimates. Science 367, 6481 (Feb. 2020).

6. Patterson, D. et al. The carbon footprint of machine learning training will plateau, then shrink. Computer 55, 7 (July 2022).

7. Synek, G. Amazon is one of the largest consumers of electricity but is offloading costs onto others. Techspot (Aug. 2018).

8. There's AI in them thar hills. The Economist (May 29, 2023).



Andrew A. Chien is the William Eckhardt Distinguished Service Professor in the Department of Computer Science at the University of Chicago, Director of the CERES Center for Unstoppable Computing, and a Senior Scientist at Argonne National Laboratory. He leads the Zero-carbon Cloud project, and is a former Editor-in-Chief of Communications.

Copyright held by author/owner



Comment from Andrew Chien:

On August 23, 2023, NVIDIA reported quarterly datacenter revenue of over $10B, 25% higher than the figure used for the estimates in this article. Updating would yield 1,200MW, 10.5 TWh, and roughly 4 megatons of CO2.

On the same earnings call, NVIDIA guided to more than $12B for the upcoming quarter. This would correspond to 1,440MW, 12.6 TWh, and nearly 5 megatons of CO2.

Analysts have put NVIDIA's AI chip market share at 70%, so allowing for the other vendors' sales, this would produce estimates of roughly 2,050MW, 18 TWh, and over 7 megatons of CO2.
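These updates all scale the original estimate linearly with GPU spend, which is worth making explicit. The sketch below assumes the article's base figures ($32B/year → 960MW → 8.4 TWh) and a fixed price and load per GPU; the 70% market share is the analyst figure quoted in the comment, and small differences from the comment's numbers come from rounding.

```python
# Scale the article's base estimate ($32B/yr of GPU sales) to the
# updated revenue figures. Assumption: power and energy scale linearly
# with spend at a fixed GPU price and average load.
BASE_SPEND_B, BASE_MW, BASE_TWH = 32.0, 960.0, 8.4

def scaled(spend_b_per_year, market_share=1.0):
    """Return (MW, TWh/yr) for a given annualized spend and the share
    of the AI GPU market that spend represents."""
    factor = spend_b_per_year / BASE_SPEND_B / market_share
    return BASE_MW * factor, BASE_TWH * factor

for label, spend, share in [("$10B/qtr reported", 40.0, 1.0),
                            ("$12B/qtr guided", 48.0, 1.0),
                            ("all vendors (70% NVIDIA share)", 48.0, 0.7)]:
    mw, twh = scaled(spend, share)
    print(f"{label}: {mw:.0f} MW, {twh:.1f} TWh/yr")
```

Each scenario is just the base estimate multiplied by a spend ratio, so the model can be re-run as new quarterly figures arrive.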

All of these estimates are based on next-quarter guidance, annualized. The AI datacenter market may well continue to grow, making these numbers underestimates. This seems likely, as GPU vendors have indicated strong demand well into 2024.

The sustainability challenges for computing (and specifically AI) are growing rapidly!
