Artificial Intelligence and Machine Learning Editor's letter

GenAI: Giga$$$, TeraWatt-Hours, and MegaTons of CO2

Andrew A. Chien, past Editor-in-Chief of CACM

For more than a decade, we have speculated about the impact of artificial intelligence (AI) and machine learning (ML) on the environmental sustainability of computing.2 It has become clear that AI's carbon emissions (Scope 2), lifecycle carbon (Scope 3), and other negative environmental impacts are growing explosively. Generative AI capabilities and applications, exemplified and popularized by ChatGPT, DALL-E 2, Stable Diffusion, and Copilot, are the drivers. The evidence:

Giga$$$ of increased spending on AI computing equipment is driving a dramatic buildout of infrastructure: AI computing silicon and datacenters.

Nvidia. From May 2022 to April 2023 (12 months), Nvidia's datacenter group sold $15.5B of GPUs. During a May 24, 2023, conference call, the firm doubled its guidance for the next quarter, increasing its quarterly datacenter GPU sales forecast from $4B to $8B (an annual rate of $32B).3 Dramatic growth in GPU demand for generative AI was the cited reason.

Amazon, Microsoft, and Google. The growth of GPUs for AI comes on top of 12% per year cloud growth and is reflected in large capital investments in datacenters (CPUs, networks, cooling, power, buildings, and so on). The three largest hyperscalers (Amazon, Microsoft, and Google) have reported large increases in datacenter CapEx investment from $78B in 2022 to $120B in 2023, a 54% increase.8

Evidence of excess demand. Endemic reports of “GPU shortages” indicate the cloud cannot satisfy the compute demands of well-funded AI developers in both new ventures and existing companies.

Terawatt-Hours (TWh) of increased datacenter power consumption will result from the growing investment in AI hardware. The most accurate retrospective estimates of datacenter power consumption are based on worldwide computing equipment sales: processors, networking, and storage. These backward-looking studies identified the rapid growth of hyperscalers as the key driver of datacenter power consumption (30% per year).5 We apply a similar methodology to these massive GPU investments.

Estimating the annual power consumption of $32B of GPUs involves GPU prices, operational parameters, and associated server and datacenter power. Using A100 prices ($10k) and a 300W average incremental load per GPU (GPU at 50% of TDP, server balance of 50%, and PUE of 1.0): 3.2M GPUs → 960MW → 8.4 TWh/year.
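The arithmetic behind these figures can be sketched directly; a minimal check, using the article's stated assumptions ($32B annual sales, $10k per A100, 300W average incremental load, PUE of 1.0):

```python
# Re-derivation of the power estimate from the article's assumptions.
annual_gpu_spend_usd = 32e9   # $32B annual GPU sales rate
price_per_gpu_usd = 10e3      # assumed A100 price
watts_per_gpu = 300           # GPU at 50% of TDP plus 50% server balance, PUE 1.0

gpus = annual_gpu_spend_usd / price_per_gpu_usd       # 3.2 million GPUs
power_mw = gpus * watts_per_gpu / 1e6                 # 960 MW of continuous load
energy_twh_per_year = power_mw * 1e6 * 8760 / 1e12    # 8,760 hours/year -> ~8.4 TWh

print(f"{gpus / 1e6:.1f}M GPUs -> {power_mw:.0f} MW -> {energy_twh_per_year:.1f} TWh/yr")
```

Running this reproduces the chain 3.2M GPUs → 960 MW → 8.4 TWh/year quoted above.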

To put this into perspective, 8.4 TWh/year was 54% of Google’s total datacenter power in 2020. Nvidia’s sales-rate increase alone corresponds to 27% of Google’s total 2020 fleet. A more complete comparison is difficult as several hyperscalers have stopped disclosing total power consumption due to growing public outcry.7 8.4 TWh corresponds to 0.25% of the USA’s annual electric power consumption.
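The percentages above can be checked as well. This sketch assumes Google's 2020 fleet consumed roughly 15.5 TWh (a figure from Google's environmental reporting, not stated in the article) and U.S. annual electricity consumption of roughly 3,900 TWh:

```python
# Sanity check on the comparisons; both denominators are assumptions
# drawn from outside the article.
ai_twh = 8.4                  # estimated annual consumption of $32B of GPUs
google_2020_twh = 15.5        # assumed Google 2020 fleet consumption
us_annual_twh = 3.9e3         # assumed U.S. annual electricity consumption

share_of_google = ai_twh / google_2020_twh        # ~54% of Google's 2020 fleet
# The sales-rate *increase* is roughly half the $32B rate (from ~$16B/yr):
increase_share = (ai_twh / 2) / google_2020_twh   # ~27% of Google's 2020 fleet
share_of_us = ai_twh / us_annual_twh              # ~0.2%, in line with the quoted 0.25%
```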

Megatons of CO2 emissions will result from the growth in datacenter power consumption, and they may threaten the reliability of the power grid. Datacenter carbon emissions should be computed based on the average grid emissions at the time power is consumed, or worse, since some studies suggest that datacenter loads harm grid decarbonization.4 Using 2021 U.S. average grid emissions, 8.4 TWh corresponds to 3.25 megatons of CO2, the equivalent of five million U.S. cross-country flights. In high-penetration areas, cloud datacenters already exceed 20% of grid power; their further growth is being limited by grid-reliability risks.
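The carbon figure follows from the energy estimate. This sketch assumes a 2021 U.S. average grid intensity of roughly 0.39 kg CO2/kWh and roughly 0.65 t CO2 per passenger cross-country flight; neither factor is stated in the article:

```python
# Carbon estimate from the 8.4 TWh/year figure (assumed factors noted above).
energy_kwh = 8.4e9            # 8.4 TWh expressed in kWh
kg_co2_per_kwh = 0.39         # assumed 2021 U.S. average grid intensity
tons_per_flight = 0.65        # assumed CO2 per passenger cross-country flight

co2_megatons = energy_kwh * kg_co2_per_kwh / 1e9          # kg -> megatons
flights_millions = co2_megatons * 1e6 / tons_per_flight / 1e6

print(f"{co2_megatons:.2f} Mt CO2 ~= {flights_millions:.0f}M flights")
```

At this grid intensity the result is on the order of 3.3 megatons of CO2, or roughly five million flight-equivalents.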

This analysis is based on economic data, and the conclusion of major growth was validated by industry leaders at Google's recent "Future of Data Center Workshop" (June 1, 2023), where I served as a panelist. The growth analyzed here covers just a single year. With the boom in generative AI, the situation is now far different from that described in recent reports.6

The growing technical power and economic drivers make it improbable the AI juggernaut can be stopped. This means the sustainability of computing/AI must be a concern for all computing researchers.1


    1. Fostering Responsible Computing Research: Foundations and Practices. The National Academies Press, Washington, DC (2022).

    2. Knowles, B. ACM Technology Policy Council TechBrief: Computing and Climate Change. ACM (2021).

    3. Leswing, K. Nvidia shares spike 26% on huge forecast beat driven by A.I. chip demand. CNBC (May 24, 2023).

    4. Lin, L., Zavala, V., and Chien, A.A. Evaluating coupling models for cloud datacenters and power grids. In Proceedings of the 12th ACM Intern. Conf. on Future Energy Systems (2021).

    5. Masanet, E. et al. Recalibrating global data center energy-use estimates. Science 367, 6481 (Feb. 2020).

    6. Patterson, D. et al. The carbon footprint of machine learning training will plateau, then shrink. Computer 55, 7 (July 2022).

    7. Synek, G. Amazon is one of the largest consumers of electricity but is offloading costs onto others. Techspot (Aug. 2018).

    8. There's AI in them thar hills. The Economist (May 29, 2023).

