Communications of the ACM

ACM TechNews

AI's Carbon Footprint Is Big, But Easy to Reduce, Google Researchers Say


Research from scientists at the University of California, Berkeley and Google considers the climate impact of artificial intelligence, and how it can be mitigated.

Credit: Getty Images

Researchers at the University of California, Berkeley and Google have released the most accurate estimates to date for the carbon footprint of large artificial intelligence (AI) systems.

They determined that OpenAI's powerful language model GPT-3, for example, produced the equivalent of 552 metric tons of carbon dioxide during its training.

The researchers found that the carbon footprint of training an AI algorithm depends on three factors: the algorithm's design, the computer hardware used to train it, and the carbon intensity of the electricity generated where the training occurs. Changing all three factors could lower that carbon footprint by a factor of up to 1,000.

A reduction by a factor of 10 could be achieved through the use of "sparse" neural network algorithms, in which most of the artificial neurons are connected to relatively few other neurons.
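The multiplicative relationship the researchers describe can be illustrated with a toy calculation. All numbers below are hypothetical placeholders for illustration only; the 552-ton GPT-3 figure and the 1,000x reduction come from the researchers, not from this formula:

```python
# Toy estimate: training emissions = energy consumed x grid carbon intensity.
# All inputs here are illustrative assumptions, not figures from the study.
def training_footprint_tons(energy_mwh, grid_kg_co2_per_mwh):
    """CO2-equivalent emissions, in metric tons, for one training run."""
    return energy_mwh * grid_kg_co2_per_mwh / 1000.0

# Hypothetical baseline: 1,300 MWh drawn from a coal-heavy grid (~700 kg CO2/MWh).
baseline = training_footprint_tons(1300, 700)

# Because the factors multiply, improvements compound: ~10x less energy from a
# sparse model, ~10x from more efficient hardware, ~10x from a cleaner grid.
improved = training_footprint_tons(1300 / 10 / 10, 70)

print(round(baseline, 1))          # baseline emissions in tons CO2e
print(round(baseline / improved))  # combined reduction factor
```

The point of the sketch is that no single change needs to deliver the full reduction; three independent 10x improvements combine into the 1,000x figure the researchers cite.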

From Fortune


Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA
