Researchers at the University of California, Berkeley, and Google have released the most accurate estimates to date for the carbon footprint of large artificial intelligence (AI) systems.
They determined that OpenAI's GPT-3 language model, for example, produced the equivalent of 552 metric tons of carbon dioxide during training.
The researchers found that the carbon footprint of training an AI model depends on three factors: the algorithm's design, the computer hardware used for training, and the nature of electric power generation where the training occurs. Changing all three factors could lower that carbon footprint by a factor of up to 1,000.
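The arithmetic behind such estimates multiplies energy consumed by the local grid's carbon intensity. A minimal sketch of that calculation follows; every figure in it (power draw, duration, PUE, grid intensity) is a hypothetical assumption for illustration, not a number from the study:

```python
# Illustrative sketch: estimating the CO2 footprint of one training run as
#   emissions = hardware power * hours * datacenter overhead (PUE) * grid carbon intensity
# All numbers below are hypothetical assumptions, not figures from the study.

def training_emissions_tons(power_kw, hours, pue, grid_kg_co2_per_kwh):
    """Return metric tons of CO2-equivalent for one training run."""
    energy_kwh = power_kw * hours * pue          # total energy, incl. facility overhead
    return energy_kwh * grid_kg_co2_per_kwh / 1000.0  # kg -> metric tons

# Hypothetical run: 300 kW of accelerators for 30 days, PUE of 1.1,
# on a grid emitting 0.4 kg CO2 per kWh.
tons = training_emissions_tons(300, 30 * 24, 1.1, 0.4)
print(round(tons, 1))  # -> 95.0
```

The three levers the researchers identify map directly onto the parameters: algorithm design shrinks `hours`, efficient hardware shrinks `power_kw`, and a cleaner grid shrinks `grid_kg_co2_per_kwh`.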
A tenfold reduction alone could be achieved through "sparse" neural network algorithms, in which most artificial neurons are connected to relatively few other neurons.
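The compute savings from sparsity come from doing fewer multiply-accumulate operations per layer. A toy sketch, with hypothetical layer sizes and a hypothetical 10% connection density (not the study's configuration), shows how keeping one connection in ten cuts the operation count tenfold:

```python
import random

# Illustrative sketch of a "sparse" layer: each output neuron keeps only a
# fraction of its possible incoming connections, so a forward pass performs
# proportionally fewer multiply-accumulates than a dense layer.
# Layer sizes and density below are hypothetical.

def make_sparse_layer(n_in, n_out, density, rng):
    """Each neuron's weights stored as (input_index, weight) pairs -- kept connections only."""
    keep = max(1, int(n_in * density))
    return [[(i, rng.uniform(-1, 1)) for i in rng.sample(range(n_in), keep)]
            for _ in range(n_out)]

def forward(layer, x):
    """Forward pass touching only the stored (index, weight) pairs."""
    return [sum(w * x[i] for i, w in conns) for conns in layer]

rng = random.Random(0)
n_in, n_out = 1000, 100
dense_macs = n_in * n_out                       # 100,000 ops for a dense layer
sparse = make_sparse_layer(n_in, n_out, density=0.1, rng=rng)
sparse_macs = sum(len(conns) for conns in sparse)  # 10,000 ops at 10% density
y = forward(sparse, [1.0] * n_in)               # outputs for a dummy input
print(dense_macs // sparse_macs)                # -> 10, the compute reduction
```

In practice the energy savings also depend on hardware that can skip the absent connections rather than merely multiplying by zero.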
Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA