
Communications of the ACM

ACM TechNews

Computing Power Needed to Train AI Is Rising Seven Times Faster Than Before



In 2018, OpenAI—a for-profit artificial intelligence research lab—found that the amount of computational power used to train the largest AI models had doubled every 3.4 months since 2012. Now, OpenAI has added new data to its analysis, showing how the post-2012 doubling compares to the historic doubling time since the beginning of the field.

From 1959 to 2012, the amount of computational power used doubled roughly every two years, following the pace of Moore's Law; with a doubling time of 3.4 months rather than 24, training compute is now growing more than seven times faster than that historical rate.
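The gap between the two regimes can be made concrete with a little arithmetic. A sketch, using only the two doubling times reported in the article (24 months and 3.4 months):

```python
# Compare training-compute growth under the two doubling regimes
# described in the OpenAI analysis.

def growth_per_year(doubling_time_months):
    """Factor by which compute multiplies in one year,
    given a fixed doubling time in months."""
    return 2 ** (12 / doubling_time_months)

pre_2012 = growth_per_year(24)    # Moore's Law pace: doubling every 2 years
post_2012 = growth_per_year(3.4)  # post-2012 pace: doubling every 3.4 months

print(f"pre-2012:  ~{pre_2012:.2f}x per year")   # ~1.41x per year
print(f"post-2012: ~{post_2012:.1f}x per year")  # ~11.5x per year
print(f"doubling-time ratio: ~{24 / 3.4:.1f}x")  # ~7.1x
```

The ratio of the two doubling times, 24 / 3.4 ≈ 7, is the "seven times faster" figure in the headline; compounded over a year, the post-2012 pace multiplies compute by roughly 11x annually versus about 1.4x under Moore's Law.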

Researchers are raising the issue of the ever-increasing costs of deep learning. In June, researchers at the University of Massachusetts, Amherst showed how these rising computational costs translate directly into carbon emissions.

Researchers at OpenAI suggest that policymakers increase funding for academic researchers to close the resource gap between academic and industry labs.

From Technology Review

Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA
