
Communications of the ACM

ACM TechNews

The Cost of Training Machines Is Becoming a Problem

[Image: artist's conception of training AI on a chip. Credit: Tom Gauld]

For many comparatively simple artificial intelligence applications, the cost of training a computer is falling, but that is not true everywhere.

The assumption that the cost of training computers declines as computing power doubles every two years does not always hold.

The research firm OpenAI said accelerating demand drove the computing power needed to train the largest models up 300,000-fold by 2018, and that the requirement now doubles every 3.5 months.
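A back-of-the-envelope check puts those two figures side by side. The sketch below (an illustration, not from the article) computes how many doublings a 300,000-fold increase implies at a 3.5-month doubling period, and contrasts it with Moore's-Law-style doubling every 24 months over the same span:

```python
import math

# Figures cited in the article; the arithmetic is an illustrative assumption.
DOUBLING_MONTHS = 3.5      # compute-demand doubling period
GROWTH_FACTOR = 300_000    # growth in training compute "by 2018"

doublings = math.log2(GROWTH_FACTOR)   # doublings implied by 300,000x
months = doublings * DOUBLING_MONTHS   # time that growth would take
years = months / 12

# Moore's-Law-style doubling (every 24 months) over the same period:
moore_factor = 2 ** (months / 24)

print(f"doublings implied: {doublings:.1f}")
print(f"time implied:      {months:.1f} months (~{years:.1f} years)")
print(f"Moore's Law gain over the same span: ~{moore_factor:.1f}x")
```

The gap is stark: the same interval that yields a 300,000-fold rise at a 3.5-month doubling pace would deliver only a single-digit multiple under a two-year doubling cadence.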

Facebook's Jerome Pesenti said one round of training for the biggest models can cost "millions of dollars" in electrical power.

Increasing demand for computing power has fueled an explosion in processor design and specialized devices that can efficiently perform artificial intelligence (AI) calculations.

With Moore's Law approaching its physical limits, scientists are pursuing alternate approaches for boosting power, like quantum and neuromorphic computing.

For now, AI researchers will have to extract as much performance as possible from existing technologies, but some expect specialized hardware and modified software to improve processing speeds.

From The Economist


Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA


