Communications of the ACM

ACM News

Artificial Intelligence Is Driving A Silicon Renaissance


Startup Cerebras Systems recently unveiled the largest computer chip in history, created for artificial intelligence.

Artificial intelligence has ushered in a new golden age of semiconductor innovation.

Credit: Jessica Chou/The New York Times

The semiconductor is the foundational technology of the digital age. It gave Silicon Valley its name. It sits at the heart of the computing revolution that has transformed every facet of society over the past half-century.

The pace of improvement in computing capabilities has been breathtaking and relentless since Intel introduced the world's first microprocessor in 1971. In line with Moore's Law, computer chips today are many millions of times more powerful than they were fifty years ago.

Yet while processing power has skyrocketed over the decades, the basic architecture of the computer chip has until recently remained largely static. For the most part, innovation in silicon has entailed miniaturizing transistors further in order to squeeze more of them onto integrated circuits. Companies like Intel and AMD have thrived for decades by reliably improving CPU capabilities in a process that Clayton Christensen would identify as "sustaining innovation."

Today, this is changing in dramatic fashion. AI has ushered in a new golden age of semiconductor innovation. The unique demands and limitless opportunities of machine learning have, for the first time in decades, spurred entrepreneurs to revisit and rethink even the most fundamental tenets of chip architecture.

From Forbes
View Full Article