
Communications of the ACM

ACM News

Intel's Neuromorphic Chip Gets Major Upgrade


Intel's new Loihi 2 neuromorphic chip.

Loihi 2's circuits were "redesigned from the ground up" to address limitations found in the first Loihi chip, according to Mike Davies, director of Intel's neuromorphic computing lab.

Credit: Intel

Many AIs may depend on things called neural networks, but there's very little about them that works the way human and animal brains do. Intel has been experimenting for several years with computers that think more like a brain does, racking up some impressive if quirky results with its Loihi neuromorphic chip. Now Loihi is getting its first upgrade, and it's a big one. Using a manufacturing process called Intel 4 that's not yet available for commercial chips, the company packed up to eight times as many artificial neurons into a chip with half the area of the original Loihi. That, along with a host of changes motivated by the past few years of experiments, makes Loihi 2 faster and more flexible, says Mike Davies, director of Intel's neuromorphic computing lab.

Unlike the artificial neurons in conventional AI, which store information as weights that measure the strength of connection between neurons, Loihi's neurons carry information in the timing of digitally represented spikes, which is more analogous to what goes on in your brain. Neural computation is triggered by these spikes, so there's no need for a central clock to keep things synchronous. And much of the chip sits idle when there is no event to observe, saving power.
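The event-driven behavior described above can be illustrated with a leaky integrate-and-fire neuron, the textbook spiking model that neuromorphic hardware like Loihi implements in silicon. This is a minimal simulation sketch for intuition only, not Intel's actual neuron model; the function name, parameters, and values here are illustrative assumptions.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) spiking neuron.
# Hypothetical example: parameter names and values are assumptions,
# not Loihi's real neuron model.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes.

    Information is carried in *when* spikes occur, not in a continuous
    activation value; between input events the neuron produces nothing,
    which is what lets event-driven hardware stay idle and save power.
    """
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leaky integration of input
        if potential >= threshold:              # threshold crossing -> spike
            spike_times.append(t)
            potential = 0.0                     # reset membrane after spiking
    return spike_times

# A sustained burst of input drives the potential over threshold;
# the quiet middle steps contribute no spikes at all.
print(simulate_lif([0.3, 0.3, 0.3, 0.3, 0.0, 0.0, 0.6, 0.6]))  # → [3, 7]
```

Because downstream neurons only do work when a spike arrives, computation in such a network is sparse and asynchronous, with no global clock needed.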

Some 250 research partners have been using Loihi systems for things like controlling drones or robot arms, optimizing train schedules, searching databases, and learning to identify different odors. "The results have been quite encouraging," says Davies. Some energy-efficiency gains were "orders of magnitude," and there were also reductions in the amount of data the system needed to learn.

From IEEE Spectrum
View Full Article