Architecture and Hardware News

Building a Brain May Mean Going Analog

Analog circuits consume less power per operation than CMOS technologies, and so should prove more efficient.

Digital supercomputing can be expensive and energy-hungry, yet it still struggles with problems that the human brain tackles easily, such as understanding speech or viewing a photograph and recognizing what it shows. Even though artificial neural networks that apply deep learning have made much headway over the last few years, some computer scientists think they can do better with systems that even more closely resemble a living brain. Such neuromorphic computing, as this brain emulation is known, might not only accomplish tasks that current computers cannot, it could also lead to a clearer understanding of how human memory and cognition work. Moreover, if researchers can figure out how to build the machines out of analog circuits, they could run them with a fraction of the energy needed by modern computers.

“The real driver for neuromorphic computing is energy efficiency, and the current design space on CMOS isn’t particularly energy efficient,” says Mark Stiles, a physicist who is a project leader in the Center for Nanoscale Science and Technology at the U.S. National Institute of Standards and Technology (NIST) in Gaithersburg, MD. Analog circuits consume less power per operation than existing complementary metal oxide semiconductor (CMOS) technologies, and so should prove more efficient. On the other hand, analog circuits are vulnerable to noise, and the technologies for building them are not as advanced as those for CMOS chips.

One group working on such analog components is Hideo Ohno’s Laboratory for Nanoelectronics and Spintronics at Tohoku University in Sendai, Japan. They have built a device that could work as an artificial synapse by relying on spintronics, which exploits electron spin, the quantum property that gives rise to magnetism. Their device consists of a strip of cobalt/nickel, which is ferromagnetic, meaning that its spins are all aligned. They cross it with a strip of platinum manganese, which is antiferromagnetic, so spins in successive atomic layers of the material point in opposite directions. Applying a current across the antiferromagnetic layer affects its spins, which in turn applies torque to the spins in the ferromagnetic layer, switching the magnetization from up to down. Unlike in a digital system, the switching is not limited to a 0 or a 1, but can be some fraction that depends on how much current is applied.

A reading current, lower than the switching current, does not switch the magnetization, but its voltage depends on the level of magnetization. Sending multiple signals causes the spin device to adjust its own resistance, strengthening or weakening the connections between neurons, just as synapses in the brain do when they are forming memories, a process brain scientists call plasticity. “The more signals we send in this learning process—the more times we increase or decrease these weights depending on what we want it to understand—the better chance we have of it remembering it when we really want it to,” says William Borders, a Ph.D. student in Ohno’s lab who works on the project. Train the system to associate one pattern of magnetization with the letter C and another with the letter T, he says, and it will be able to recognize those letters again by measuring resistance.
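The learning scheme can be illustrated in software. The sketch below is a hypothetical toy model, not the Tohoku group's device physics: each "synapse" holds a fractional magnetization that write pulses nudge toward a target pattern, and a non-destructive read measures how strongly the stored state matches a probe pattern.

```python
import numpy as np

# Toy model of analog synaptic weights (hypothetical, not the actual
# spintronic device equations). Each synapse stores a fractional
# magnetization in [-1, 1]; a write current nudges it up or down, and a
# small read current measures the state without switching it.

class AnalogSynapseArray:
    def __init__(self, n, step=0.2, noise=0.02, seed=0):
        self.rng = np.random.default_rng(seed)
        self.w = np.zeros(n)      # fractional magnetization per synapse
        self.step = step          # how far one write pulse moves the state
        self.noise = noise        # analog devices are imperfect

    def write(self, target):
        # Pulse each synapse toward the target pattern (+1 or -1 per
        # element), with a little device-to-device variation.
        delta = self.step * target + self.rng.normal(0, self.noise, self.w.size)
        self.w = np.clip(self.w + delta, -1.0, 1.0)

    def read(self, pattern):
        # "Resistance" readout: overlap between stored state and probe.
        return float(self.w @ pattern) / self.w.size

C = np.array([1, 1, 1, 1, -1, -1, 1, -1, -1, 1, 1, 1])    # toy 4x3 letter C
T = np.array([1, 1, 1, -1, 1, -1, -1, 1, -1, -1, 1, -1])  # toy 4x3 letter T

syn = AnalogSynapseArray(12)
for _ in range(5):        # repeated write pulses strengthen the memory
    syn.write(C)

print(syn.read(C) > syn.read(T))  # the trained array matches C more strongly
```

As in the quote above, repeating the write pulses moves the weights further toward the stored pattern, so the readout margin between the learned letter and a different one grows with training.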

Whereas Ohno’s group is simulating individual synapses, Stiles’ group is using spintronics to model collections of neurons and synapses. Their device consists of two ferromagnetic layers separated by a spacer. When a current flows across the junction, it creates a torque on the spins in the material, which sets up a nonlinear oscillation of the magnetization, which in turn creates an oscillation in voltage. Living neurons also behave as nonlinear oscillators, sending out electrical spikes and synchronizing with each other. The spintronic oscillators, therefore, emulate the activity in the brain.
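Synchronization of nonlinear oscillators, the behavior the NIST devices emulate, can be illustrated with the standard Kuramoto model. This is a generic stand-in, not the spin-torque oscillator equations: oscillators with slightly different natural frequencies pull each other into phase when coupled.

```python
import numpy as np

# Illustrative Kuramoto model (a textbook stand-in, not the spintronic
# oscillator dynamics): coupled nonlinear oscillators with different
# natural frequencies synchronize, much as spiking neurons do.

rng = np.random.default_rng(1)
n, K, dt = 20, 2.0, 0.01
omega = rng.normal(1.0, 0.1, n)          # natural frequencies
theta = rng.uniform(0, 2 * np.pi, n)     # random initial phases

def order_parameter(theta):
    # |r| = 1 means perfect phase synchrony, near 0 means incoherence.
    return abs(np.mean(np.exp(1j * theta)))

r_start = order_parameter(theta)
for _ in range(5000):
    # Each oscillator is pulled toward the phases of all the others.
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta = theta + dt * (omega + K * coupling)
r_end = order_parameter(theta)

print(r_start, r_end)   # coupling drives the order parameter toward 1
```

With the coupling strength K well above the synchronization threshold, the population locks into a common rhythm, which is the kind of collective behavior the voltage oscillations of the spintronic devices reproduce in hardware.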

Another approach to building analog circuits with synapses uses memristors, a recently discovered fourth fundamental circuit element, in which flowing current alters resistance, providing the device with memory. Dmitri Strukov, a professor of electrical and computer engineering at the University of California, Santa Barbara, is using memristors to create a multilayer-perceptron network with a simplified version of a synapse’s functionality. “Biological networks are much more complicated, in the sense that it is known for example that information is encoded in timing of the spikes, and the shapes of the spikes matter. Here we neglect all of that,” he says.


Memristors are promising because it should be possible to make them very small, and they can be stacked, allowing for the high density a neuromorphic system would require.


In the learning part of the process, a large voltage tunes the state of the memristors. The inference part, such as trying to match a new visual pattern with a learned one, uses a smaller voltage. Memristors are promising, Strukov says, because it should be possible to make them very small, and they can be easily stacked, allowing for the high density a neuromorphic system will require. On the other hand, the process for making them has not been refined to the point where it can reliably produce billions of working devices.
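The division of labor between learning and inference in a memristor crossbar can be sketched as follows. This is a simplified, hypothetical model: each cross-point's conductance stores a weight, a large write voltage tunes it, and small read voltages produce output currents that compute a vector-matrix product in a single analog step (Ohm's and Kirchhoff's laws).

```python
import numpy as np

# Hypothetical sketch of a memristor crossbar. Each cross-point's
# conductance G[i, j] stores a weight; real devices have nonlinear,
# history-dependent switching that is ignored here.

rng = np.random.default_rng(0)
G = rng.uniform(0.1, 1.0, (3, 4))   # conductances, one per cross-point

def infer(v_in, G):
    # Inference: small read voltages on the rows yield column currents
    # I_j = sum_i v_i * G[i, j], i.e. a vector-matrix product computed
    # in one analog step.
    return v_in @ G

def program(G, i, j, dG):
    # Learning: a large write voltage tunes one device's conductance,
    # clipped to the device's physical range.
    G[i, j] = np.clip(G[i, j] + dG, 0.05, 1.0)
    return G

v = np.array([0.1, 0.2, 0.05])  # read voltages below the switching threshold
currents = infer(v, G)
print(currents)                 # one output current per column
```

The appeal is that the multiply-accumulate at the heart of a perceptron layer happens in the physics of the array itself, rather than in sequential digital arithmetic.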

Strukov is also using flash memory for synapses. Because it is based on CMOS technology, it works reliably, but it would be hard to shrink the devices or to pack them more densely, so scaling up to a high-density network with billions of devices could be challenging.

A different way to build synapses relies on Josephson junctions, made by separating two pieces of superconducting material with a thin layer of an insulating material. Stephen Russek, a physicist at NIST in Boulder, CO, puts magnetic nanoparticles of manganese in the insulating layer of amorphous silicon between two layers of niobium. Applying current pulses to the system changes the magnetic structure in the insulating layer—the equivalent of synaptic memory—which affects how current flows from one superconductor to the other. At a certain point, the superconductors start sending out voltage spikes, just as neurons in the brain do. “It’s very analogous to what happens in a real neural system where you have a synapse, which is basically a gap between a dendrite and an axon, and that nano-structure in that gap determines the coupling and the spiking probability of that membrane,” Russek says.

The advantage of this approach over both CMOS and spintronics, Russek argues, is that the Josephson junction is a naturally spiking element, so it is already more similar to the brain than other devices. The NIST system also resembles a living brain in that signaling activity continues even when the machine is in a resting state. “Just like the brain, you don’t turn it off. It’s always doing stuff,” Russek says. He’d like to emulate that continuous activity, which may play a role in creativity. “We want it because that’s what the brain does and we think that’s an important part of learning.”

Karlheinz Meier, a physicist who headed up BrainScaleS (Brain-inspired multiscale computation in neuromorphic hybrid systems), a project that, until its end in 2015, aimed to develop a mixed-signal neuromorphic system as part of the European Union’s Human Brain Project, says neuromorphic computing will have to incorporate such random activity, and also come to grips with analog circuits’ susceptibility to noise. “We have to learn how to do what I call ‘dirty computing,’ how you live with non-perfect components,” he says.

So far, these analog systems are very small. Tohoku University, for instance, built a chip with 36 spintronic synapses. Strukov’s most recent memristor device holds the equivalent of approximately 400 synapses, while his flash-based system has 100,000. By contrast, the TrueNorth chip, IBM’s digital implementation of neural computing, simulates 1 million spiking neurons and 256 million synapses and consumes just 70 mW, though Strukov argues that his flash chip is more efficient, with energy use and latency three orders of magnitude better than TrueNorth, when the IBM chip is configured to perform the same task. Even TrueNorth is not close to the brain, which has about 100 billion neurons connected by perhaps 100 trillion synapses, operating on about 20 W.

Going digital was important for TrueNorth, says Dharmendra Modha, chief scientist for IBM’s Brain Inspired Research group, which developed the chip. That’s because the learning that shapes the neural network is done on a separate system, and the network is then transferred to the chip where it can be run with low energy requirements. Doing it that way, Modha says, requires a digital approach because that helps guarantee one-to-one equivalence between the software and the hardware, which is harder to achieve with analog circuits that are prone to signal leakage.

In the short term, the digital approach allowed IBM to build a neuromorphic chip in the time-frame of a government-funded project, says Modha. The project allowed the company to experiment with new architectures that could yield practical applications soon, and may also lead the way to future systems that use analog circuits in materials other than silicon. “Our view is not analog versus digital,” Modha says. Instead, the company is pursuing various architectures, materials, and approaches to see which pan out and which prove useful on the way to the ultimate goal of a brain-like computer.

Similarly, the digital Spiking Neural Network Architecture (SpiNNaker) chip, a massively parallel neural network completed last year at the U.K.’s University of Manchester (UManchester) as part of the Human Brain Project, may find commercial applications for neuromorphic computing in a relatively short time. It is much more flexible than an analog set-up, where the network is configured in hardware, or TrueNorth, in which training of the network takes place off-chip, says Steve Furber, ICL Professor of Computer Engineering at the School of Computer Science at UManchester, who runs the project. “If you have a slightly wacky idea for a potential application for a spiking network, then the easiest machine to prototype it on would be SpiNNaker. When you knew exactly what you wanted, you’d probably want to go and reengineer it, either in this very efficient TrueNorth digital form, or possibly in an analog form,” he says.

Rick Stevens, professor of computing at the University of Chicago, says analog neuromorphic computing is still in its early days. “We don’t have any neuromorphic hardware that’s sufficiently interesting to make the case that it’s a plausible computing platform,” he says. “If you’re trying to do serious deep learning work, you’re not going to be running on analog hardware.” Still, he says, it is worth pursuing, in the same way another far-off technology, quantum computing, is.

Russek agrees an analog neuromorphic computer with an equivalent neuron count to a dog or a monkey brain is still a good 25 years away, depending on how non-silicon technologies develop, but he believes the field has the potential to change computing. “Mother Nature is a good model,” he says. “She took five billion years. Hopefully we’ll take less.”

Further Reading

Borders, W.A., Akima, H., Fukami, S., Moriya, S., Kurihara, S., Horio, Y., Sato, S., and Ohno, H.
Analogue spin-orbit torque device for artificial-neural-network-based associative memory operation, Applied Physics Express, 10, 2017 http://iopscience.iop.org/article/10.7567/APEX.10.013007

Russek, S.E., Donnelly, C.A., Schneider, M.L., Baek, B., Pufall, M.R., Rippard, W.H., Hopkins, P.F., Dresselhaus, P.D., and Benz, S.P.
Stochastic Single Flux Quantum Neuromorphic Computing using Magnetically Tunable Josephson Junctions, IEEE Int’l Conf. on Rebooting Computing, October 2016, San Diego, CA http://ieeexplore.ieee.org/document/7738712/

Prezioso, M., Merrikh-Bayat, F., Hoskins, B.D., Adam, G.C., Likharev, K.K., and Strukov, D.B.
Training and operation of an integrated neuromorphic network based on metal-oxide memristors, Nature, 521, May 2015 http://www.nature.com/nature/journal/v521/n7550/full/nature14441.html

Grollier, J., Querlioz, D., and Stiles, M.D.
Spintronic nano-devices for bio-inspired computing, Proceedings of the IEEE, 104, October 2016 http://ieeexplore.ieee.org/document/7563364/

Stanford engineer creates circuit board that mimics the human brain https://www.youtube.com/watch?v=D3T1tiVcRDs
