Researchers at the Salk Institute trained a standard recurrent neural network and then transferred those parameters to a spiking neural network.
Their goal was to work around the fact that spiking neurons cannot currently be trained directly via gradient descent, the basis of conventional machine learning, because their all-or-nothing spikes are not differentiable.
The new research focuses on a form of "transfer learning": training parameters in one network and moving them to another, in order to sidestep the training limitations of spiking neurons.
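The transfer idea can be sketched in a few lines. The example below is a minimal illustration, not the Salk team's actual method: it stands in "trained" weights for a tiny rate-based (ReLU) layer, then reuses those same weights in a simulation of integrate-and-fire spiking units driven by Bernoulli spike trains, whose firing rates approximate the rate network's outputs. All names, sizes, and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for "trained" parameters: in the workflow described above these
# would come from gradient descent on a conventional rate-based network.
n_in, n_out = 4, 3
W = rng.normal(scale=0.5, size=(n_out, n_in))

x = rng.uniform(0.2, 0.8, size=n_in)     # input firing rates in [0, 1]
rate_out = np.maximum(W @ x, 0.0)        # ReLU output of the rate network

# Transfer: reuse W unchanged in a spiking simulation. Inputs become
# Bernoulli spike trains with the same rates; outputs are integrate-and-fire
# units with reset-by-subtraction, whose firing rate tracks the ReLU value.
T = 20_000                               # simulation steps
threshold = 1.0
v = np.zeros(n_out)                      # membrane potentials
spikes = np.zeros(n_out)                 # output spike counts

for _ in range(T):
    s_in = (rng.uniform(size=n_in) < x).astype(float)  # input spikes
    v += W @ s_in                        # integrate weighted input spikes
    k = np.maximum(np.floor(v / threshold), 0.0)       # spikes fired now
    spikes += k
    v -= k * threshold                   # reset by subtraction

snn_rate = spikes * threshold / T        # estimated output firing rates
print(np.round(rate_out, 3))
print(np.round(snn_rate, 3))
```

With enough simulation steps, the spike-count estimate converges toward the rate network's output, which is the basic premise behind porting trained parameters into a spiking substrate.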
Separately, researchers at the U.S. Defense Advanced Research Projects Agency (DARPA) have developed a Python-based programming package called BindsNET, which can perform a kind of transfer learning similar to that of the Salk Institute project.
The DARPA researchers used BindsNET to simulate the construction of shallow artificial neural networks made up of spiking neurons.
Both the Salk Institute and DARPA projects show that the spiking-neuron branch of neuromorphic computing is a field of active energy and ingenuity.
Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA