
Communications of the ACM

ACM Careers

Scientists Develop the Next Generation of Reservoir Computing



Artificial neural networks are at the heart of reservoir computing.

Researchers have found a way to make reservoir computing, a machine learning approach that requires only small training data sets and minimal computing resources, run between 33 and a million times faster than a traditional reservoir computer.

In one test of next-generation reservoir computing, researchers solved a complex computing problem in less than a second on a desktop computer, a task that would otherwise require a supercomputer, says Daniel Gauthier, professor of physics at Ohio State University. "We can perform very complex information processing tasks in a fraction of the time using much less computer resources compared to what reservoir computing can currently do," he says.

The system is described in "Next Generation Reservoir Computing," published in the journal Nature Communications.

In reservoir computing, scientists feed data about a dynamical system into a "reservoir" of randomly connected artificial neurons. The network produces useful output that the scientists can interpret and feed back into the reservoir, building an increasingly accurate forecast of how the system will evolve in the future.
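The mechanism described above can be sketched as a minimal echo state network, a common form of reservoir computer: a fixed random reservoir is driven by the input signal, only a linear readout is trained, and the readout's predictions are fed back in to forecast the system forward. This is an illustrative sketch, not the authors' implementation; the reservoir size, weight scales, and the toy sine-wave "dynamical system" are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dynamical system to forecast: a noisy sine wave (stand-in for real data).
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t) + 0.05 * rng.standard_normal(t.size)

N = 200  # reservoir size (hypothetical choice)
W_in = rng.uniform(-0.5, 0.5, (N, 1))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, (N, N))      # fixed random recurrent weights
# Scale the recurrent weights so the spectral radius is below 1 (echo state property).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Drive the reservoir with the input signal and record its states.
x = np.zeros(N)
states = np.zeros((u.size, N))
for i, ui in enumerate(u):
    x = np.tanh(W_in[:, 0] * ui + W @ x)
    states[i] = x

# Train only the linear readout (ridge regression) to predict the next step.
X, y = states[:-1], u[1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

# Closed-loop forecast: feed each prediction back in as the next input.
ui = float(states[-1] @ W_out)  # first forecast step
preds = [ui]
for _ in range(4):
    x = np.tanh(W_in[:, 0] * ui + W @ x)
    ui = float(x @ W_out)
    preds.append(ui)
print(preds)
```

Note the key point the article makes: the random reservoir weights are never trained, so only the linear readout `W_out` needs fitting, which is why reservoir computing gets by with small training sets and modest hardware.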

From Ohio State University
View Full Article


 
