
Finding the Paradigms Underlying Computing in the Brain

James E. Smith
The human brain "far exceeds anything conventional machine learning has achieved," James Smith told keynote attendees at ACM's Federated Computing Research Conference.

In his keynote address on "A Roadmap for Reverse-Architecting the Brain's Neocortex" today at ACM's Federated Computing Research Conference (FCRC), James E. Smith, adjunct professor at Carnegie Mellon University Silicon Valley and professor emeritus at the University of Wisconsin-Madison, cited microelectronics pioneer Carver Mead, who wrote, "There is nothing that is done in the nervous system that we cannot emulate with electronics if we understand the principles of neural information processing."

Speaking at the Phoenix Convention Center, Smith said the human brain is capable of extremely accurate sensory perception, high-level reasoning, and problem-solving, as well as driving complex motor activity. Among its impressive features are its efficiency, its flexibility in supporting a wide variety of cognitive functions, and its ability to learn dynamically, quickly, and concurrently with its operation. Even today, the brain "far exceeds anything conventional machine learning has achieved."

Understanding the computing paradigms used in the brain's neocortex is a computer architecture research problem of great practical and scientific importance, but as Smith noted, it is an unconventional research problem as it begins with the end-product (an extremely efficient biological computing engine with great capabilities) and works back to those underlying paradigms. While the task is daunting, Smith said, a roadmap for achieving it "can be reduced to exploring natural layers of abstraction."

Smith offered some analysis of the physical architecture of the brain's neocortex, delving into the structure and capabilities of biological neurons, since "physical architecture probably corresponds to functional architecture."

He discussed the connection between architecture and abstraction, since "engineering highly complex systems requires abstraction." Smith said conventional computer architectures contain many levels of abstraction, such as the fundamental abstraction connecting electrical circuits to logic gates, and the fundamental abstraction connecting hardware to software.

Smith offered both long-term and short-term roadmaps, which consider the basic elements making up the neocortical computing architecture, including temporal coding as compared with rate coding, as well as temporal neural networks (TNNs), which he described as "feedforward networks of model neurons."
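To make the coding distinction concrete, here is a minimal sketch, not drawn from Smith's talk and with purely illustrative numbers, of the same stimulus represented both ways: rate coding carries information in how many spikes occur within a window, while temporal coding carries it in how early the first spike arrives.

```python
import numpy as np

# Hypothetical normalized stimulus intensities for three input lines.
intensities = np.array([0.9, 0.2, 0.6])

# Rate coding: information is the number of spikes in a fixed window;
# stronger inputs spike more often.
max_spikes_per_window = 50
rate_code = np.round(intensities * max_spikes_per_window).astype(int)
print("rate code (spike counts):", rate_code)

# Temporal coding: information is when the first spike occurs;
# stronger inputs spike earlier.
t_max_ms = 20.0
temporal_code = np.round((1.0 - intensities) * t_max_ms, 1)
print("temporal code (first-spike times, ms):", temporal_code)
```

In the temporal-coding view that TNNs build on, a single early spike can convey what a rate code needs a whole window of spikes to express, which is part of the efficiency argument.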

He used neural network taxonomy to describe the primary goal of the research: "a computing paradigm that learns in an unsupervised, continual, fast, and energy-efficient way." That, Smith said, is what separates his research from what he described as "the vast majority of 'Spiking Neural Network' research."

He also considered the role of "Bulk Inhibition," in which "inhibitory neurons act en masse over a local volume of neurons" to create, in effect, a "blanket" of inhibition. He noted that a few inhibitory neurons can control many excitatory neurons. Either way, he said, this should be modeled as "a parameterized Winner-Take-All inhibition."
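As an illustration of how such blanket inhibition might be modeled, the following sketch implements a generic k-winner-take-all over a local group of neurons; the function name, the parameter k, and the example values are assumptions for illustration, not taken from Smith's slides.

```python
import numpy as np

def wta_inhibition(spike_times, k=1):
    """Blanket inhibition modeled as a parameterized winner-take-all.

    spike_times: first-spike times of a local group of excitatory neurons,
    with np.inf meaning "did not spike." Only the k earliest spikes survive;
    the rest are suppressed, as if a few inhibitory neurons silenced the
    whole local volume once the winners had fired.
    """
    spike_times = np.asarray(spike_times, dtype=float)
    inhibited = np.full_like(spike_times, np.inf)   # start with everyone silenced
    for idx in np.argsort(spike_times)[:k]:         # earliest candidates first
        if np.isfinite(spike_times[idx]):           # only real spikes can win
            inhibited[idx] = spike_times[idx]
    return inhibited

# With k=1, only the neuron that fired at t=3.0 ms keeps its spike.
print(wta_inhibition([7.0, 3.0, np.inf, 5.0], k=1))
```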

Smith also identified Spike Timing Dependent Plasticity (STDP) as "where the magic is." He explained that in STDP, each synapse updates its weight based on its current weight and the local relationship between spike times.
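The following sketch conveys the flavor of such a local, unsupervised update; the specific potentiation and depression terms, step size, and bounds are assumptions for illustration, not Smith's actual rule.

```python
def stdp_update(weight, t_pre, t_post, mu=0.05, w_max=1.0):
    """One illustrative STDP step (the rates and bounds are assumed values).

    The update is local: it uses only the synapse's current weight and the
    relative timing of its input (presynaptic) and output (postsynaptic)
    spikes. A causal pairing (input before output) strengthens the synapse;
    an anti-causal pairing weakens it, with the step scaled so the weight
    stays between 0 and w_max.
    """
    if t_pre <= t_post:
        return weight + mu * (w_max - weight)   # potentiate toward w_max
    return weight - mu * weight                 # depress toward 0

w = 0.5
w = stdp_update(w, t_pre=3.0, t_post=7.0)   # input spike preceded output spike
print(round(w, 3))                           # 0.525: the synapse strengthened
```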

After citing a "Pantheon of Neuroscience Architects," theoretical neuroscientists who have been working to develop brain-based computing paradigms for more than two decades (including Simon J. Thorpe, Wolfgang Maass, and Sander Bohte), Smith advised the many students in attendance to consider pursuing research in biological computing. He noted, for example, that the body of literature on TNNs is relatively small, and development of TNNs is "not very far along, so there isn't a lot of stuff to learn."

In addition, Smith said, research in this area has "low computational requirements," requiring only a high-end desktop computer. In sum, he said, "it is possible to get up to speed in just a few months at most."

Lawrence M. Fisher is Senior Editor/News for ACM magazines.
