New Form of Computing Leads to Solutions in Machine Learning

February 12, 2014

M. Alexander Nugent Consulting of Santa Fe, New Mexico, a private R&D company, announces the publication of Alex Nugent and Timothy Molter's PLOS ONE paper "AHaH Computing – From Metastable Switches to Attractors to Machine Learning". The paper describes a new form of computing based on the attractor dynamics of dissipative systems and details a path from memristor-based circuits to foundational machine learning functions.

Santa Fe, New Mexico (PRWEB) February 12, 2014

A new form of computing based on the attractor dynamics of dissipative systems has been shown to lead to solutions in machine learning and universal logic. In the newly published PLOS ONE paper “AHaH Computing—From Metastable Switches to Attractors to Machine Learning”, authors Alex Nugent and Timothy Molter detail a path from memristor-based circuits to foundational machine learning functions such as pattern classification, prediction, clustering, combinatorial optimization and robotic arm actuation. The main aim of the research is to better understand how nature utilizes the laws of thermodynamics and self-organization to compute. This knowledge is being directed toward the creation of a new type of adaptive neural processing unit (NPU) called “Thermodynamic RAM”.

Nugent and Molter demonstrate that the AHaH Node is a computationally universal building block that can serve as the foundation for a new adaptive computing substrate that meshes memory and processing. Because it addresses the von Neumann bottleneck, this approach has significant implications for power and space efficiency.

Although the so-called "von Neumann bottleneck" does not noticeably affect traditional programs such as word processors and spreadsheets, it becomes devastating when one attempts to build large-scale adaptive learning systems. Molter and Nugent have termed this the "Adaptive Power Problem". A recent example: scientists in Japan simulated 1% of a human-brain-scale network on the K supercomputer, the fourth most powerful computer in the world. The effort required 705,024 processor cores and 1.4 million GB of RAM, and took forty minutes to simulate just one second of brain activity. At 15 watts per core, a full-scale simulation would have consumed over a billion watts. Compare this to a real human brain, which runs on roughly 10 watts, and the scale of the problem becomes clear.
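A quick back-of-the-envelope check of those figures (the core count and per-core wattage come from the text above; scaling the 1% simulation to a full brain by a factor of 100 assumes linear scaling, which is an assumption of this sketch):

```python
# Sanity check of the power figures quoted above.
cores = 705_024        # K supercomputer cores used for the 1% simulation
watts_per_core = 15    # per-core power figure from the text
scale_up = 100         # full brain = 100x the 1% network (linear scaling assumed)

full_scale_watts = cores * watts_per_core * scale_up
print(f"{full_scale_watts:,} W")  # 1,057,536,000 W -- over a billion watts

brain_watts = 10
ratio = full_scale_watts / brain_watts
print(f"roughly {ratio:,.0f}x the power of a human brain")
```

The arithmetic bears out the claim: over a billion watts, about a hundred million times the brain's power budget.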

As Nugent explains: "It's a problem that has to be solved. Just because you have a good machine learning algorithm does not actually mean you have solved the problem. If your algorithm is running on traditional computing architecture then it will be constrained by the bottleneck and the applications are going to be severely limited. If you crack the adaptive power problem, the world of machine learning starts to look very, very different."

The proposed chip consists of a hybrid digital and analog architecture using serially connected memristors as synapses, which are activated in parallel across a fractal backbone that enables the synapses to interact and adapt according to the AHaH plasticity rule. A first prototype of the NPU is currently being pursued, and an open-source emulator for application development will be released in late June 2014.
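To make the idea concrete, here is a minimal toy sketch, not the authors' circuit or algorithm, of one way an AHaH-style node is often described: each synaptic weight is the difference in conductance between a pair of serially connected memristors, and a feedback phase reinforces the sign of the node's output so that its state settles into an attractor. All class and variable names here are illustrative assumptions.

```python
import random

class DifferentialSynapse:
    """Toy synapse: weight = difference of two memristor conductances."""
    def __init__(self):
        # ga, gb: conductances of the two serially connected memristors
        self.ga = random.uniform(0.4, 0.6)
        self.gb = random.uniform(0.4, 0.6)

    @property
    def weight(self):
        return self.ga - self.gb

class AHaHNodeSketch:
    """Minimal attractor-style node: read, then reinforce the output's sign."""
    def __init__(self, n_inputs, lr=0.01):
        random.seed(0)  # deterministic toy example
        self.synapses = [DifferentialSynapse() for _ in range(n_inputs)]
        self.lr = lr

    def read(self, x):
        # Read phase: weighted sum over active inputs
        return sum(s.weight * xi for s, xi in zip(self.synapses, x))

    def update(self, x, y):
        # Feedback phase: nudge each active synapse toward sign(y),
        # driving repeated inputs toward a stable (attractor) state.
        for s, xi in zip(self.synapses, x):
            if xi:
                if y >= 0:
                    s.ga += self.lr * (1.0 - s.ga)
                else:
                    s.gb += self.lr * (1.0 - s.gb)

node = AHaHNodeSketch(4)
pattern = [1, 1, 0, 0]
for _ in range(200):
    y = node.read(pattern)
    node.update(pattern, y)
# After repeated exposure the output on this pattern saturates to a
# consistent sign: the node has fallen into an attractor state.
```

The sketch captures only the flavor of the attractor dynamics described in the paper; the actual AHaH plasticity rule and circuit operate on analog voltages across a memristor network.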

Nugent continues: "We have suspected the AHaH plasticity rule was a building block for adaptive machine learning for a number of years now, but only recently have we accumulated enough evidence to be confident…I think it really hit me when I saw the multi-jointed robotic arm moving to grab the ball. At that point we had already demonstrated classification, prediction, clustering, combinatorial optimization and universal logic. That's the point when I realized we just might have a viable building block."

Molter adds: "Developing this technology with Alex has opened my eyes and given me insight into how Nature may accomplish so many seemingly complicated feats. What it boils down to is a simple rule driven by thermodynamics. It could illuminate something about what life is, why fractal flow systems appear again and again everywhere you look, evolution, economics and more. All these systems automatically arrange their structure to maximize energy dissipation and increase entropy."

For the original version on PRWeb visit: http://www.prweb.com/releases/2014/02/prweb11573006.htm

