Last updated on April 24, 2014 at 5:49 EDT

Scientists Look To Neurons For New Computing Functionality

July 24, 2010

British researchers are working to develop innovative computers by mimicking the way neurons communicate, in hopes the approach will lead to improvements in visual and audio processing, BBC News reported on Friday.

Such enhancements could result in computers that can learn to see or to hear, instead of relying upon sensors.

The work is simultaneously helping researchers improve their understanding of how nerve cells operate.

‘Smart’ computing

Although artificial neural networks have been around for decades, they do not closely replicate true neurons.

However, the current project, coordinated by University of Plymouth computer scientist Dr. Thomas Wennekers, seeks to model the unique physiological way that neurons in a specific part of the brain communicate.

“We want to learn from biology to build future computers,” Dr. Wennekers told BBC News.

“The brain is much more complex than the neural networks that have been implemented so far.”

Preliminary work on the project has involved collecting data about neurons and how they are connected in a certain part of the brain.  Specifically, the researchers are working on the laminar microcircuitry of the neocortex, which is involved in higher brain functions such as hearing and sight.

The data collected so far has been used to create highly detailed simulations of groups of nerve cells and microcircuits of neurons that are distributed across larger scale structures such as the visual cortex.

“We build pretty detailed models of the visual cortex and study specific properties of the microcircuits,” Dr. Wennekers said.

“We’re working out which aspects are crucial for certain functional properties like object or word recognition,” he said, adding that he hopes the project will result in more than just improved sensory networks.

“It might lead to smart components that are intelligent,” he said.

“They may have added cognitive components such as memory and decision making,” he said, adding that someday computers may even be endowed with emotions.

“We’ll be computing in a completely different way.”

Hardware Challenges

While Dr. Wennekers and his team conduct their work primarily with software simulations, Professor Steve Furber is using neurons to produce innovative new hardware.

Known as ‘Spinnaker’, Professor Furber’s project is attempting to build a computer uniquely optimized to operate the way biological neural systems do.

The Spinnaker system, which uses ARM processors, simulates in hardware the workings of large numbers of neurons.

“We’ve got models of biological spiking neurons,” Professor Furber told BBC News.

“Neurons whose only communication with the rest of the world is that they go ping. When it goes ping it lobs a packet into a small computer network.”
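The "ping" Professor Furber describes is, in essence, an event-driven spiking neuron: it integrates input silently and its only output is a spike packet dropped onto a network. A minimal sketch of that idea, using a toy leaky integrate-and-fire model (the class, parameter values, and queue here are illustrative assumptions, not Spinnaker's actual API):

```python
from collections import deque

class SpikingNeuron:
    """Toy leaky integrate-and-fire neuron: its only communication with
    the outside world is a spike event ("ping") posted as a packet."""

    def __init__(self, neuron_id, threshold=1.0, leak=0.9):
        self.id = neuron_id
        self.potential = 0.0        # membrane potential
        self.threshold = threshold  # firing threshold
        self.leak = leak            # fraction of potential retained each step

    def step(self, input_current, network):
        # Integrate input with leak; fire and reset when threshold is crossed.
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0
            network.append(("spike", self.id))  # lob a packet into the network

network = deque()          # stands in for the small packet network
neuron = SpikingNeuron(neuron_id=7)
for t in range(5):
    neuron.step(0.4, network)  # constant drive; fires once threshold is reached

print(list(network))  # one spike packet: [('spike', 7)]
```

The point of the design is that nothing flows between neurons except these small spike packets, which is what makes the architecture amenable to a packet-switched hardware network.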

Each of Spinnaker’s ARM processors runs about 1,000 neuron models.  The current version of Spinnaker uses an eight-processor system, although the team is in the final stages of designing a chip with 18 ARM processors on board, 16 of which will model neurons, Furber said.

The final goal is a system that controls one billion neurons on a million ARM processors, he said.
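The quoted figures can be checked with back-of-envelope arithmetic (the variable names below are illustrative; the numbers are those reported in the article):

```python
# Figures quoted in the article
neurons_per_processor = 1_000       # neuron models run per ARM processor
target_processors = 1_000_000       # processors in the final system
modelling_cores_per_chip = 16       # of the 18 ARM cores per chip, 16 model neurons

total_neurons = neurons_per_processor * target_processors
chips_needed = target_processors // modelling_cores_per_chip

print(total_neurons)  # 1000000000 -- the one billion neurons cited
print(chips_needed)   # 62500 chips at 16 modelling cores each
```

So the stated goal of one billion neurons follows directly from a thousand neuron models on each of a million processors.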

“The primary objective is just to understand what’s happening in the biology.”

“Our understanding of processing in the brain is extremely thin.”

The researchers hope the simulation will lead to groundbreaking processing systems, and will allow them to gain new insights into the way that many computational elements can be networked together.

“The computer industry is faced with no future other than parallel,” Professor Furber said.

However, the industry lacks a fundamental understanding of how to get the most from all of those computational elements, he said.

The major problem is determining how to run the system without being overwhelmed by the management overhead of coordinating all of those processors, he said.

Spinnaker might offer a way to conquer some of these challenges as the individual elements will be far smaller than the massive processors now in use, and will self-organize to a certain degree.

It will also offer benefits in the form of lower power consumption.

“We think there’s a change in the game there,” Professor Furber said.
