August 8, 2013

IBM Develops Language For Brain-Like Computer

Michael Harper for Your Universe Online

Though modern computers can churn through piles of data and power much of our world, they're nowhere near as powerful as the human brain. But for over five years, IBM, in partnership with US defense agency DARPA and others, has been hard at work trying to create a computer chip that thinks like a brain.

Now they say they've moved one step forward in their aim to develop brain-like computers by creating a computer programming language inspired by the human brain. Building a new kind of computer requires a new kind of software, so IBM first had to create a new language. When the new chips are complete, Big Blue says they'll even be designed like a brain, complete with simulated neurons, axons and synapses all firing at different times.

Once this programming language is paired with the new chips, devices could become truly "smart," allowing them to effectively think on the fly and make real-life decisions based on ever-changing data rather than churning through a set of algorithms and numbers to arrive at one answer.

"It's a very modest goal - it's to build a brain-like computer," Dharmendra Modha, principal investigator and senior manager at IBM Research, told GigaOm.

"We have developed a whole new architecture, so we can't use the language from the previous era. We had to develop a new programming model."

Though the hardware that uses this new language exists only as a model on IBM's Blue Gene supercomputer, it is structured like a simulated human brain: 256 processors act as "neurons," which communicate with 256 cores of memory, or "axons." A total of 64,000 connections (or synapses) between the processors and memory keeps the cores talking and working together to handle complex algorithms and programs.
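To make the dimensions concrete, here is a toy sketch of a spiking network with the shape the article describes: 256 "neuron" units receiving input from 256 "axon" lines over a 256-by-256 grid of binary synaptic connections. The leaky integrate-and-fire update rule, the threshold and leak constants, and all names are assumptions for illustration only, not IBM's actual chip model.

```python
import numpy as np

N_NEURONS, N_AXONS = 256, 256

rng = np.random.default_rng(0)
# Binary synapse grid: synapses[i, j] == 1 means axon j feeds neuron i.
synapses = rng.integers(0, 2, size=(N_NEURONS, N_AXONS))

potentials = np.zeros(N_NEURONS)   # membrane potential per neuron
THRESHOLD, LEAK = 10.0, 1.0        # assumed firing threshold and leak

def step(axon_spikes):
    """Advance the network one tick given a binary vector of axon spikes."""
    global potentials
    potentials = potentials + synapses @ axon_spikes   # integrate input
    potentials = np.maximum(potentials - LEAK, 0.0)    # leak toward zero
    fired = potentials >= THRESHOLD                    # who crossed threshold
    potentials[fired] = 0.0                            # reset fired neurons
    return fired

spikes = rng.integers(0, 2, size=N_AXONS)
fired = step(spikes)
print(f"{int(fired.sum())} of {N_NEURONS} neurons fired this tick")
```

Note that unlike a conventional program, nothing here computes a single answer; the network's state simply evolves tick by tick as spikes arrive, which is the event-driven style the article contrasts with churning through a fixed algorithm.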

The programming language IBM has created is focused on "corelets," or an abstraction of the synaptic cores. According to the IBM press release: "Each corelet represents a complete blueprint of a network of neurosynaptic cores that specifies a base-level function."

The company also says the inner workings of the corelets are hidden away so that developers can only see and use the inputs and outputs. This, says IBM, will allow the developers to focus more on what they want their program to do as opposed to how they want the program to do it. Corelets can also be combined to give developers more power should they need it.
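The corelet idea described above, a black box exposing only inputs and outputs, with corelets combinable into larger ones, can be sketched roughly as follows. The `Corelet` class, its methods, and the `compose` helper are invented here for illustration; IBM's actual corelet language is not published in this form.

```python
class Corelet:
    """A black-box unit: visible inputs/outputs, hidden internals."""

    def __init__(self, name, inputs, outputs, fn):
        self.name = name
        self.inputs = list(inputs)    # visible interface
        self.outputs = list(outputs)  # visible interface
        self._fn = fn                 # hidden internal network

    def run(self, **kwargs):
        missing = set(self.inputs) - set(kwargs)
        if missing:
            raise ValueError(f"missing inputs: {missing}")
        return self._fn(**kwargs)

def compose(first, second, name):
    """Chain two corelets: first's outputs feed second's inputs."""
    def fn(**kwargs):
        return second.run(**first.run(**kwargs))
    return Corelet(name, first.inputs, second.outputs, fn)

# Two tiny corelets combined into a bigger one, as developers might do
# when they need more power:
double = Corelet("double", ["x"], ["y"], lambda x: {"y": 2 * x})
inc = Corelet("inc", ["y"], ["z"], lambda y: {"z": y + 1})
pipeline = compose(double, inc, "double_then_inc")
print(pipeline.run(x=5))  # -> {'z': 11}
```

The point of the design is that a developer using `pipeline` never sees how `double` or `inc` work internally, only the named inputs and outputs, which matches IBM's stated goal of letting programmers focus on what a program should do rather than how.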

In addition to creating this new language, IBM has also developed an environment in which programmers can test their software. The simulator provides developers with a "multi-threaded, massively parallel" piece of software that replicates the cognitive computing architecture. Developers can also draw on sample software IBM has already written for the future chips and software design. For instance, this sample software includes an application that can "look" at a piece of sheet music and determine whether it's a Bach prelude or a Beethoven symphony.

In a video, IBM researcher Bill Risk shows off the "Tumbleweed," a sphere decked out with cameras and sensors. This autonomous prototype computer could be used in search and rescue missions, driving itself as it looks for victims and alerting rescue crews to their location. Risk also talked about a device called the "discussion flower," which boasts a series of microphones and video cameras to not only capture and transcribe conversations and meetings but even react to them. If a conversation is animated, for example, it will open and bloom just like a flower. If there's lots of dead air and monotone voices, the flower will close.