World-first computer chip inspired by the human brain

Monday, 11 August, 2014


Scientists from IBM have unveiled the first neurosynaptic computer chip to achieve an unprecedented scale of 1 million programmable spiking neurons, 256 million programmable synapses and 46 billion synaptic operations per second per watt. The culmination of almost a decade of research and development, the chip marks a significant step towards bringing cognitive computers to society.

Writing in the journal Science, the researchers explained that virtually all computer chips made today rely on the von Neumann architecture, which has been used almost universally since 1946. While this architecture works well for crunching numbers, it is far less efficient at tasks that people and animals perform effortlessly, such as perception and pattern recognition, audio processing and motor control.

“Inspired by the brain’s structure,” the authors said, “we have developed an efficient, scalable and flexible non-von Neumann architecture that leverages contemporary silicon technology.

“The architecture is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification.”

The project has been funded by the Defense Advanced Research Projects Agency (DARPA) since 2008 as part of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program. The program was created to speed up the development of a brain-inspired chip that could perform difficult perception and control tasks while at the same time achieving significant energy savings.

The cognitive chip architecture has an on-chip, two-dimensional mesh network of 4096 digital, distributed neurosynaptic cores, where each core module integrates memory, computation and communication, and operates in an event-driven, parallel and fault-tolerant fashion. To enable system scaling beyond single-chip boundaries, adjacent chips, when tiled, can seamlessly connect to each other, building a foundation for future neurosynaptic supercomputers.
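To make the event-driven core idea concrete, the following sketch simulates a single core in Python. It is illustrative only and not IBM's actual design: the leaky integrate-and-fire neuron model, the 256 x 256 crossbar size and all parameter values are simplifying assumptions, chosen only to show how memory (the synapse crossbar), computation (the neurons) and communication (spike events) can sit together in one event-driven module.

```python
import numpy as np
from collections import deque

# Minimal, illustrative sketch of one event-driven neurosynaptic core.
# NOT IBM's TrueNorth design: neuron model, sizes and parameters are
# simplified assumptions used only to show the event-driven idea.

AXONS, NEURONS = 256, 256                        # one axons-x-neurons crossbar

rng = np.random.default_rng(0)
synapses = rng.random((AXONS, NEURONS)) < 0.1    # sparse binary crossbar (memory)
potential = np.zeros(NEURONS)                    # membrane potentials (computation)
THRESHOLD, LEAK, WEIGHT = 2.0, 0.1, 1.0          # assumed neuron parameters

incoming = deque([3, 17, 3, 200, 17])            # spike events arriving on axons
outgoing = []                                    # spike events routed off-core

for t in range(5):                               # one loop iteration = one time step
    # Event-driven phase: synaptic work happens only when a spike arrives.
    while incoming:
        axon = incoming.popleft()
        potential += WEIGHT * synapses[axon]
    # Once per step: leak, threshold and fire.
    potential = np.clip(potential - LEAK, 0.0, None)
    fired = np.nonzero(potential >= THRESHOLD)[0]
    potential[fired] = 0.0
    outgoing.extend((t, int(n)) for n in fired)  # communication to other cores
    print(f"tick {t}: {len(fired)} neurons fired")
```

In the real chip, the outgoing spike events would be routed across the on-chip mesh (and across chip boundaries when chips are tiled) to the input axons of other cores; the sketch simply collects them in a list.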

The chip was fabricated using Samsung’s 28 nm process technology, which provides dense on-chip memory and low-leakage transistors. The event-driven circuit elements of the chip utilise an asynchronous design methodology developed at Cornell Tech and refined in collaboration with IBM. At 5.4 billion transistors, the fully functional, production-scale chip is currently one of the largest CMOS chips ever built; yet while running at biological real time, it consumes only 63 mW - roughly the power delivered by a hearing-aid battery.
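As a rough back-of-envelope check (assuming the quoted 46 billion synaptic operations per second per watt and the 63 mW draw refer to the same real-time workload), the implied throughput at that power level can be computed directly:

```python
# Back-of-envelope check, assuming the quoted efficiency and power
# figures refer to the same real-time workload.
sops_per_watt = 46e9    # 46 billion synaptic operations per second per watt
power_watts = 63e-3     # 63 mW
print(f"{sops_per_watt * power_watts:.1e} synaptic operations per second")
# prints 2.9e+09, i.e. roughly 2.9 billion synaptic operations per second
```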

“IBM has broken new ground in the field of brain-inspired computers, in terms of a radically new architecture, unprecedented scale, unparalleled power/area/speed efficiency, boundless scalability and innovative design techniques,” said Dr Dharmendra S Modha, IBM Fellow and IBM Chief Scientist, Brain-Inspired Computing, IBM Research.

“These brain-inspired chips could transform mobility, via sensory and intelligent applications that can fit in the palm of your hand but without the need for Wi-Fi.”

The chip’s high energy efficiency makes it a candidate for defence applications such as mobile robots and remote sensors where electrical power is limited. According to DARPA Program Manager Gill Pratt, the chip “could give unmanned aircraft or robotic ground systems with limited power budgets a more refined perception of the environment, distinguishing threats more accurately and reducing the burden on system operators”.

Another potential application is neuroscience modelling. The large number of electronic neurons and synapses in each chip, and the ability to tile multiple chips, could lead to the development of complex, networked neuromorphic simulators for testing network models in neurobiology and deepening current understanding of brain function.

The chip is a component of the SyNAPSE Ecosystem - an end-to-end, vertically integrated ecosystem spanning a chip simulator, neuroscience data, supercomputing, neuron specification, programming paradigm, algorithms and applications, and prototype design models. The ecosystem supports all aspects of the programming cycle and signals a shift towards computers that take in varied kinds of sensory data, analyse and integrate real-time information in a context-dependent way, and deal with the ambiguity found in complex, real-world environments.

According to Cornell Tech Professor Rajit Manohar, “We are now a step closer to building a computer similar to our brain.”
