Computer science is getting stunningly close to granting the wish of the Scarecrow, not to mention the needs of the modern soldier.
The Pentagon has long sought what the Wizard of Oz could not manufacture: a brain, or at least an electronic cognition machine that operates as closely as possible to the speed and efficiency of the human cortex.
A coalition of IBM’s research institutes and several universities and government labs has delivered a preliminary answer to that request: a 5.4-billion transistor chip with 1 million programmable neurons and 256 million synapses. The TrueNorth chip is the size of a postage stamp and is more than 1,000 times as energy efficient as a conventional chip, according to a study published online Thursday in the journal Science.
Don’t expect to see the tiny supercomputer on your smartphone anytime soon, although the lead researcher said his team is gaining momentum in that direction. He envisions a world populated with sensors that can process data at brain-like speeds, serving such roles as guides for the blind or instant detectors of industrial toxins.
“The impossible has become possible, and the next step is to make possible real, in terms of commercial applications,” said Dharmendra Modha, head of the cognitive computing group at IBM Almaden Research Center.
Modha and many others have been metaphorically whistling “If I only had a brain” for decades. That’s because, for all the exponential advances in processing speed, materials and manufacturing, digital computing relies on architecture rooted in the 1940s. It has a well-known “bottleneck” between the processor and memory, named for the architect himself, John von Neumann.
Supercomputers that muscle past the von Neumann bottleneck have accomplished stunning feats, including defeating a “Jeopardy!” champion. But their energy requirements can rival those of small municipalities, and they have grown far larger than the room-sized calculating machines of computing’s infancy.
The human brain, meanwhile, uses roughly 20 watts and occupies the same volume as two cylinders of an old Harley-Davidson motorcycle (74 cubic inches).
“We have instrumented the planet with cameras, microphones, smartphones, a variety of sensors and the data is coming at us fast and furious,” Modha said. “Asking today’s computers to understand this sensory tsunami is architecturally very, very expensive.”
Multiple efforts to mimic the architecture of the human brain in silicon have been underway for several years. This one, dubbed SyNAPSE, received $53.5 million from the Defense Advanced Research Projects Agency, or DARPA. In 2011, the project unveiled a minuscule “core” of 256 neurons and 262,000 synapses. TrueNorth tiles those cores into a system that can be scaled up virtually without limit, mimicking the way interconnected neurons relay information via “spikes” of activity.
Like the brain, TrueNorth is event-driven. It conserves energy by doing only what is necessary to the task at the time it’s necessary, and no more — unlike conventional processors.
“It doesn’t have to run all the time,” Modha said. “It’s very parsimonious, like nature.”
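The event-driven idea can be illustrated with a toy simulation. The sketch below is not IBM’s TrueNorth design; the neuron model, threshold, and weights are hypothetical, chosen only to show how work happens solely when a spike arrives, rather than on every tick of a clock.

```python
# Toy event-driven spiking-neuron sketch (illustrative only, not TrueNorth).
from collections import deque

THRESHOLD = 1.0  # hypothetical firing threshold

class Neuron:
    def __init__(self, name, targets=(), weight=1.0):
        self.name = name
        self.potential = 0.0        # accumulated input
        self.targets = list(targets)  # downstream neurons
        self.weight = weight        # strength of outgoing spikes

    def receive(self, amount):
        """Integrate an incoming spike; return outgoing spike events if we fire."""
        self.potential += amount
        if self.potential >= THRESHOLD:
            self.potential = 0.0    # reset after firing
            return [(t, self.weight) for t in self.targets]
        return []

def run(initial_events):
    # Event-driven loop: a neuron does work only when a spike reaches it,
    # unlike a clocked processor that updates every unit on every cycle.
    queue = deque(initial_events)
    fired = []
    while queue:
        neuron, amount = queue.popleft()
        out = neuron.receive(amount)
        if out or neuron.potential == 0.0 and not out and False:
            pass
        if neuron.potential == 0.0 and out is not None and out != [] or out:
            pass
        if out or (neuron.potential == 0.0 and out == [] and False):
            pass
        if out:
            fired.append(neuron.name)
            queue.extend(out)
        elif neuron.potential == 0.0 and not out:
            # Fired but has no downstream targets; still record the spike.
            fired.append(neuron.name)
    return fired

# Two sub-threshold inputs push "a" over threshold; its spike then fires "b".
b = Neuron("b")
a = Neuron("a", targets=[b])
print(run([(a, 0.6), (a, 0.6)]))
```

Idle neurons consume no simulation work at all here, which is the parsimony Modha describes: the loop touches only neurons that actually receive a spike.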
Modha said he was holding one of the prototype chips in the palm of his hand, and his excitement was such that he dropped it. He seemed unperturbed. Then again, he was so confident of the team’s design, he put its manufacture on a one-month hiatus and promised a $1,000 bottle of champagne to any member who could suss out a flaw in the design.
“No one claimed it; we had cheap wine for everyone after a month,” said Modha. “The chip came back from fabrication and worked flawlessly.”
Researchers tested the chip by running a program that detects and identifies people and vehicles as they move through a complicated environment — namely, Stanford University’s campus. It passed, and did so while consuming a fraction of the energy required by racks of conventional supercomputer hardware.
That’s what interests DARPA, which seemed pleased Thursday with its investment so far:
The chip “could give unmanned aircraft or robotic ground systems with limited power budgets a more refined perception of the environment, distinguishing threats more accurately and reducing the burden on system operators,” said Gill Pratt, DARPA program manager.
Think: drones with brains.
Terrence Sejnowski, head of the computational neurobiology lab at Salk Institute for Biological Studies, said the chip could prove invaluable to researchers. “The future is finding a path to low-power computing that solves problems in sensing and moving — what we do so well, and digital computers do so awkwardly.”
Other critics, however, noted that the chip has yet to be tested in situations such as those confronted on a battlefield, and some questioned whether neurological architecture will ever replace more conventional graphics processing units in consumer-level devices.
Modha cautioned that his team is not aiming to create an artificial brain, nor to replace von Neumann computers. One is impossible, the other impractical, he said.
And the ability to learn by adapting and changing structure and function remains a purely cerebral talent that has not been fully tested in the new chip. Still, Modha said that capacity “is on our horizon.”