Friday, August 26, 2011

IBM Reveals the Biggest Artificial Brain of All Time - Popular Mechanics

San Jose, Calif.--Scientists at IBM's Almaden research center have built the biggest artificial brain ever--a cell-by-cell simulation of the human visual cortex: 1.6 billion virtual neurons connected by 9 trillion synapses. This computer simulation, as large as a cat's brain, blows away the previous record--a simulated rat's brain with 55 million neurons--built by the same team two years ago.
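The scale figures are easy to sanity-check: 9 trillion synapses spread across 1.6 billion neurons works out to a few thousand connections per neuron, in line with estimates for biological cortex. A quick back-of-the-envelope script (the figures are the article's; the script itself is just illustrative):

```python
# Sanity check on the simulation's connectivity figures from the article.
neurons = 1.6e9    # virtual neurons in the simulation
synapses = 9e12    # virtual synapses connecting them

synapses_per_neuron = synapses / neurons
print(f"{synapses_per_neuron:,.0f} synapses per neuron")  # 5,625 synapses per neuron
```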


"This is a Hubble Telescope of the mind, a linear accelerator of the brain," says Dharmendra Modha, the Almaden computer scientist who will announce the feat at the Supercomputing 2009 conference in Portland, Ore. In other words, in the realm of computer science, the team's undertaking is grand.

The cortex, the wrinkly outer layer of the brain, performs most of the higher functions that make humans human, from recognizing faces and speech to choreographing the dozens of muscle contractions involved in a perfect tennis serve. It does this using a universal neural circuit called a microcolumn, repeated over and over. Modha hopes the simulation, assembled using neuroscience data from rats, cats, monkeys and humans, will help scientists better understand how the brain works--and, in particular, how the cortical microcolumn manages to perform such a wide range of tasks.

But deciphering the microcolumn can also help build better computers, Mars rovers and robots that are truly intelligent. By reverse engineering this cortical structure, Modha says, researchers could give machines the ability to interpret biological senses such as sight, hearing and touch. And artificial machine brains could process, intelligently, senses that don't currently exist in the natural world, such as radar and laser range-finding.

"Imagine peppering the entire surface of the ocean with pressure, temperature, humidity, wave height and turbidity sensors," Modha says. "Imagine streaming this data to a reverse-engineered cortex." In short, he envisions wiring the entire planet--transforming it into a virtual organism with the capacity to understand its own evolving patterns of weather, climate and ocean currents.

The simulation that Modha will unveil today is just a starting point. It lacks the neural patterning that develops as real brains mature. Neuroscientists believe that this complexity can only evolve through "embodied learning"--stumbling around in a physical body, in which every action has instant consequences that are experienced through senses such as touch and sight. As Anil Seth, a neuroscientist at the University of Sussex in Britain, puts it, "The brain wires itself."

Seth demonstrated this principle while at the Neurosciences Institute in San Diego using a brain simulation called Darwin. He embodied Darwin's 50,000 virtual neurons (about equal to the brain of a pond snail, or one-quarter of a fruit fly) in a wheeled robot. As Darwin wandered around, its virtual neurons rewired their connections to produce so-called hippocampal "place cells"--similar to neurons found in mammals--which helped it navigate. Scientists don't know how to program these place cells, but with embodied learning the cells emerge on their own.
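The "rewiring" that Seth describes can be illustrated with the simplest plasticity rule in neuroscience, Hebbian learning ("cells that fire together wire together"): connections between co-active neurons strengthen with experience rather than being programmed in. This is a generic toy sketch of that principle, not the Neurosciences Institute's actual Darwin model:

```python
def hebbian_step(weights, pre, post, lr=0.1):
    """Hebb's rule: strengthen the connection between every pair of
    co-active pre- and post-synaptic neurons by lr * pre * post."""
    for i in range(len(pre)):
        for j in range(len(post)):
            weights[i][j] += lr * pre[i] * post[j]
    return weights

# Two input neurons driving two output neurons. Repeatedly present an
# experience in which input 0 fires together with output 0; that one
# connection strengthens on its own, while the unused ones stay at zero.
w = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(10):
    w = hebbian_step(w, pre=[1, 0], post=[1, 0])

print(w)  # w[0][0] has grown toward 1.0; all other weights remain 0.0
```

The point of the sketch is that no one tells the network which connection matters; the structure emerges from activity, which is the essence of embodied learning.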

Paul Maglio, a cognitive scientist at Almaden, has similar plans for Modha's cortical simulation. He's building a virtual world for it to inhabit using software from the first-person-shooter video game "Unreal Tournament" and data from Mars. Besides topographic maps and aerial photos, Maglio plans to use rover-level imagery to create terrain with lifelike boulders and craters.

The video-game software provides a palette of several dozen robotic bodies for Modha's virtual cortex. Initially, it will use a simple wheeled robot to explore its world, driven by fundamental desires such as sustenance and survival. "It's got to like some things and not like other things," Maglio says. "Ultimately, it's going to want not to roll off the edges of cliffs."

Modha's billion-neuron virtual cortex is so massive that running it required one of the fastest supercomputers in the world--Dawn, a Blue Gene/P supercomputer at Lawrence Livermore National Laboratory (LLNL) in California.

Dawn hums and breathes inside an acre-size room on the second floor of the lab's Terascale Simulation Facility. Its 147,456 processors and 147,000 gigabytes of memory fill 10 rows of computer racks, woven together by miles of cable. Dawn devours a million watts of electricity through power cords as thick as a bouncer's wrists--racking up an annual power bill of $1 million. The roar of refrigeration fans fills the air: 6675 tons of air-conditioning hardware labor to dissipate Dawn's body heat, blowing 2.7 million cubic feet of chilled air through the room every minute.
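The article's power figures hang together: a steady one-megawatt draw at an industrial electricity rate of roughly 11 cents per kilowatt-hour (the rate is my assumption; the article gives only the total bill) comes to about $1 million a year:

```python
# Checking the stated $1M annual power bill for a 1 MW machine.
power_kw = 1_000           # 1 megawatt, from the article
hours_per_year = 24 * 365
rate_per_kwh = 0.114       # assumed $/kWh, chosen to match the stated bill

annual_cost = power_kw * hours_per_year * rate_per_kwh
print(f"${annual_cost:,.0f} per year")  # ≈ $1,000,000
```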

Dawn was installed earlier this year by the Department of Energy's National Nuclear Security Administration (NNSA), which conducts massive computer simulations to ensure the readiness of the nation's nuclear weapons arsenal. Modha's team worked with Dawn for a week before it was transitioned to NNSA's classified nuclear work. For all of its legendary computing power, Dawn still ran Modha's 1.6 billion neurons at only one-six-hundredth the speed of a living brain. A second simulation, with 1 billion neurons, ran faster--but still only at one-eighty-third of normal brain speed.
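Those slowdown factors translate into concrete wall-clock costs: at one-six-hundredth of real time, every second of simulated brain activity takes ten minutes of compute, and a full day of "brain time" would take over a year and a half. A quick check:

```python
# Wall-clock cost of the 1.6-billion-neuron run at 1/600 of real time.
slowdown = 600

minutes_per_brain_second = slowdown / 60
years_per_brain_day = slowdown / 365  # one brain-day costs 600 wall-clock days

print(minutes_per_brain_second)        # 10.0 minutes of compute per brain-second
print(round(years_per_brain_day, 1))   # 1.6 years of compute per brain-day
```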

These massive simulations are merely steps toward Modha's ultimate goal: simulating the entire human cortex, about 25 billion neurons, at full speed. To do that, he'll need to find 1000 times more computing power. At the rate that supercomputers have expanded over the last 20 years, that super-super computer could exist by 2019. "This is not just possible, it's inevitable," Modha says. "This will happen."
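The 2019 estimate follows from the historical growth rate of supercomputers rather than chip-level Moore's law: the fastest machines have gained roughly a thousandfold per decade, i.e. a doubling time of about a year. A hedged extrapolation (the one-year doubling time is my assumption, based on that historical trend):

```python
import math

# How long until a machine 1000x faster than Dawn exists, if the fastest
# supercomputers keep doubling in performance roughly once a year?
target_factor = 1000      # extra computing power needed (from the article)
doubling_years = 1.0      # assumed doubling time for top supercomputers

doublings = math.log2(target_factor)   # ~10 doublings
years_needed = doublings * doubling_years
print(2009 + round(years_needed))      # 2019
```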

But it won't be easy. "Business as usual won't get us there," says Mike McCoy, head of advanced simulation and computing at LLNL. Development of supercomputers in recent decades has ridden the wave of Moore's law: transistors shrank and the computing power of processor chips doubled every 18 months. But that wild ride is coming to an end. Transistors are now packed so densely on chips that the heat they generate can no longer be dissipated. To reduce heat, Dawn uses older, larger, 180-nanometer transistors that were developed 10 years ago--rather than the 45-nanometer transistors that are used in desktop computers today. And for the same reason, Dawn runs these transistors at a sluggish 850 megahertz--three times slower than today's desktop computers.

The supercomputer that Modha needs to simulate a whole cortex would also consume prohibitive amounts of power. "If you scale up current technology, this system might require between 100 megawatts and a gigawatt of power," says Horst Simon, a mathematician at nearby Lawrence Berkeley National Laboratory, who collaborated with Modha on the simulation. A gigawatt (a billion watts) is close to the 1.21 gigawatts that the mad scientist Emmett "Doc" Brown needed to operate his DeLorean time machine in the 1985 movie "Back to the Future." But Simon puts it more bluntly: "It would be a nuclear power plant," he says. The electricity alone would cost $1 billion per year.

The human brain, by comparison, survives on just 20 watts. Although supercomputer simulations are power-hungry, Modha hopes that the insights they provide will eventually pave the way to more elegant technology. With funding from DARPA (Defense Advanced Research Projects Agency), he's working with a far-flung team at five universities and four IBM labs to create a new computer chip that can mimic the cortex using far less power than a computer. "I'll have it ready for you within the next decade," he says. 
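The gap between Simon's worst-case projection and the biological brain quantifies why Modha's chip project matters: at a gigawatt, the machine would be tens of millions of times less power-efficient than the organ it simulates.

```python
# Efficiency gap between the projected machine and the human brain.
brain_watts = 20            # the brain's power budget, from the article
projected_watts = 1e9       # upper end of Simon's estimate (1 gigawatt)

gap = projected_watts / brain_watts
print(f"{gap:,.0f}x")       # 50,000,000x -- the brain's efficiency advantage
```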
source: http://www.popularmechanics.com/technology/engineering/extreme-machines/4337190?page=1
