Intel unveiled its Loihi 2 chip on Thursday, the second generation of a processor that marries conventional electronics with the architecture of the human brain in an effort to spur new progress in the computing industry.
Loihi 2, an example of a technology called neuromorphic computing, is about 10 times faster than its predecessor, according to Intel. The speed improvement is the result of an eightfold increase in the number of digital neurons, the chip's equivalent of human brain cells, which mimic the way brains handle information. The chip is also more programmable, letting researchers tackle a wider range of computing tasks.
The chip is built with a preproduction version of the Intel 4 manufacturing process, an advanced method the company plans to use for mainstream chips arriving in 2023. The Intel 4 process can etch electronics more densely onto a chip, a crucial advantage when Intel needs to pack a million digital neurons into a chip measuring 30 square millimeters.
Loihi chips are particularly good at rapidly spotting sensory input like gestures, sounds, and even smells, said Mike Davies, leader of the Intel Labs group that developed Loihi. Some experiments have focused on artificial skin that could give robots a better sense of touch. “We can detect slippage if a robot hand is picking up a cup,” Davies said.
Neuromorphic computing differs from artificial intelligence, a revolutionary computer technology based more loosely on how brains learn and respond, because it focuses more on the physical characteristics of human gray matter.
It differs from conventional chips in profound ways. For example, Loihi 2 stores data in tiny amounts spread across its mesh of neurons, not in a big bank of traditional computer memory, and it doesn’t have a central clock ticking to synchronize computing steps on the chip.
You won’t see Loihi 2 in your phone or laptop. Instead, it’s geared for researchers at automakers, national labs, and universities. Germany’s Deutsche Bahn railway network is testing how well it can optimize train schedules. The processor excels at tasks such as processing sound or detecting hand gestures, but with vastly lower power consumption than conventional chips, Davies said.
Low power use is a characteristic of biological gray matter, too. Human brains are made of about 80 billion cells called neurons, connected into elaborate electrical signaling networks. When enough input signals reach an individual neuron, it fires its own signal to other neurons. The topology of the connections and flow of signals lets us do everything from recognizing Abraham Lincoln to riding a bicycle. Learning is the process of establishing and reinforcing those connections.
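The threshold-and-fire behavior described above can be sketched in a few lines of code. This is a toy integrate-and-fire model, not Intel's actual design; the class name, threshold, and leak values are invented for illustration.

```python
class ToyNeuron:
    """Minimal integrate-and-fire neuron; names and values are illustrative."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # how much input must accumulate before firing
        self.leak = leak            # per-step decay, loosely mimicking a leaky membrane
        self.potential = 0.0

    def step(self, incoming):
        """Accumulate incoming signals for one time step; return True on a spike."""
        self.potential = self.potential * self.leak + sum(incoming)
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing, as biological neurons do
            return True
        return False

neuron = ToyNeuron()
print(neuron.step([0.4]))        # weak input, no spike: False
print(neuron.step([0.4, 0.5]))   # accumulated input crosses the threshold: True
```

The key property is that the neuron does nothing visible until enough input arrives, which is what lets networks of such units sit quietly at low power.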
Intel isn’t the only one pursuing the idea. The Human Brain Project in Europe includes neuromorphic computing in its work. The way blood courses through the brain inspired IBM to power and cool chips with liquid, as in a flow battery. Samsung used IBM’s neuromorphic TrueNorth chip to re-create vision.
Intel’s chip is made of a million digital neurons that can be connected in any number of ways, a digital tabula rasa. Getting it to work requires configuring the proper connections between neurons. Actual processing occurs when input data reaches the chip, triggering a spike of activity that flows through the interconnected neurons and eventually produces an output. Each neuron is connected to 100 others on average, though some may reach as many as 10,000.
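The data flow described above can be sketched as a tiny event-driven network: connections are configured first, then an input spike cascades through the units until an output fires. The network shape, weights, and threshold here are invented for illustration; real Loihi networks are far larger and are configured through Intel's tools.

```python
THRESHOLD = 1.0

# Adjacency list: neuron -> list of (target, connection_weight) pairs,
# standing in for the configurable connections between digital neurons.
connections = {
    "in":      [("hidden1", 1.2), ("hidden2", 1.1)],
    "hidden1": [("out", 0.6)],
    "hidden2": [("out", 0.6)],
    "out":     [],
}

def propagate(start):
    """Inject a spike at `start` and follow the cascade until it dies out."""
    potential = {name: 0.0 for name in connections}
    fired = []
    queue = [start]                  # event-driven: only active neurons do work
    while queue:
        neuron = queue.pop(0)
        fired.append(neuron)
        for target, weight in connections[neuron]:
            potential[target] += weight
            if potential[target] >= THRESHOLD:
                potential[target] = 0.0
                queue.append(target)
    return fired

print(propagate("in"))  # ['in', 'hidden1', 'hidden2', 'out']
```

Note that "out" only fires after input from both hidden units accumulates, mirroring how a real neuron waits for enough incoming signals, and that neurons outside the cascade consume no work at all.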
This flow-like design means the chip requires very little power when idle and can process data very quickly on demand, Davies said.
Fewer but smarter neurons
A million neurons in one chip are far from the billions in a human brain, but Intel is effectively trying to make each neuron smarter than a biological brain cell. For example, in biological brains, electrical signals are either fully on or fully off. In Loihi chips, Intel can assign a different strength to each signal, increasing processing sophistication, Davies said.
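The contrast between all-or-nothing biological spikes and Loihi 2's graded signals can be shown with a small sketch. The function names, weight, and spike values are invented for illustration.

```python
def receive_binary(spikes, weight=0.5):
    """All-or-nothing signaling: every spike contributes the same amount."""
    return sum(weight for s in spikes if s)

def receive_graded(spikes, weight=0.5):
    """Graded signaling: each spike's strength scales its contribution."""
    return sum(weight * s for s in spikes)

binary_input = [1, 1, 1]          # three identical all-or-nothing spikes
graded_input = [0.2, 1.0, 1.8]    # same count, but each spike carries a strength

print(receive_binary(binary_input))  # 1.5
print(receive_graded(graded_input))  # also 1.5 in total, but each spike
                                     # carried extra information in its magnitude
```

With binary spikes, conveying a strength requires sending many spikes over time; a graded spike packs that information into a single event, which is one way a digital neuron can be "smarter" than its biological counterpart.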
The chip can be connected to others, too, for greater scale. One improvement over the first Loihi is better networking that shortens the communication pathways that link neurons.
“The brain achieves accuracy and reliability through tremendous redundancy,” Davies said. “The hope is indeed we can solve some of the same problems in a more economical way.”

Source: CNET