"Westworld" is really coming! Scientists are trying to equip AI with a human brain.
The latest progress is led by US national laboratories. Scientists are attempting to turn science fiction into reality: creating a supercomputer occupying just two square meters, with a neuron count comparable to the human cerebral cortex.
Even more astonishing, calculations suggest that this neuromorphic computer could run 250,000 to 1,000,000 times faster than a biological brain while consuming only about 10 kilowatts, somewhat more than a household air conditioner. For a field wrestling with AI's growing energy demands, this would be a powerful shot in the arm.
Artificial intelligence is currently facing an "energy crisis." As technologies like large language models develop explosively, their staggering power consumption has become an unavoidable burden.
Predictions indicate that by 2027, the electricity cost of merely running these models could reach an astounding $25 trillion, approaching the entire US GDP for that year.
In contrast, the most powerful intelligent entity in nature, the human brain, runs on only about 20 watts, roughly the power of a household LED light bulb. Scientists can't help but wonder: can AI be as efficient as the human brain?
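To put those figures side by side, here is a back-of-the-envelope sketch using only the numbers quoted above; it is simple arithmetic, not a claim from any cited source:

```python
# Back-of-the-envelope energy comparison (illustrative figures only).
BRAIN_POWER_W = 20              # approximate human brain power draw
NEUROMORPHIC_POWER_W = 10_000   # the proposed machine's draw: ~10 kW

HOURS_PER_DAY = 24
brain_kwh_per_day = BRAIN_POWER_W * HOURS_PER_DAY / 1000            # 0.48 kWh
machine_kwh_per_day = NEUROMORPHIC_POWER_W * HOURS_PER_DAY / 1000   # 240 kWh

# The proposed machine would still draw 500x the brain's power,
# yet run hundreds of thousands of times faster.
ratio = NEUROMORPHIC_POWER_W / BRAIN_POWER_W
print(f"Brain: {brain_kwh_per_day} kWh/day; machine: {machine_kwh_per_day} kWh/day; ratio {ratio:.0f}x")
```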
The answer is: neuromorphic computing.
This cutting-edge technology, aimed at simulating the structure and operation of the human brain, is being regarded as a key direction for next-generation AI. One of its core objectives is to drive powerful intelligence with "light bulb-level" energy consumption.
Neuromorphic Computing: Learning from the Brain
In the human brain, approximately 86 billion complex neurons work together, forming a vast signal transmission network through 100 trillion synapses.
Inspired by the brain's structure and function, neuromorphic computing runs Spiking Neural Networks (SNNs), models that mimic biological neural circuits, on energy-efficient electronic and photonic hardware, aiming to integrate memory, processing, and learning into a unified design.
Its main characteristics include:
1. Event-Driven Communication: Circuits activate only when spikes or events occur, thereby reducing power consumption.
2. In-Memory Computing: Data processing occurs at the storage location to reduce transmission latency.
3. Adaptability: The system learns and evolves over time on its own, without the need for centralized updates.
4. Scalability: The architecture of neuromorphic systems allows for easy expansion, accommodating broader and more complex networks without significantly increasing resource requirements.
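The event-driven behavior in point 1 can be illustrated with a leaky integrate-and-fire (LIF) neuron, the basic unit most SNNs build on. This is a toy sketch with illustrative constants, not the model any particular chip implements:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a toy sketch of the
# "spiking" behavior that neuromorphic hardware realizes in silicon.
# All constants are illustrative, not tied to any real chip.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Integrate input over time; emit a spike (event) when the membrane
    potential crosses the threshold, then reset. Returns spike times."""
    v = 0.0
    spikes = []
    for t, i in enumerate(input_current):
        v = leak * v + i        # leaky integration of the input
        if v >= threshold:      # event-driven: output only on a spike
            spikes.append(t)
            v = 0.0             # reset after firing
    return spikes

# Constant weak input: the neuron fires sparsely, not at every step.
# Communication happens only at these discrete events.
spike_times = simulate_lif([0.3] * 20)
print(spike_times)
```

Between spikes the neuron produces no output at all, which is exactly what lets neuromorphic hardware leave most circuits idle most of the time.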
Unlike current AI models that rely on binary supercomputers for processing, neuromorphic computing can dynamically adjust based on its perception of the world, making it smarter, more flexible, and less susceptible to interference.
For example, when a tester wearing a T-shirt with a stop sign printed on it walked in front of an autonomous vehicle, the car controlled by traditional AI reacted by stopping because it couldn't discern the context.
In contrast, a neuromorphic computer processes information through feedback loops and context-driven verification. It can clearly determine that the stop sign is on the T-shirt, allowing the car to continue driving.
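As a purely hypothetical illustration of that kind of context check (invented logic, not how any real perception stack or neuromorphic system works), one could imagine discounting a sign detection that falls inside a pedestrian's bounding box:

```python
# Toy illustration of "context-driven verification" (hypothetical logic).
# Bounding boxes are (x1, y1, x2, y2) tuples in image coordinates.

def inside(inner, outer):
    """True if bounding box `inner` lies entirely within `outer`."""
    return (inner[0] >= outer[0] and inner[1] >= outer[1]
            and inner[2] <= outer[2] and inner[3] <= outer[3])

def should_brake(sign_box, person_boxes):
    """Ignore a stop sign whose box falls inside a pedestrian's box:
    it is probably printed on clothing, not posted at the roadside."""
    return not any(inside(sign_box, p) for p in person_boxes)

# Sign detected inside a pedestrian's bounding box: keep driving.
print(should_brake((40, 50, 60, 70), [(30, 20, 80, 90)]))
# Free-standing sign with no pedestrian around it: brake.
print(should_brake((5, 5, 25, 25), []))
```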
This difference is not surprising; after all, neuromorphic computing simulates the most efficient and powerful reasoning and prediction engine in nature. Scientists therefore believe that the next wave of technological breakthroughs in artificial intelligence will undoubtedly be a combination of physics and neuroscience.
A New Round of Technological Revolution Outlook
Related research is progressing rapidly. Existing neuromorphic computers already contain over a billion neurons connected by over 100 billion synapses. While this is still a drop in the ocean compared to the complexity of the human brain, it is a credible demonstration that the technology can scale toward brain size.
Jeff Shainline from the National Institute of Standards and Technology stated: "Once we can achieve the full process of creating networks in commercial foundries, we can rapidly scale up to very large systems. If you can make one neuron, it's quite easy to make a million neurons."
Tech companies like IBM and Intel are at the forefront of this technological revolution. IBM's TrueNorth chip developed in 2014 and Intel's Loihi chip launched in 2018 are both hardware products designed to simulate brain neural activity, paving the way for subsequent new AI models.
Furthermore, several startups focused on neuromorphic computing are emerging. For example, BrainChip has introduced the Akida neuromorphic processor, designed for low-power but capable edge AI, suitable for always-on sensors in smart homes, factories, and cities.
Meanwhile, The Business Research Company predicts that the global neuromorphic computing market will reach $1.81 billion in 2025, growing at a compound annual growth rate of 25.7%.
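For a sense of what a 25.7% compound annual growth rate implies, here is a quick sketch; the base figures come from the forecast above, while the extrapolated years are simple arithmetic, not the firm's own projections:

```python
# Compound annual growth: size_n = size_0 * (1 + rate) ** n.
# Base figures from the cited forecast; later years are my own
# extrapolation, not The Business Research Company's projections.
BASE_YEAR, BASE_SIZE_B, CAGR = 2025, 1.81, 0.257

def projected_size(year):
    """Market size in billions of USD, assuming the CAGR holds."""
    return BASE_SIZE_B * (1 + CAGR) ** (year - BASE_YEAR)

for year in (2025, 2028, 2030):
    print(f"{year}: ${projected_size(year):.2f}B")
```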
From a longer-term perspective, scientists hope that neuromorphic computing will transcend the traditional boundaries of artificial intelligence, move closer to human intelligent reasoning patterns, and bring new technological breakthroughs for next-generation intelligent systems and even AGI.