SpikingBrain: China’s bid for greener, faster AI


Chinese researchers have unveiled SpikingBrain, a new kind of artificial intelligence that mimics how the human brain works. Unlike the systems behind tools such as ChatGPT, SpikingBrain activates only the “neurons” it needs, rather than processing everything at once.

The team says this makes it faster, more energy-efficient, and able to run on Chinese-made chips instead of the Nvidia processors that dominate the global AI industry but are restricted for export to China.

Moving beyond Transformers

Today’s leading AI models are built on an architecture known as the Transformer. These systems process language in parallel, which makes them powerful but also extremely demanding of computing power and energy. Training and running them requires vast data centres and contributes to high carbon emissions.

SpikingBrain works differently. Instead of keeping the whole network active, it uses a “spiking” approach that mirrors the way human brains function: most of the system stays quiet until something triggers a response. This means it can handle very long documents or complex tasks while using far less power.
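The event-driven principle can be sketched with a toy "leaky integrate-and-fire" neuron, the classic model behind spiking networks. This is only an illustration of the general idea, not SpikingBrain's actual code; the function name and the threshold and leak values are hypothetical. The point is that the neuron stays silent at most time steps, so downstream work is needed only at the rare moments it fires.

```python
def run_lif(inputs, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron over a stream of inputs.

    Returns the time steps at which the neuron spiked (produced output).
    """
    potential = 0.0
    spike_times = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # integrate input, with decay
        if potential >= threshold:              # fire only past the threshold
            spike_times.append(t)
            potential = 0.0                     # reset after a spike
    return spike_times

# A mostly-quiet input stream: a dense network would compute at all 10 steps,
# but here downstream computation is triggered only at the spike times.
stream = [0.1, 0.0, 0.0, 0.9, 0.3, 0.0, 0.0, 0.0, 1.2, 0.0]
print(run_lif(stream))  # → [4, 8]
```

In this sketch the neuron fires twice in ten steps, so roughly 80% of the potential downstream computation is simply skipped; that sparsity is the source of the energy savings the researchers describe.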

Why it matters for energy use

AI’s environmental footprint has become a growing concern. Training a single large model can consume as much electricity as hundreds of households use in a year.

SpikingBrain’s design aims to change that. By skipping unnecessary calculations, the system is reported to cut energy use significantly. If widely adopted, this could ease pressure on data centres and even make it possible to run advanced AI directly on laptops or mobile phones, reducing the need for energy-hungry cloud servers.

Built without Nvidia

Perhaps just as significant is the hardware. SpikingBrain was trained entirely on MetaX processors, chips made by a Chinese company. Most of the world’s advanced AI systems depend on Nvidia hardware, so this marks a step towards independence for China’s technology sector.

Researchers say the models trained reliably across hundreds of MetaX chips, proving that large-scale AI can be built without relying on Nvidia.

How fast is it?

Early reports suggest the system can be extremely quick. In one test, a smaller version of SpikingBrain processed a massive text input more than 100 times faster than a conventional model. A larger version, designed for more complex tasks, aims to combine efficiency with accuracy.

However, the research has not yet been peer-reviewed, and outside experts urge caution. “The idea is exciting,” one UK-based AI researcher told the BBC, “but independent testing is essential before we know how well it really performs.”

Looking ahead

If the claims are confirmed, SpikingBrain could reshape the way AI is built and used. Lower energy use would help the industry align with climate goals, and relying on domestic hardware could shift the global balance of power in artificial intelligence.

The project is still experimental, but it reflects a broader move towards brain-inspired systems that focus on efficiency as much as size. As AI continues to spread into daily life, the future may lie not in building ever-larger models, but in creating ones that are smarter, leaner and greener.