The brain inspires new microchips

For decades, engineers have tried to make computers think like we do. Not in the philosophical sense, but in the most practical one: getting them to process information with the efficiency, flexibility and low energy consumption of the human brain. Because although artificial intelligence is advancing by leaps and bounds, it does so on top of an infrastructure that consumes colossal amounts of energy. And that is where biology, once again, offers clues.

A new study, published in Science Advances and led by the University of Cambridge, proposes a solution that seems taken directly from the nervous system: brain-inspired microchips capable of learning and processing information far more efficiently. At the center of this advance is a component called a memristor, a kind of electronic switch that not only regulates the flow of current but also “remembers” its previous state, thus imitating the behavior of neurons and their connections.

To understand the magnitude of the change, it is worth pausing on how current computers work. In traditional chips, data must constantly travel between memory and the processing unit. That back-and-forth consumes an enormous amount of energy. In contrast, the human brain stores and processes information in the same place: in the connections between neurons. That architecture, honed by millions of years of evolution, is extraordinarily efficient.
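To make the contrast concrete, here is a toy energy model in Python. The per-operation costs are invented, purely illustrative numbers, not measurements from the study; the point is only that shuttling data can dwarf the cost of the arithmetic itself.

```python
# Toy model: a conventional (von Neumann) chip moves operands between
# memory and the processor for every operation, while an in-memory
# design computes where the data lives. Energy figures are arbitrary
# illustrative units, not values from the Cambridge study.

E_TRANSFER = 100.0  # hypothetical cost of moving one operand to/from memory
E_COMPUTE = 1.0     # hypothetical cost of the arithmetic itself

def von_neumann_energy(n_ops: int) -> float:
    # Each operation fetches two operands and writes one result back.
    return n_ops * (3 * E_TRANSFER + E_COMPUTE)

def in_memory_energy(n_ops: int) -> float:
    # Operands never leave the memory array; only compute energy remains.
    return n_ops * E_COMPUTE

ops = 1_000_000
print(f"von Neumann: {von_neumann_energy(ops):.0f} units")
print(f"in-memory:   {in_memory_energy(ops):.0f} units")
```

Under these made-up costs, the conventional design spends roughly 300 times more energy, almost all of it on data movement rather than computation.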

The new devices developed in Cambridge replicate precisely that logic. Instead of relying on the usual mechanism, based on the formation of tiny conductive filaments within the material, the authors, led by Babak Bakhit, created an ultrathin film of hafnium oxide modified with strontium and titanium. This material allows the electrical flow to be controlled in a much more stable and precise way, through small internal electronic “gates”, much as happens at neuronal synapses.

The result is a device capable of changing state with currents up to a million times lower than those of comparable technologies. In other words: the same type of operation that occurs in the brain, at a radically lower energy cost. In a context in which artificial intelligence could come to consume a significant fraction of global electricity, reducing that consumption by up to 70%, as the study suggests, could redefine the future of computing.

“One of the great challenges of current hardware is energy consumption,” says Bakhit. “To solve it, we need devices with extremely low currents, great stability and the ability to operate in multiple states.” And that is precisely what they have achieved.

But there is something even more interesting than efficiency: the capacity to learn. These memristors don’t just store information; they also reproduce basic principles of biological learning, such as spike-timing-dependent plasticity (STDP). In simple terms, they can strengthen or weaken their connections depending on when they receive signals, much as our neurons do when we learn something new.
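The STDP rule described above can be sketched in a few lines of Python. The time constant and learning rates below are generic textbook-style values, not parameters reported for these memristors.

```python
# Minimal sketch of spike-timing-dependent plasticity (STDP).
# a_plus, a_minus and tau are illustrative placeholder values.
import math

def stdp_delta_w(t_pre: float, t_post: float,
                 a_plus: float = 0.10, a_minus: float = 0.12,
                 tau: float = 20.0) -> float:
    """Weight change as a function of pre/post spike times (ms).

    If the presynaptic spike arrives shortly before the postsynaptic
    one (dt > 0), the connection strengthens; if it arrives after,
    the connection weakens. The effect decays exponentially with the
    time gap, so tightly correlated spikes matter most.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # potentiation
    return -a_minus * math.exp(dt / tau)       # depression

# Pre fires 5 ms before post: the synapse strengthens.
print(stdp_delta_w(t_pre=10.0, t_post=15.0) > 0)
# Post fires 5 ms before pre: the synapse weakens.
print(stdp_delta_w(t_pre=15.0, t_post=10.0) < 0)
```

In the memristor, the device conductance plays the role of the synaptic weight: carefully timed voltage pulses nudge it up or down, so the learning rule is realized directly in the hardware rather than simulated in software.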

This opens the door to a new generation of computing systems: machines that not only execute instructions but adapt, evolve and learn in a more organic way. This is what is known as neuromorphic computing, a field that seeks to break with the classical model of computing and move closer to the logic of the brain.

However, as with almost every promising development, obstacles remain. The main one has to do with manufacturing: the devices require temperatures of about 700 degrees Celsius, higher than standard industrial processes tolerate. “It is our biggest challenge right now,” Bakhit acknowledges. “But we are working to reduce that temperature and make the technology compatible with industry.”

Still, the potential is hard to ignore. If these devices can be integrated into commercial chips, they could transform everything from data centers to personal devices, including smart sensors and autonomous systems. And, above all, they could make artificial intelligence much more sustainable.

Perhaps the most fascinating thing about this advance is not its technological impact but its conceptual origin. At a time when innovation seems to depend increasingly on complexity, this study recalls a simple idea: the most sophisticated system we know, the human brain, remains the most efficient. And sometimes moving forward is not about inventing something completely new, but about learning to imitate better what already exists.