
Wetware AI: Living Brain Cells Trained to Run Chaos Maths

Summary: The line between biology and computer science has become even blurrier. Researchers have successfully trained cultured rat neurons to perform a complex machine learning task by embedding living neuronal networks in a “reservoir computing” framework.

Using a technique called FORCE learning, the team taught these biological circuits to produce complex mathematical patterns, including the chaotic Lorenz attractor, proving that living “wetware” can serve as an efficient, real-time computing device.

Important Facts

  • Reservoir Computing: This framework exploits the natural complexity and rich dynamics of a fixed network (the reservoir) to process data. Instead of training every single neuron, scientists train only the readout layer that defines the network's output.
  • FORCE Learning: A technique that corrects output signals in real time based on errors. This is the first time it has been successfully applied to a biological neural network (BNN) to generate time-series data.
  • “Chaos” test: Living neurons didn't just learn simple sine waves; they successfully reproduced the Lorenz attractor, a set of equations used to model chaotic systems such as weather patterns.
  • Microfluidic Precision: Researchers used tiny “pipes” (microfluidics) to direct how neurons grow. By creating modular “neighborhoods” of cells, they prevented the neurons from all firing in lockstep (over-synchronization), which is essential for high-performance computing.
  • Flexibility: The same biological system could learn and generate sine waves with periods ranging from 4 to 30 seconds, showing that living networks are remarkably adaptable.
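The Lorenz attractor mentioned in the “Chaos” test is a standard three-equation benchmark for chaotic dynamics. As a minimal sketch (not the paper's code), the kind of target trajectory the network was trained to reproduce can be generated with simple Euler integration, assuming the classic parameter values:

```python
import numpy as np

def lorenz(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Euler-integrate the Lorenz system, the chaotic benchmark used as a target."""
    xyz = np.empty((n_steps, 3))
    xyz[0] = (1.0, 1.0, 1.0)  # arbitrary starting point near the attractor
    for i in range(n_steps - 1):
        x, y, z = xyz[i]
        xyz[i + 1] = xyz[i] + dt * np.array([
            sigma * (y - x),    # dx/dt
            x * (rho - z) - y,  # dy/dt
            x * y - beta * z,   # dz/dt
        ])
    return xyz

trajectory = lorenz(5000)  # 50 s of simulated time at dt = 0.01
```

Plotting `trajectory[:, 0]` against `trajectory[:, 2]` yields the familiar butterfly-shaped attractor; in the study, the living network's output was trained to trace signals with this kind of chaotic structure.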

Source: Tohoku University

A team of researchers at Tohoku University and Future University Hakodate has shown that living neurons can be trained to perform a supervised temporal pattern learning task previously performed by artificial systems.

By embedding cultured neuronal networks in a machine learning framework, the team demonstrated that these biological systems can generate complex time-series signals, marking an important step forward in both neuroscience and bio-inspired computing.

The study was published online by Proceedings of the National Academy of Sciences (PNAS) on March 12, 2026, highlighting the novel intersection between living neural systems and computational technology. The findings suggest that biological neural networks (BNNs) may serve as effective alternatives to or complement existing machine learning models.

Artificial neural networks (ANNs) and spiking neural networks (SNNs) have long been used in machine learning and neuromorphic hardware. A framework known as reservoir computing has emerged as an efficient way to process time-dependent data by exploiting the dynamic properties of recurrently connected ANNs and SNNs.

In conventional ANN-based reservoir computing, methods such as First-Order Reduced and Controlled Error (FORCE) learning allow real-time adaptation by continuously adjusting output signals in response to errors.

These techniques allow artificial systems to generate many types of temporal patterns, including periodic and chaotic signals. However, whether similar methods can be applied to biological neural networks remains an open question.
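In the artificial setting described above, FORCE learning trains only the readout weights of a fixed random reservoir, using a recursive least-squares update so the output error is suppressed from the very first steps. Below is a minimal echo-state sketch of that idea; the network size, gains, and sine-wave target are illustrative choices, not the paper's biological setup:

```python
import numpy as np

rng = np.random.default_rng(0)
N, dt, g = 300, 0.1, 1.5          # reservoir size, time step, recurrent gain
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # fixed random recurrent weights
w_fb = rng.uniform(-1.0, 1.0, N)  # fixed feedback weights
w = np.zeros(N)                   # readout weights: the only trained parameters
P = np.eye(N)                     # RLS inverse-correlation estimate

T = 3000
target = np.sin(2 * np.pi * np.arange(T) * dt / 10.0)  # period-10 sine target

x = 0.5 * rng.standard_normal(N)  # reservoir state
r, z = np.tanh(x), 0.0
errors = []
for t in range(T):
    x = (1 - dt) * x + dt * (J @ r + w_fb * z)  # leaky reservoir + output feedback
    r = np.tanh(x)
    z = w @ r                                   # linear readout
    e = z - target[t]                           # real-time output error
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)                     # RLS gain
    P -= np.outer(k, Pr)
    w -= e * k                                  # FORCE: correct the readout immediately
    errors.append(abs(e))
```

The same logic carries over to the biological case, except the “reservoir” is the living culture's recorded activity rather than a simulated tanh network, and the trained readout maps recorded firing to the output signal.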

To address this gap, the researchers constructed biological neural networks from cultured rat cortical neurons and embedded them in a reservoir computing framework.

By applying FORCE learning to train the readout layer of the system, the team successfully taught the biological networks to generate complex temporal signals, such as those involved in movement control.

A key innovation in the research was the use of microfluidic devices to precisely direct neuronal growth and control network connectivity. This approach enabled the researchers to create modular network architectures that minimize redundant synchronization, thereby preserving the rich, high-dimensional dynamics required for efficient reservoir computing.
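The effect of those microfluidic “neighborhoods” can be illustrated with an artificial analogue: a coupling matrix that is dense within modules but sparse between them, so modules retain partly independent dynamics instead of locking into one synchronized rhythm. This is only a schematic of the design principle; the module count and connection probabilities are invented for illustration, not taken from the paper's devices:

```python
import numpy as np

rng = np.random.default_rng(1)

def modular_weights(n_modules=4, module_size=50, p_within=0.3, p_between=0.02):
    """Random coupling matrix: dense inside modules, sparse between them.

    Sparse cross-module links let each module keep partly independent
    dynamics, an artificial analogue of suppressing over-synchronization.
    """
    N = n_modules * module_size
    mask = rng.random((N, N)) < p_between  # sparse global links
    for m in range(n_modules):
        s = slice(m * module_size, (m + 1) * module_size)
        mask[s, s] = rng.random((module_size, module_size)) < p_within
    W = np.where(mask, rng.standard_normal((N, N)), 0.0)
    np.fill_diagonal(W, 0.0)               # no self-connections
    return W

W = modular_weights()
```

Fully random (non-modular) coupling tends to drive such networks toward a single global rhythm; the block structure keeps the dynamics high-dimensional, which is what the reservoir needs.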

Using this system, the BNN-based framework was able to generate various time series patterns, including sine waves, triangle waves, square waves, and Lorenz-like chaotic patterns. Remarkably, the network showed flexibility by learning and stably reproducing sine waves with periods ranging from 4 to 30 seconds within the same system.

“This work shows that living neuronal networks are not only important biological systems but may also serve as computing resources,” said Hideaki Yamamoto, a professor at Tohoku University.

“By combining neuroscience and machine learning, we are paving the way to new forms of computing that leverage the intrinsic capabilities of biological systems.”

Looking ahead, the research team aims to improve the stability of signal generation after training ends. Future efforts will focus on reducing response latency and refining the FORCE learning algorithm. In parallel, the platform may be extended into a microphysiological system for studying drug responses and modeling neurological disorders, broadening its impact across science and medicine.

Important Questions Answered:

Q: Are we basically building “Cyborg” computers now?

A: We're getting there! This is called “wetware computing.” Unlike traditional silicon chips, these reservoirs use the intrinsic “noise” and dynamics of living cells to solve problems. They are incredibly energy efficient and can adapt to new information in ways that conventional AI models often struggle with.

Q: How do you “teach” a cell dish to do math?

A: It's like a conductor leading an orchestra. The “reservoir” of neurons naturally plays many different notes. Researchers use FORCE learning to listen to those notes and weight the ones that match the pattern they want (like a sine wave). Over time, the readout layer learns exactly which neurons to “listen” to in order to produce the correct result.

Q: What is the advantage of using real neurons over traditional AI?

A: Biology is the ultimate parallel processor. A single biological network can handle large amounts of time-dependent data with very little power. Additionally, these systems can be used to test how drugs affect network activity or to model neurological diseases in a dish, without requiring animal testing.

Editor's Notes:

  • This article was edited by a Neuroscience News editor.
  • The journal paper is peer reviewed.
  • Additional content has been added by our staff.

About this AI and neuroscience news

Author: Office of Public Relations
Source: Tohoku University
Contact person: Public Relations Office – Tohoku University
Image: The image is credited to Neuroscience News.

Original research: Open access.
“Online supervised learning of temporal patterns in biological neural networks under feedback control” by Yuki Sono, Hideaki Yamamoto, Yusei Nishi, Takuma Sumi, Yuya Sato, Ayumi Hirano-Iwata, Yuichi Katori, and Shigeo Sato. PNAS
DOI:10.1073/pnas.2521560123


Abstract

Online supervised learning of temporal patterns in biological neural networks under feedback control

In vitro biological neural networks (BNNs) provide well-defined model systems for constructively investigating how living cells interact with their environment to shape high-dimensional dynamics that can be used to produce coherent temporal outputs, such as those required for movement control.

Here, we develop a closed-loop, real-time BNN system capable of generating periodic and chaotic signals by combining cortical neurons cultured in microfluidic devices with high-density microelectrode arrays.

We show that training a simple linear readout with fixed feedback weights allows the system to learn and autonomously generate various temporal patterns. When the feedback is turned on, the irregular activity in BNNs is transformed into low-dimensional, structured dynamics, producing coherent trajectories characterized by stable transitions between distinct network states.

BNNs can be trained to sustain oscillations at different target periods, from 4 to 30 s, demonstrating their flexibility. Importantly, top-down control of self-organizing network structure with microfluidic devices is key to suppressing over-synchronization and increasing dynamical complexity in BNNs, facilitating the training process and the production of robust outputs.

This work provides a biologically inspired platform for understanding the physical basis of computation in living neural networks and for advancing energy-efficient neuromorphic computing paradigms.
