A research team from Tohoku University and Future University Hakodate has demonstrated that living biological neurons can be trained to perform supervised temporal pattern learning tasks that were previously performed by artificial systems. By integrating cultured neuronal networks into a machine learning framework, the research team showed that these biological systems can generate complex time-series signals, a major advance in both neuroscience and bio-inspired computing.
The study, published online in the Proceedings of the National Academy of Sciences (PNAS) on March 12, 2026, highlights a new interface between living neural systems and computational techniques. This finding suggests that biological neural networks (BNNs) may serve as a viable alternative or complement to existing machine learning models.
Artificial neural networks (ANNs) and spiking neural networks (SNNs) have long been used in machine learning and neuromorphic hardware. A framework known as reservoir computing has emerged as an efficient approach for processing time-dependent data by exploiting the dynamic properties of recurrently connected ANNs and SNNs.
In traditional ANN-based reservoir computing, methods such as first-order reduced and controlled error (FORCE) learning enable real-time adaptation by continuously adjusting the output signal in response to errors. These techniques allow artificial systems to generate a wide range of temporal patterns, including periodic and chaotic signals. However, whether a similar approach can be applied to biological neural networks has remained an open question.
To address this gap, the researchers used cultured rat cortical neurons to build a biological neural network and incorporated it into a reservoir computing framework. By applying FORCE learning to optimize the system’s readout layer, the researchers were able to train a biological network to generate complex temporal signals comparable to those involved in motor control.
A key innovation in this study was the use of microfluidic devices to precisely guide neuron growth and control network connectivity. This approach allowed the researchers to create a modular network architecture that minimizes excessive synchronization, thereby facilitating the rich, high-dimensional dynamics required for effective reservoir computing.
Using this system, the BNN-based framework generated a variety of time-series patterns, including periodic signals such as sinusoids, triangle waves, and square waves, as well as chaotic trajectories such as the Lorenz attractor. The network also demonstrated its flexibility by learning and stably reproducing sine waves with periods ranging from 4 to 30 seconds within the same system.
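The Lorenz attractor mentioned above is a standard chaotic benchmark for temporal pattern generation. A short sketch of how such a chaotic target trajectory can be produced, using the classic Lorenz equations with simple Euler integration (the parameters and step size are the textbook defaults, not values reported in the paper):

```python
import numpy as np

def lorenz_trajectory(n_steps, dt=0.01, state=(1.0, 1.0, 1.0)):
    """Integrate the Lorenz system (sigma=10, rho=28, beta=8/3) with
    forward-Euler steps, returning an (n_steps, 3) chaotic trajectory."""
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    x, y, z = state
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        traj[i] = (x, y, z)
    return traj

# One coordinate of such a trajectory can serve as a chaotic target signal.
traj = lorenz_trajectory(5000)
```

Unlike a sine wave, this trajectory never repeats exactly, so reproducing it tests whether the network's dynamics are rich enough to track a chaotic reference rather than merely lock onto a period.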
"This study shows that living neuronal networks are not only biologically meaningful systems, but also have the potential to serve as novel computational resources. By bridging neuroscience and machine learning, we are paving the way for new forms of computing that exploit the unique dynamics of biological systems."
— Hideaki Yamamoto, Professor, Tohoku University
Looking ahead, the research team aims to improve the stability of signal generation after training. Future work will focus on reducing the feedback delay and improving the FORCE learning algorithm. In parallel, this platform could be extended to microphysiological systems for studying drug responses and modeling neurological disorders, further expanding its impact in both scientific and medical fields.
Reference:
Sono, Y., et al. (2026). Online supervised learning of temporal patterns in biological neural networks under feedback control. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2521560123. https://www.pnas.org/doi/10.1073/pnas.2521560123

