Reading the Mind in Real Time
A New Brain–Computer Interface Aims to Transform Mental Health Care
Bar-Ilan University researchers are developing a groundbreaking system that combines virtual reality, brain signals, and machine learning to monitor emotional states and intervene when it matters most.
One of the biggest challenges in mental health care today isn't diagnosis; it's timing. Emotional states fluctuate constantly, yet clinicians still rely largely on retrospective self-reports and questionnaires that capture only fragments of a person's inner world, often weeks after the fact. Subtle shifts in mood, early warning signs of emotional distress, and opportunities for timely intervention frequently go unnoticed.
A new interdisciplinary project at Bar-Ilan University aims to change that.
Led by Dr. Yaara Erez from the Faculty of Engineering and Dr. Hanna Keren from the Faculty of Medicine in the Galilee, the project is developing a non-invasive brain–computer interface (BCI) that can assess emotional states in real time and respond accordingly. The collaboration has been awarded a competitive grant from the Ministry of Innovation, Science and Technology under its 2025 call for proposals in health and medicine, selected from 84 submissions in the mental health technologies category.
Why Mental Health Needs Better Tools
“Today, mood assessment is based primarily on subjective questionnaires with very low time resolution,” explains Dr. Keren. “They often describe how a person felt over the past few weeks, which makes it almost impossible to track rapid emotional changes or offer real-time support. That limitation has a direct impact on the quality of care patients receive.”
The project addresses this gap by shifting mental health assessment from memory-based reporting to continuous, objective monitoring using the body and brain as real-time sources of data.
Enter Virtual Reality: Emotion as a Dynamic System
At the heart of the system is an adaptive virtual reality environment developed in Dr. Keren’s lab. Participants enter a changing VR world, such as a walk through a park, where elements like lighting, sound, and weather are continuously adjusted.
“These environments aren’t static,” says Dr. Keren. “They’re designed to influence mood in real time using control algorithms borrowed from engineering. The system responds dynamically to the user’s emotional state, creating a closed feedback loop between experience and emotion.”
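To make that loop concrete, the sketch below shows how such a controller might work in principle, assuming a simple proportional control rule. The project's actual algorithms, signals, and VR parameters have not been published, so every name and number here (estimate_arousal, the brightness parameter, the gain) is illustrative.

```python
# Minimal sketch of a closed emotional feedback loop with proportional
# control. All quantities are hypothetical stand-ins, not the project's
# actual design.
import random

TARGET_AROUSAL = 0.4   # desired emotional set point on a hypothetical 0-1 scale
GAIN = 0.5             # proportional gain: how strongly the VR world reacts

def estimate_arousal(sensor_window):
    """Stand-in for the real-time decoder: a noisy average of the window."""
    return sum(sensor_window) / len(sensor_window) + random.gauss(0, 0.02)

brightness = 0.9  # hypothetical VR lighting parameter in [0, 1]
for step in range(50):
    # 1. Collect a short window of (simulated) physiological samples.
    window = [0.8 * brightness + random.gauss(0, 0.05) for _ in range(32)]
    # 2. Estimate the user's current state from the window.
    arousal = estimate_arousal(window)
    # 3. Steer the environment so the state moves toward the target.
    error = TARGET_AROUSAL - arousal
    brightness = min(1.0, max(0.0, brightness + GAIN * error))

print(f"settled brightness: {brightness:.2f}, arousal estimate: {arousal:.2f}")
```

A proportional rule is the simplest possible choice; a real closed-loop system would likely use richer controllers and state estimators, but the structure, measure, compare to a target, adjust the environment, repeat, is the same.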
As users move through the virtual environment, the system simultaneously collects neural and physiological signals from multiple sensors: EEG activity, heart rate, skin conductance, and eye movements, all biological indicators known to shift rapidly with emotional states. Spikes in anxiety, for example, are accompanied within seconds by measurable changes such as a faster heart rate and increased skin conductance.
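Here is a brief sketch of what per-window feature extraction from such signals could look like, using Welch's method for EEG band power, a standard technique in affective computing. The sampling rate, window length, and choice of features are assumptions for illustration, not the project's published pipeline.

```python
# Sketch of per-window feature extraction for EEG, heart rate, and skin
# conductance. Parameters are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS_EEG = 256  # assumed EEG sampling rate in Hz

def eeg_band_power(eeg_window, fs=FS_EEG, band=(8.0, 12.0)):
    """Alpha-band power via Welch's method, a common EEG emotion feature."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])  # integrate PSD over the band

def features(eeg_window, ibi_seconds, scl_window):
    """One feature vector per analysis window."""
    return np.array([
        eeg_band_power(eeg_window),   # EEG alpha power
        60.0 / np.mean(ibi_seconds),  # heart rate (bpm) from inter-beat intervals
        np.mean(scl_window),          # mean skin conductance level
    ])

# Example: ten seconds of simulated signals for one analysis window.
rng = np.random.default_rng(0)
print(features(rng.normal(size=10 * FS_EEG),
               rng.normal(0.8, 0.05, size=12),
               rng.normal(5.0, 0.3, size=40)))
```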
From Raw Signals to Emotional Insight Within Minutes
The second core component of the project comes from Dr. Erez’s lab: a real-time data analysis system powered by machine learning.
“All of the neural and physiological data are fed into a computational platform that can assess mood within minutes,” explains Dr. Erez. “The system isn’t just identifying emotional states quickly; it’s designed to enable an immediate clinical response.”
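In software terms, one way to realize such a platform is a lightweight classifier over the windowed features, so predictions arrive as fast as the windows do. The sketch below assumes self-reported mood labels for training and uses logistic regression as a deliberately simple, fast baseline; the models actually used in Dr. Erez's lab are not specified in the source.

```python
# Minimal sketch of the machine-learning stage: standardized features into
# a fast linear classifier. The training data here is random placeholder
# data, standing in for labeled feature windows.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))        # placeholder feature vectors
y_train = (X_train[:, 0] > 0).astype(int)  # placeholder mood labels (0/1)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# At run time, each new window's features yield an immediate mood estimate,
# e.g. a probability that could be thresholded to alert a clinician.
new_window = rng.normal(size=(1, 3))
print(model.predict_proba(new_window)[0, 1])
```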
In practical terms, this means the technology could eventually support real-time emotional regulation, alert clinicians to sudden shifts in mental state, or trigger personalized interventions precisely when they’re needed.
Where Neuroscience Meets Engineering
Dr. Erez, a member of the Gonda Multidisciplinary Brain Research Center, specializes in neural signal processing, brain networks, and computational modeling. Her research focuses on decoding brain activity from various imaging methods and understanding how different brain regions interact to shape cognition and emotion. This work forms the foundation for new diagnostic and therapeutic tools in personalized medicine.
Dr. Keren, also affiliated with the Gonda Center, works at the intersection of neuroscience and engineering, using mathematical models and personalized experimental designs to study emotional variability between individuals. Her research seeks objective markers of emotional states and explores how these processes are disrupted in conditions such as depression.
Toward Real-Time, Personalized Mental Health Care
Together, the two researchers are building a platform that integrates computational neuroscience, brain–computer interfaces, advanced signal processing, and immersive technology. The goal is not only to deepen scientific understanding of how emotional states evolve over time, but also to lay the technological groundwork for a new generation of mental health tools.
In the future, systems based on this research could continuously monitor emotional well-being, detect early signs of distress, and deliver immediate, personalized support—bringing mental health care closer to the responsiveness and precision seen in other areas of medicine.
For patients and clinicians alike, that shift could be transformative.