UC San Diego

News Release

Brain monitoring takes a leap out of the lab

Tim Mullen, left, and Mike Yu Chi are the lead researchers on the study. Both are UC San Diego alumni. Mullen co-founded Qusp, a start-up focused on analytics, and Chi co-founded Cognionics, which developed the EEG headset featured in the study.

San Diego, Calif., Jan. 12, 2016 -- Bioengineers and cognitive scientists have developed the first portable, 64-channel wearable brain activity monitoring system that’s comparable to state-of-the-art equipment found in research laboratories.

The system is a better fit for real-world applications because it is equipped with dry EEG sensors that are easier to apply than wet sensors, while still providing high-density brain activity data. The system comprises a 64-channel dry-electrode wearable EEG headset and a sophisticated software suite for data interpretation and analysis. It has a wide range of applications, from research to neuro-feedback to clinical diagnostics.

The researchers’ goal is to get EEG out of the laboratory setting, where it is currently confined by wet EEG methods. In the future, scientists envision a world where neuroimaging systems work with mobile sensors and smart phones to track brain states throughout the day and augment the brain’s capabilities.

“This is going to take neuroimaging to the next level by deploying on a much larger scale,” said Mike Yu Chi, a Jacobs School alumnus and CTO of Cognionics who led the team that developed the headset used in the study. “You will be able to work in subjects’ homes. You can put this on someone driving.”

The researchers from the Jacobs School of Engineering and Institute for Neural Computation at UC San Diego detailed their findings in an article in the Special Issue on Wearable Technologies published recently in IEEE Transactions on Biomedical Engineering.

They also envision a future when neuroimaging can be used to bring about new therapies for neurological disorders. “We will be able to prompt the brain to fix its own problems,” said Gert Cauwenberghs, a bioengineering professor at the Jacobs School and a principal investigator of the research, which is supported in part by a five-year Emerging Frontiers in Research and Innovation grant from the National Science Foundation. “We are trying to get away from invasive technologies, such as deep brain stimulation and prescription medications, and instead start up a repair process by using the brain’s synaptic plasticity.”

In 10 years, using a brain-machine interface might become as natural as using your smartphone is today, said Tim Mullen, a UC San Diego alumnus, now CEO of Qusp and lead author on the study. Mullen, a former researcher at the Swartz Center for Computational Neuroscience at UC San Diego, led the team that developed the software used in the study with partial funding from the Army Research Lab.

For this vision of the future to become a reality, sensors will need to become not only wearable but also comfortable, and algorithms for data analysis will need to be able to cut through noise to extract meaningful data. The paper, titled “Real-time Neuroimaging and Cognitive Monitoring Using Wearable Dry EEG,” outlines some significant first steps in that direction.

EEG headset

The headset features 64 channels for EEG monitoring. 

The EEG headset developed by Chi and his team has an octopus-like shape with elastic arms, so that it fits many different head shapes. The sensors at the end of each arm are designed to make optimal contact with the scalp while adding as little noise to the signal as possible.

Researchers spent four years perfecting the recipe for the sensors’ materials. Sensors designed to work on a subject’s hair are made of a mix of silver and carbon deposited on a flexible substrate. This material allows sensors to remain flexible and durable while still conducting high-quality signals—a silver/silver-chloride coating is key here. Sensors designed to work on bare skin are made from a hydrogel encased inside a conductive membrane. These sensors are mounted inside a pod equipped with an amplifier, which helps boost signal quality while shielding the sensors from interference from electrical equipment and other electronics.

Next steps include improving the headset’s performance while subjects are moving. The device can reliably capture signals while subjects walk but less so during more strenuous activities such as running. Electronics also need improvement to function for longer time periods—days and even weeks instead of hours.

Software and data analysis

Sensors designed to work on a subject’s hair are made of a mix of silver and carbon deposited on a flexible substrate.

The data that the headset captured were analyzed with software developed by Mullen and Christian Kothe, another former researcher at the Swartz Center for Computational Neuroscience and currently CTO of Qusp. First, brain signals needed to be separated from noise in the EEG data. The tiny electrical currents originating from the brain are often contaminated by high-amplitude artifacts generated when subjects move, speak or even blink. The researchers designed an algorithm that separates the EEG data in real time into components that are statistically unrelated to one another. It then compares these components with clean data obtained, for instance, when a subject is at rest. Abnormal components are labeled as noise and discarded. “The algorithm attempts to remove as much of the noise as possible while preserving as much of the brain signal as possible,” said Mullen.
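
The sketch below, in Python, illustrates the general idea described above: unmix multichannel EEG into statistically independent components, compare each component against a clean baseline recording, and discard the ones that look like artifacts before reconstructing the signal. It is only an illustration; the study’s actual real-time algorithm is not spelled out in this article, so the FastICA decomposition, the variance-ratio rejection rule and the threshold used here are assumptions.

```python
# Minimal sketch of artifact rejection: decompose EEG into statistically
# independent components and discard those that deviate from a clean baseline.
# Illustration only -- the decomposition, rejection rule and threshold are
# assumptions, not the algorithm used in the study.
import numpy as np
from sklearn.decomposition import FastICA

def clean_eeg(eeg, baseline, threshold=5.0):
    """eeg, baseline: arrays of shape (n_samples, n_channels)."""
    ica = FastICA(n_components=eeg.shape[1], max_iter=500, random_state=0)
    sources = ica.fit_transform(eeg)            # unmix into independent components
    baseline_sources = ica.transform(baseline)  # project clean resting data the same way

    # Flag components whose variance is far larger than in the clean baseline;
    # movement, speech and blink artifacts tend to dominate a few components.
    ratio = sources.var(axis=0) / (baseline_sources.var(axis=0) + 1e-12)
    keep = ratio < threshold

    # Zero out the flagged components and mix back into channel space.
    sources[:, ~keep] = 0.0
    return ica.inverse_transform(sources)

# Example with synthetic data: 10 s of 64-channel "EEG" at 250 Hz.
rng = np.random.default_rng(0)
baseline = rng.standard_normal((2500, 64))
eeg = rng.standard_normal((2500, 64))
eeg[:, 0] += 50 * np.sin(np.linspace(0, 20, 2500))  # fake high-amplitude artifact
cleaned = clean_eeg(eeg, baseline)
print(cleaned.shape)  # (2500, 64)
```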

But the analysis didn’t stop there. Researchers used information about the brain’s known anatomy and the data they collected to find out where the signals come from inside the brain. They also were able to track, in real time, how signals from different areas of the brain interact with one another, building an ever-changing network map of brain activity. They then used machine learning to connect specific network patterns in brain activity to cognition and behavior.
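
The last step, connecting network patterns to cognition and behavior, can be pictured with a small sketch as well: summarize each window of source activity as a vector of pairwise connectivity values, then train a classifier to predict a behavioral label from that vector. The correlation features, synthetic data and logistic-regression classifier below are illustrative stand-ins, not the methods used in the study.

```python
# Minimal sketch: turn a window of source activity into a network feature
# vector (pairwise correlations) and learn a mapping to a cognitive label.
# The features, labels and classifier are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def network_features(window):
    """window: (n_samples, n_sources) -> upper triangle of its correlation matrix."""
    corr = np.corrcoef(window.T)
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]

rng = np.random.default_rng(1)
n_windows, n_samples, n_sources = 200, 250, 8

X, y = [], []
for i in range(n_windows):
    label = i % 2                              # 0 = "rest", 1 = "task" (synthetic labels)
    window = rng.standard_normal((n_samples, n_sources))
    if label == 1:
        window[:, 1] += 0.7 * window[:, 0]     # couple two sources during the "task"
    X.append(network_features(window))
    y.append(label)
X, y = np.array(X), np.array(y)

clf = LogisticRegression(max_iter=1000).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```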

“A Holy Grail in our field is to track meaningful changes in distributed brain networks at the ‘speed of thought’,” Mullen said. “We’re closer to that goal, but we’re not quite there yet.”

Start-ups

Cognionics also developed the Quick-20, a headset that can be applied faster and is easier to use but offers only 20 channels (the clinical-standard 10/20 system).

Both Chi and Mullen have created start-ups focused on commercializing brain technology, including some of the components featured in this study. Chi’s company, Cognionics, sells the headset to research groups. The device also is popular with specialists in neuro-feedback, who map brain activity in order to later influence behavior. The ultimate goal is to get the headset into the clinic to help diagnose a range of conditions, such as strokes and seizures.

Mullen’s start-up, Qusp, has developed NeuroScale, a cloud-based software platform that provides continuous real-time interpretation of brain and body signals through an Internet application programming interface (API). The goal is to enable brain-computer interfaces and advanced signal processing methods to be easily integrated with everyday applications and wearable devices.

Under joint DARPA funding, Cognionics is creating an improved EEG system, while Qusp is developing an easy-to-use graphical software environment for rapid design and application of brain signal analysis pipelines.

“These entrepreneurial efforts are integral to the success of the Jacobs School and the Institute for Neural Computation to help take neurotechnology from the lab to practical uses in cognitive and clinical applications,” said Cauwenberghs, who is co-founder of Cognionics and serves on its Scientific Advisory Board.

Scott Makeig and Tzyy-Ping Jung, director and co-director of the Swartz Center for Computational Neuroscience at the Institute for Neural Computation at UC San Diego, are co-authors of the study. Mullen was also affiliated with the Center, as were co-authors Kothe and Alejandro Ojeda until they joined Qusp full time. Co-author Trevor Kerth is now pursuing industrial design at Kingston University, London.

Full paper: “Real-time Neuroimaging and Cognitive Monitoring Using Wearable Dry EEG,” IEEE Transactions on Biomedical Engineering.

Media Contacts

Ioana Patringenaru
Jacobs School of Engineering
858-822-0899
ipatrin@ucsd.edu