AI Algorithm: Real-time Decoding for Neurotechnology
There is a major push to use AI in real-time brain-computer interfaces. To that end, researchers at the USC Center for Neurotechnology have developed an advanced deep-learning method that decodes brain signals in real time. The research will be published in Nature Biomedical Engineering.
The research focuses on decoding brain signals to build brain-computer interfaces (BCIs) that treat neurological and mental health conditions. A BCI could, for example, move a robotic arm in real time for a paralyzed patient by decoding the movement the patient intends from their brain signals. A BCI could also infer mood symptoms in a patient with major depression from brain signals and deliver the appropriate dose of deep-brain stimulation therapy.
Until now, BCIs have relied on simpler computing algorithms; deep-learning models have offered good accuracy but could not decode in real time or work with missing brain data. To be seamlessly applicable to real-time BCIs, decoding methods must combine accuracy, efficiency, and speed. They must also handle randomly missing brain signals, which can occur when signals are transmitted in wireless BCIs. The team's new deep-learning approach is called DFINE, for "dynamical flexible inference for nonlinear embeddings." Thus far, DFINE accurately decodes brain signals in real time, even when samples are lost during wireless BCI transmission.
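DFINE's name points to its core idea: flexible inference over a nonlinear embedding of brain signals. One way such a design can deliver both real-time operation and tolerance to dropped samples is to pair a learned nonlinear embedding with linear dynamics in the latent space, so that a fast Kalman-style recursive filter runs at every time step and simply skips the measurement update when a sample is missing. The sketch below illustrates that general recipe; the encoder, matrices, and dimensions are hypothetical placeholders, not the authors' implementation.

```python
# A minimal sketch of this style of real-time inference (illustrative only,
# not the published DFINE code). Assumes a trained nonlinear encoder that
# maps neural activity y_t to a low-dimensional latent a_t, plus a linear
# dynamical model on a latent state x_t:
#   x_t = A x_{t-1} + w_t,   a_t = C x_t + v_t
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" model parameters (placeholders; these would come from training).
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])                  # latent dynamics
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])                    # latent -> embedding map
Q = 0.01 * np.eye(2)                          # process noise covariance
R = 0.05 * np.eye(3)                          # observation noise covariance

def encoder(y):
    """Placeholder for a trained nonlinear embedding of neural activity.
    A real system would use a deep network here."""
    return np.tanh(y)

def kalman_step(x, P, a):
    """One recursive filter step. `a` is the encoded observation, or None
    if the sample was lost (e.g., a dropped wireless packet)."""
    # Predict forward using the learned dynamics.
    x = A @ x
    P = A @ P @ A.T + Q
    # Update only if an observation arrived; otherwise coast on the dynamics.
    if a is not None:
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ (a - C @ x)
        P = (np.eye(2) - K @ C) @ P
    return x, P

# Simulated stream with ~20% randomly missing samples.
x_hat, P = np.zeros(2), np.eye(2)
for t in range(100):
    y_t = rng.normal(size=3)                  # stand-in for neural activity
    a_t = encoder(y_t) if rng.random() > 0.2 else None
    x_hat, P = kalman_step(x_hat, P, a_t)
    # x_hat can now drive a decoder output, e.g., cursor or arm velocity.
```

Skipping the measurement update lets the filter coast on the learned dynamics whenever a packet drops, which is what keeps decoding continuous; because each step is a few small matrix operations, the recursion is cheap enough to run in real time.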
According to the team, the work provides advanced deep-learning methods that are usable in real-world neurotechnology because they simultaneously offer accuracy, real-time operation, flexibility, and efficiency. Such methods could make future BCIs quicker, more precise, and more responsive, improving therapeutic devices for people with neurological and mental health conditions.