Can a Brain-Computer Interface Convert Your Thoughts to Text?

Ever wonder what it would be like if a device could decode your thoughts into actual speech or written words? While such a device might enhance existing voice interfaces, it could be a game-changer for people with speech pathologies, and even more so for "locked-in" patients who lack any speech or motor function.

"So instead of saying 'Siri, what is the weather like today' or 'Ok Google, where can I go for lunch?' I just imagine saying these things," explains Christian Herff, author of a review recently published in the journal Frontiers in Human Neuroscience.

While reading one's thoughts might still belong to the realms of science fiction, scientists are already decoding speech from signals generated in our brains when we speak or listen to speech.

In their review, Herff and co-author Dr. Tanja Schultz compare the pros and cons of various brain imaging techniques for capturing neural signals and decoding them into text.

The technologies range from functional MRI and near-infrared imaging, which detect neural signals through the metabolic activity of neurons, to methods such as EEG and magnetoencephalography (MEG), which detect the electromagnetic activity of neurons responding to speech. One method in particular, electrocorticography (ECoG), showed promise in Herff's study.

That study introduced the Brain-to-Text system, in which epilepsy patients who already had electrode grids implanted for the treatment of their condition read aloud texts presented on a screen while their brain activity was recorded. This formed the basis of a database of neural signal patterns that could then be matched to speech elements, or "phones".
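To make the idea of matching neural signal patterns to phones more concrete, here is a minimal, hypothetical sketch in Python: it fits one simple statistical model per phone to windowed neural features and then labels a new window with the most likely phone. The electrode count, feature values, and phone set are invented for illustration and are not taken from the study.

```python
import numpy as np

# Hypothetical sketch: fit one diagonal-Gaussian model per phone over
# windowed neural features (e.g., per-electrode signal power), mirroring
# the idea of matching recorded activity patterns to speech "phones".
# Electrode count, phone labels, and data are illustrative only.

rng = np.random.default_rng(0)
n_electrodes = 16          # invented grid size
phones = ["AH", "S", "T"]  # tiny invented phone set

# Synthetic training data: feature windows labelled with the phone
# that was being spoken while each window was recorded.
features = {p: rng.normal(loc=i, scale=1.0, size=(200, n_electrodes))
            for i, p in enumerate(phones)}

# Per-phone model: mean and diagonal variance of its feature windows.
models = {p: (x.mean(axis=0), x.var(axis=0) + 1e-6)
          for p, x in features.items()}

def log_likelihood(window, mean, var):
    """Diagonal-Gaussian log-likelihood of one feature window."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (window - mean) ** 2 / var)

# Label a new window with the most likely phone model.
test_window = rng.normal(loc=1, scale=1.0, size=n_electrodes)
best_phone = max(models, key=lambda p: log_likelihood(test_window, *models[p]))
print("Most likely phone:", best_phone)
```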

When the researchers also included language and dictionary models in their algorithms, they were able to decode neural signals to text with a high degree of accuracy. "For the first time, we could show that brain activity can be decoded specifically enough to use ASR technology on brain signals," says Herff. "However, the current need for implanted electrodes renders it far from usable in day-to-day life."
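Combining a neural (or acoustic) model with dictionary and language models is the standard recipe in automatic speech recognition. The hypothetical sketch below illustrates that combination: per-frame phone likelihoods are summed with bigram word probabilities over a tiny pronunciation dictionary, and the best-scoring word sequence wins. The vocabulary, pronunciations, and all probabilities are made up for illustration.

```python
import itertools
import numpy as np

# Hypothetical decoding sketch: combine per-phone scores with a
# pronunciation dictionary and a bigram language model, then pick the
# best-scoring word sequence. All values below are invented.

dictionary = {            # word -> phone sequence
    "sat": ["S", "AH", "T"],
    "at":  ["AH", "T"],
    "as":  ["AH", "S"],
}
bigram_logp = {            # log P(word2 | word1); "<s>" marks sentence start
    ("<s>", "sat"): np.log(0.5), ("<s>", "at"): np.log(0.3), ("<s>", "as"): np.log(0.2),
    ("sat", "at"): np.log(0.6), ("sat", "as"): np.log(0.4),
    ("at", "sat"): np.log(0.7), ("at", "as"): np.log(0.3),
    ("as", "sat"): np.log(0.5), ("as", "at"): np.log(0.5),
}

def acoustic_score(phone_seq, frame_scores):
    """Sum per-frame log-likelihoods of the hypothesised phones,
    assuming (for simplicity) exactly one frame per phone."""
    if len(phone_seq) != len(frame_scores):
        return -np.inf
    return sum(scores[p] for p, scores in zip(phone_seq, frame_scores))

# frame_scores[i][phone] = log-likelihood of that phone in frame i
# (in a real system these would come from the neural model).
frame_scores = [
    {"S": -1.0, "AH": -3.0, "T": -4.0},
    {"S": -3.5, "AH": -0.8, "T": -3.0},
    {"S": -4.0, "AH": -2.5, "T": -0.9},
    {"S": -2.8, "AH": -1.1, "T": -3.2},
    {"S": -3.1, "AH": -3.3, "T": -1.0},
]

best = None
for n_words in (1, 2):
    for words in itertools.product(dictionary, repeat=n_words):
        phones = [p for w in words for p in dictionary[w]]
        lm = sum(bigram_logp.get((a, b), -np.inf)
                 for a, b in zip(("<s>",) + words, words))
        score = acoustic_score(phones, frame_scores) + lm
        if best is None or score > best[0]:
            best = (score, words)

print("Decoded words:", " ".join(best[1]), "score:", round(best[0], 2))
```

In a real ASR decoder this exhaustive search would be replaced by a Viterbi or beam search over a far larger vocabulary, but the scoring idea of adding neural-model and language-model log-probabilities is the same.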

So how does the field get from here to a functioning thought-detection device? "A first milestone would be to actually decode imagined phrases from brain activity, but a lot of technical issues need to be solved for that," concedes Herff.

Their study results, while exciting, are still only a preliminary step towards this type of brain-computer interface.

Herff C, Schultz T.
Automatic Speech Recognition from Neural Signals: A Focused Review.
Front Neurosci. 2016 Sep 27;10:429. doi: 10.3389/fnins.2016.00429
