Scientists use AI to reconstruct brain activity into speech – CNET

Brain, neural network, illustration
ALFRED PASIEKA/SCIENCE PHOTO LIB/Getty Images

Being able to think at a machine that can then understand you, or even speak aloud on your behalf, may become a reality in the near future.

Researchers from three teams used data recorded during brain tumor removal surgeries, or from electrodes implanted in epilepsy patients' brains to pinpoint the origin of seizures, and trained AI models to reconstruct speech from the brain activity, Science magazine reports.
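To make the basic idea concrete, here is a minimal, purely illustrative sketch of training a per-patient decoder that maps neural recording features to speech spectrogram frames. The data, shapes, and the simple ridge-regression model are all invented for demonstration; the actual studies used high-density cortical recordings and more sophisticated neural networks.

```python
# Illustrative sketch only: maps synthetic "electrode" features to synthetic
# audio spectrogram frames with a simple linear model. All shapes and names
# here are invented for demonstration purposes.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_frames, n_electrodes, n_spectrogram_bins = 2000, 64, 80

# Stand-in for cortical recordings (one row per time frame).
neural_features = rng.normal(size=(n_frames, n_electrodes))

# Stand-in for the spectrogram of the speech produced during recording;
# here it is just a random linear mixture of the neural features plus noise.
true_mixing = rng.normal(size=(n_electrodes, n_spectrogram_bins))
speech_spectrogram = neural_features @ true_mixing + 0.1 * rng.normal(
    size=(n_frames, n_spectrogram_bins)
)

X_train, X_test, y_train, y_test = train_test_split(
    neural_features, speech_spectrogram, test_size=0.2, random_state=0
)

# Train a per-subject decoder: brain activity in, spectrogram frames out.
decoder = Ridge(alpha=1.0).fit(X_train, y_train)

# In the real studies the predicted spectrogram would be passed through a
# vocoder to produce audible speech; here we just report reconstruction fit.
print(f"Held-out R^2 of spectrogram reconstruction: {decoder.score(X_test, y_test):.3f}")
```

Because the decoder is fit to one person's recordings, the same pipeline would have to be retrained from scratch for every new patient, which is part of why the approach remains a research tool rather than a product.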

Before you get too excited, though, the computer models have to be trained separately for each individual, because the neural signals underlying speech apparently differ from person to person, and recording data precise enough for the task requires opening the skull.

The reconstructed speech was accurate and understandable between 40 and 80 percent of the time. However, none of the researchers have yet figured out how to decode imagined speech: the brain signals produced when a person speaks silently or hears a voice in their head.

One approach the scientists could take is to meet in the middle: have the subject listen to the computer-generated speech and adjust their thoughts to get the desired output, while the neural network, in turn, is trained on how that subject thinks. A rough sketch of the idea follows below.
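Here is a hypothetical sketch of what such a feedback loop might look like, with a simulated subject and an online-learning decoder. None of the names or numbers come from the research; they exist only to illustrate the two sides adapting to each other.

```python
# Hypothetical sketch of the "meet in the middle" idea: the decoder updates
# online while a crudely simulated subject drifts toward producing clearer
# signals. Everything here is invented for illustration; the article only
# describes the concept, not an implementation.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(1)
n_electrodes = 32
target_weights = rng.normal(size=n_electrodes)   # "true" signal-to-sound map
subject_bias = rng.normal(size=n_electrodes)     # subject's initial habits

decoder = SGDRegressor(learning_rate="constant", eta0=0.01)

for feedback_round in range(200):
    # The subject attempts to "say" something; their neural pattern includes
    # an idiosyncratic bias plus noise.
    neural_pattern = target_weights + subject_bias + 0.1 * rng.normal(size=n_electrodes)
    intended_sound = float(neural_pattern @ target_weights)

    # The decoder is updated on the new example, while the subject, hearing
    # the computer-generated output, gradually sharpens their signals
    # (modeled here as the bias shrinking each round).
    decoder.partial_fit(neural_pattern.reshape(1, -1), [intended_sound])
    subject_bias *= 0.98

final_error = abs(decoder.predict(neural_pattern.reshape(1, -1))[0] - intended_sound)
print(f"Decoder error after feedback rounds: {final_error:.3f}")
```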

If the technique succeeds, people with amyotrophic lateral sclerosis (ALS) could gain a new and easier way to communicate. Physicist Stephen Hawking, who died last year, communicated by using a cheek muscle to control a speech system, and might have benefited from a breakthrough like this.