AI is helping researchers ‘read’ words and phrases in the minds of test subjects, an effort that could help people who have lost the ability to speak, such as those with ALS, communicate again. (Source: Adobe Stock)

AI Aids in Reading Words Registered in Brains Scanned via fMRI

According to a story in MIT Technology Review, a noninvasive brain-computer interface read specific words from the minds of research subjects who were listening to podcasts. The approach could eventually help people who are unable to speak communicate again.

“In a new study, published in Nature Neuroscience today, a model trained on functional magnetic resonance imaging scans of three volunteers was able to predict whole sentences they were hearing with surprising accuracy—just by looking at their brain activity. The findings demonstrate the need for future policies to protect our brain data, the team says.”

The noninvasive brain recordings were collected through fMRI by a research team at the University of Texas at Austin. While a standard MRI captures images of the brain's structure, functional MRI measures blood flow in the brain, revealing which regions are activated by particular activities.

The team trained GPT-1, a large language model developed by OpenAI, on English sentences from Reddit, 240 stories from The Moth Radio Hour, and transcripts of the New York Times's Modern Love podcast, on the theory that entertaining content would better engage the test subjects. Three participants then each listened to 16 hours of different episodes of those same podcasts, along with some TED talks, while lying in an fMRI scanner. The resulting dataset was more than five times larger than the language datasets typically used in language-related fMRI experiments.
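With that training data in hand, the next stage of such a system is an encoding model: a mapping from features of the words a subject hears to the fMRI responses those words evoke. The Python sketch below is a minimal illustration of that idea, not the study's actual pipeline; the synthetic data, the 768-dimensional embeddings, and the use of ridge regression are all assumptions made here for demonstration (ridge regression is a common choice in fMRI encoding work, but it is not confirmed by the article).

```python
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical setup: each training sample pairs a language-model
# embedding of what the subject heard with the voxel activations
# the fMRI scanner recorded at that moment.
rng = np.random.default_rng(0)
n_samples, n_features, n_voxels = 500, 768, 1000

# Stand-ins for real data: `embeddings` would come from a language
# model (the study used GPT-1); `voxels` would come from the scanner.
embeddings = rng.normal(size=(n_samples, n_features))
true_weights = rng.normal(size=(n_features, n_voxels)) * 0.1
voxels = embeddings @ true_weights + rng.normal(size=(n_samples, n_voxels))

# Encoding model: learn to predict brain activity from word features.
encoder = Ridge(alpha=10.0)
encoder.fit(embeddings, voxels)

def predict_brain_response(word_features: np.ndarray) -> np.ndarray:
    """Predict the voxel pattern a candidate word sequence would evoke."""
    return encoder.predict(word_features.reshape(1, -1))[0]
```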

“The model learned to predict the brain activity that reading certain words would trigger. To decode, it guessed sequences of words and checked how closely that guess resembled the actual words. It predicted how the brain would respond to the guessed words, and then compared that with the actual measured brain responses.”
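In code, that guess-and-check loop resembles a beam search: a language model proposes candidate word sequences, an encoding model predicts the brain activity each candidate would evoke, and the candidates whose predictions best match the measured scans are kept. The sketch below is schematic rather than the study's implementation; propose_continuations and predict_brain_response are hypothetical stand-ins for the GPT-1 sampler and the fitted encoding model, and scoring by correlation is an illustrative choice.

```python
import numpy as np

def propose_continuations(prefix: list[str], k: int) -> list[list[str]]:
    """Stand-in for a language model (GPT-1 in the study) that
    extends a candidate word sequence in k plausible ways."""
    vocab = ["the", "house", "was", "quiet", "she", "said"]  # toy vocabulary
    return [prefix + [w] for w in vocab[:k]]

def predict_brain_response(words: list[str]) -> np.ndarray:
    """Stand-in for a fitted encoding model that maps a word
    sequence to a predicted voxel pattern."""
    rng = np.random.default_rng(abs(hash(tuple(words))) % (2**32))
    return rng.normal(size=1000)

def score(predicted: np.ndarray, measured: np.ndarray) -> float:
    """Similarity between predicted and measured activity;
    correlation is one reasonable (assumed) choice."""
    return float(np.corrcoef(predicted, measured)[0, 1])

def decode(measured_scans: list[np.ndarray], beam_width: int = 3) -> list[str]:
    """Beam decoding: keep the candidate sentences whose predicted
    brain responses best match each successive measured scan."""
    beam: list[list[str]] = [[]]
    for scan in measured_scans:
        candidates = [c for prefix in beam
                      for c in propose_continuations(prefix, beam_width)]
        candidates.sort(key=lambda c: score(predict_brain_response(c), scan),
                        reverse=True)
        beam = candidates[:beam_width]
    return beam[0]  # best-scoring guess at what the subject heard
```

Run over a sequence of scans, decode() returns the best-scoring word sequence, the system's guess at what the subject was hearing.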

When the team applied the model to new podcasts, the AI brain-scan decoder was able to determine what participants were hearing from their brain activity alone, often identifying exact words and phrases.

read more at technologyreview.com