Suppose you could read someone’s thoughts. What would you do? Avoid arguments? Strike up new friendships? Or perhaps extract a story from someone’s mind?
The latter has now actually happened. Researchers at the University of Texas at Austin used artificial intelligence (AI) to translate people’s brain activity into text. And it turns out to be reasonably accurate.
This isn’t idle speculation: this is a published study in Nature Neuroscience.
The mind reader in question is an algorithm called a semantic decoder. It’s based on a language model similar to the ones powering large-language-model chatbots such as ChatGPT and Google’s Bard, although the decoder did need to be trained before it could translate brain activity.
Each of the seven participants had to lie in an fMRI brain scanner for up to 16 hours while listening intently to podcasts. The AI thus learned to associate words and phrases with particular patterns of brain activity.
Then came the test. The participants had to lie down in the brain scanner again, but this time they listened to a story. The semantic decoder then generated words based on their brain activity.
So, how well did it work? The results tended not to be a literal representation of the story, but they were reasonably close. For example, “I don’t have my driver’s licence yet” translated as “She has not even started to learn to drive yet”.
As ever with AI, ethical issues rear their heads. What about privacy, for example? The researchers have a rebuttal to that: they say you have to undergo intensive training before the computer can extract the right data from your head.
As such, it’s impossible to “read” the thoughts of a random passerby. Moreover, you can easily fool the computer by thinking about something else while listening to a story.
Making it portable
In short, at this point we don’t have to worry. But the researchers do want to develop the brain activity decoder further. For example, by making it portable (which could be done by using another brain imaging technique instead of fMRI) and by training it for more hours so that it learns the associations even better.
And the decoder does have useful applications. People who cannot speak (after a stroke or because of the muscle disease ALS, for example) could use it to communicate with others. Or you could control devices with your thoughts.
How AI mind-reading works
“The researchers cleverly use modern AI techniques to decode language from fMRI measurements,” said neurocomputer scientist Harm Brouwer of Tilburg University.
“Whereas previously these kinds of brain-computer interface models could only decode letter sequences, words or short phrases, this new method is able to extract relatively long pieces of text from brain activity.”
University of Twente neuroscientist Michel van Putten is also impressed with the study. “The researchers show for the first time that it is possible to ‘eavesdrop’ on language in a non-invasive way (i.e. without an implant).”
One of the challenges, he said, was the average speaking rate of English: just over two words per second. “fMRI measures over periods of about ten seconds. So the brain activity in that period is caused by as many as 20 words. All the more clever that the decoder still managed to produce a reasonable text. It did so by, among other things, generating the most likely word sequences.”
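The “most likely word sequences” idea can be illustrated with a beam search: at each step, keep only the few highest-scoring candidate sequences and extend them word by word. The sketch below is purely illustrative and uses a hypothetical toy bigram table as its language model; the actual decoder in the study scores continuations with a large neural language model and checks them against measured brain activity, which this sketch does not attempt to reproduce.

```python
import math

# Hypothetical bigram probabilities P(next word | current word)
# for a tiny made-up vocabulary (not from the study).
BIGRAMS = {
    "<s>":     {"she": 0.6, "i": 0.4},
    "she":     {"has": 0.7, "is": 0.3},
    "i":       {"have": 0.8, "am": 0.2},
    "has":     {"not": 0.9, "started": 0.1},
    "have":    {"not": 0.9, "started": 0.1},
    "not":     {"started": 1.0},
    "started": {"driving": 1.0},
    "driving": {"</s>": 1.0},
    "is":      {"driving": 1.0},
    "am":      {"driving": 1.0},
}

def beam_search(beam_width=2, max_len=8):
    """Keep the `beam_width` highest-scoring partial sequences at each step."""
    beams = [(0.0, ["<s>"])]  # (log-probability, word sequence so far)
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == "</s>":          # finished sequence: carry it forward
                candidates.append((score, seq))
                continue
            for word, p in BIGRAMS.get(seq[-1], {}).items():
                candidates.append((score + math.log(p), seq + [word]))
        # Prune to the best `beam_width` hypotheses.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
    _, best_seq = beams[0]
    return best_seq[1:-1]  # strip the <s> and </s> markers

print(" ".join(beam_search()))  # → she has not started driving
```

The key property is that the search never commits to a single word early on: a slightly less likely word can still win if it leads to a much more probable continuation, which is also why the decoder’s output paraphrases rather than transcribes.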
Applications of AI mind-reading
Sadly, applications for patients with communication problems are still a long way off, according to van Putten. “Partly because in these people the brain areas involved in language may function differently. But it is certainly interesting.”
Finally, Brouwer likes that the team is also trying to find out whether specific brain regions contribute to successful decoding. “Despite finding no major differences, this perspective offers a much-needed breath of fresh air in the AI world of neuroscience.”
Sources: Nature Neuroscience, University of Texas at Austin via EurekAlert!
This article was originally published on KIJK Magazine and can be read in Dutch here.