Scientists map brain to decode inner thoughts
What if a map of the brain could help us decode people's inner thoughts?
Scientists at the University of California, Berkeley, have taken a step in that direction by building a "semantic atlas" that shows in vivid colors and multiple dimensions how the human brain organizes language. The atlas identifies brain areas that respond to words that have similar meanings.
The work was funded by the National Science Foundation (NSF). Volunteers' brains were scanned while they listened to stories from The Moth Radio Hour, a public radio show in which people recount humorous and poignant autobiographical stories. Results of the scans show that at least one-third of each participant's cerebral cortex, including areas dedicated to high-level thought (cognition), was involved in processing the language they heard.
Notably, the study found that different people have similar language maps. "The similarity in semantic topography across different subjects is really surprising," said study lead author Alex Huth, a postdoctoral researcher in neuroscience at UC Berkeley.
The study might eventually help give voice to those who cannot speak: stroke sufferers, people with brain damage, and people with motor neuron diseases such as ALS that impair muscle control. Charting how language is organized in the brain brings the decoding of these patients' inner dialogue a step closer to reality.
Clinicians hope someday to track the brain activity of patients who have difficulty communicating and match that data to semantic language maps to determine what the patients are trying to express. A decoder might even one day translate what you say into another language as you speak.
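The envisioned matching step can be pictured as a nearest-neighbor search: compare a patient's measured activity pattern against stored per-concept patterns and pick the best correlate. This is only an illustrative sketch, not the researchers' method; the concepts, region count, and numeric values below are all hypothetical.

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length activity patterns."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def decode(pattern, templates):
    """Return the concept whose stored pattern best matches the measurement."""
    return max(templates, key=lambda concept: correlation(pattern, templates[concept]))

# Hypothetical per-concept activity patterns over five brain regions.
templates = {
    "food":   [0.9, 0.1, 0.3, 0.7, 0.2],
    "family": [0.2, 0.8, 0.6, 0.1, 0.5],
    "danger": [0.4, 0.3, 0.9, 0.2, 0.8],
}
observed = [0.85, 0.15, 0.35, 0.65, 0.25]  # a noisy new measurement
print(decode(observed, templates))
```

Real decoding would involve far more brain regions and continuous semantic features rather than a handful of discrete concepts, but the principle of matching measured activity to a semantic map is the same.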
"To be able to map out semantic representations at this level of detail is a stunning accomplishment," said Kenneth Whang, a program director in the NSF Information and Intelligent Systems division. "In addition, they are showing how data-driven computational methods can help us understand the richness and complexity of the brain that we associate with human cognitive processes."
The brain scan data was then matched against time-coded, phoneme-transcribed versions of the same stories. Phonemes are units of sound that distinguish one word from another. That information was fed into an algorithm that scored words according to how closely they are related in meaning, whether as individual words, phrases, or sentences. The semantic results were then projected onto an image of the flattened cortices of the brain's left and right hemispheres.
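The scoring idea, that words appearing in similar contexts tend to be related in meaning, can be illustrated with a toy co-occurrence model. This is a minimal sketch of the general technique, not the study's actual algorithm; the tiny corpus and the window size are invented for illustration.

```python
import math
from collections import defaultdict

def cooccurrence_vectors(sentences, window=2):
    """Count, for each word, which other words appear within a small window of it."""
    vecs = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        words = sent.lower().split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if i != j:
                    vecs[w][words[j]] += 1
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (1.0 = same contexts)."""
    shared = set(u) & set(v)
    dot = sum(u[k] * v[k] for k in shared)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

corpus = [
    "the dog chased the cat",
    "the cat chased the mouse",
    "the senator proposed the bill",
    "the senator debated the bill",
]
vecs = cooccurrence_vectors(corpus)
# Words used in similar contexts ("dog" and "cat") score as closer
# than words used in different contexts ("dog" and "senator").
print(cosine(vecs["dog"], vecs["cat"]))
print(cosine(vecs["dog"], vecs["senator"]))
```

Scaled up to a large corpus and combined with brain recordings, scores like these let researchers ask which cortical areas respond to clusters of related meanings.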
Added Huth: "Our semantic models are good at predicting responses to language in several big swaths of cortex. But, we also get fine-grained information that tells us what kind of data is represented in each brain area. That's why these maps are so exciting and hold so much potential."
Senior author Jack Gallant, a UC Berkeley neuroscientist, said that although the maps are broadly consistent across individuals, "There are also substantial individual differences. We will need to conduct further studies across a larger, more diverse sample of people before we will be able to map these individual differences in detail."
In addition to Huth and Gallant, co-authors of the paper are Wendy de Heer, Frederic Theunissen and Thomas Griffiths, all at UC Berkeley.
This NSF-funded project is an example of how NSF invests in the frontiers of brain research.
Image Credit: University of California, Berkeley/Nature Video