Whether Listening or Reading, Your Brain Processes Words Similarly
In today’s technologically advanced world, people are taking advantage of different ways to absorb information more than ever. Old-fashioned reading is becoming a rarer pastime amid the hustle-and-bustle lifestyle many people are adopting.
With the emergence of audiobooks, podcasts and audio texts, researchers are finding that people process semantic information very similarly whether they listen to or read the same material.
In the University of California, Berkeley’s latest brain-mapping study, nine individuals listened to and read stories from “The Moth Radio Hour” podcast series. During each session, researchers scanned the participants’ brains using functional MRI and compared the brain activity recorded while listening with the activity recorded while reading.
The researchers then used statistical modeling to arrange thousands of words on maps according to their semantic relationships. For example, “bear,” “cat” and “fish” can all be found under the “animals” category. These maps, which covered about one-third of the cerebral cortex, enabled researchers to accurately predict which words would activate which parts of the brain. The results are visible in an interactive, 3D, color-coded map, where words are grouped into different categories, such as visual, tactile, numeric, locational, violent, mental, emotional, and social.
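The study’s actual statistical modeling is far more sophisticated, but the core idea of grouping words by semantic relationship can be illustrated with a toy sketch. Everything below is invented for illustration: the feature vectors, the categories and the words are hypothetical, not data from the Berkeley study.

```python
import math

# Hypothetical feature vectors for a handful of words.
# Dimensions (animacy, edibility, danger) are made up for this sketch.
WORD_VECTORS = {
    "bear":   (0.9, 0.2, 0.8),
    "cat":    (0.9, 0.1, 0.2),
    "fish":   (0.8, 0.9, 0.1),
    "hammer": (0.0, 0.0, 0.4),
    "saw":    (0.0, 0.0, 0.5),
}

# Hypothetical category centroids in the same feature space.
CATEGORY_CENTROIDS = {
    "animals": (0.9, 0.4, 0.3),
    "tools":   (0.0, 0.0, 0.5),
}

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def categorize(word):
    """Assign a word to the semantic category whose centroid it is closest to."""
    vec = WORD_VECTORS[word]
    return max(CATEGORY_CENTROIDS,
               key=lambda c: cosine(vec, CATEGORY_CENTROIDS[c]))

if __name__ == "__main__":
    for w in WORD_VECTORS:
        print(w, "->", categorize(w))
```

In this spirit, “bear,” “cat” and “fish” land in the “animals” group because their vectors point in a similar direction, which is roughly how semantically related words end up near one another on the maps.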
The brain activity patterns from listening and reading were nearly identical. The researchers expected differences in how readers versus listeners processed semantic information, so the close match in meaning representation came as a surprise.
“We knew that a few brain regions were activated similarly when you hear a word and read the same word, but I was not expecting such strong similarities in the meaning representation across a large network of brain regions in both these sensory modalities,” says Fatma Deniz, a postdoctoral researcher in neuroscience in the Gallant Lab at the University of California, Berkeley and a former fellow with the Berkeley Institute for Data Science.
The semantic maps could reveal how language is processed by those with dyslexia, auditory processing disorders, epilepsy, stroke and other brain injuries that can impair speech. Children in school, and adults who have struggled to process information their entire lives, can now explore how they best process and retain information.
If you have a client with any of these conditions, think about how you could open up their homework, or even the coaching session itself, to different ways of processing. For example, if your client has dyslexia, you may consider having them complete a homework assignment by listening and speaking rather than by reading and writing.