Listening or lip-reading? It’s down to brainwaves
UNIGE researchers have discovered that neural oscillations determine whether the brain chooses eyes or ears to interpret speech.
Listening or lip-reading? Brainwaves are involved in this process. © UNIGE/Thézé
To decipher what a person is telling us, we rely on what we hear as well as on what we see, observing lip movements and facial expressions. Until now, it was not known how the brain chooses between auditory and visual stimuli. A research group at the University of Geneva (UNIGE), funded by the Swiss National Science Foundation (SNSF), has now shown that neural oscillations (brainwaves) are involved in this process. More precisely, it is the phase of these waves, i.e. the point in the wave cycle reached just before a specific instant, that determines which sensory channel will contribute most to the perception of speech. The results of the study, led by UNIGE neurologist Pierre Mégevand, have just been published in the journal Science Advances.
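The release does not include the study's analysis code, but the key concept, the phase of a brainwave, can be made concrete. The Python sketch below is purely illustrative: the synthetic signal, the sampling rate and the 4–8 Hz band are assumptions, not details from the study. It filters a noisy signal down to one oscillatory band and reads off the phase, i.e. the point in the cycle, at each instant using the standard Hilbert-transform approach.

```python
# Minimal sketch (not the study's pipeline): extracting the instantaneous
# phase of a band-limited "brainwave" from a raw signal.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)    # two seconds of synthetic "EEG"
rng = np.random.default_rng(0)
raw = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)

# Isolate one oscillatory band (4-8 Hz; band chosen for illustration only).
b, a = butter(4, [4, 8], btype="band", fs=fs)
theta = filtfilt(b, a, raw)

# The angle of the analytic signal is the phase: where in its cycle the
# wave sits at each instant, from -pi to pi.
phase = np.angle(hilbert(theta))
print(f"phase at t = 1.0 s: {phase[fs]:.2f} rad")
```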
For their study, Pierre Mégevand and his colleagues Raphaël Thézé and Anne-Lise Giraud used an innovative device based on audiovisual illusions. Subjects sat in front of a screen on which a virtual character uttered French phrases that could be heard in two ways, such as “Il n’y a rien à boire / Il n’y a rien à voir” (“There’s nothing to drink / There’s nothing to see”; an English example would be “The item was in the vase/base”). In some statements, the researchers introduced a conflict between what the subjects saw and what they heard: for example, the character pronounced a “b”, but her lips formed a “v”. The subjects were asked to repeat the statement they had understood while electrodes recorded their brain’s electrical activity.
Audiovisual illusions
The researchers observed that when the auditory and visual information matched, the subjects repeated the correct statement most of the time. When the two conflicted, however, perception followed one cue or the other: when subjects heard a “v” but saw a “b”, the auditory cue dominated perception in about two-thirds of cases; in the opposite situation, the visual cue guided perception.
The sensory channel is determined in advance
The researchers then compared these behavioural results with the recordings of the brain’s electrical activity. They observed that about 300 milliseconds before the auditory and visual information either matched or conflicted, the phase of the brainwave in the posterior temporal and occipital cortex already differed between subjects who went on to rely on the visual cue and those who went on to rely on the auditory cue.
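The statistical analysis itself is not described in this release; the sketch below only illustrates the logic of such a comparison. Everything in it is a stand-in: the random phase data, the single electrode, the trial counts and the simple circular-mean contrast are assumptions made for illustration, not the study’s method.

```python
# Hypothetical sketch of the comparison described above: does the phase
# ~300 ms before the critical sound differ between trials resolved by
# ear versus by eye? All data and parameters here are illustrative.
import numpy as np

fs = 1000                        # sampling rate in Hz (assumed)
onset = 1500                     # sample index of the critical sound (assumed)
lag = onset - int(0.3 * fs)      # ~300 ms earlier

rng = np.random.default_rng(1)
# Stand-ins for per-trial phase traces from one posterior electrode:
# rows = trials, columns = samples; here just random phases.
phase_auditory = rng.uniform(-np.pi, np.pi, (40, 2000))
phase_visual = rng.uniform(-np.pi, np.pi, (40, 2000))

def circular_mean(angles):
    """Mean direction of a set of angles (standard circular statistic)."""
    return np.angle(np.mean(np.exp(1j * angles)))

mu_a = circular_mean(phase_auditory[:, lag])
mu_v = circular_mean(phase_visual[:, lag])
# Wrap the difference into (-pi, pi] before reporting it.
diff = np.angle(np.exp(1j * (mu_a - mu_v)))
print(f"prestimulus phase difference: {diff:.2f} rad")
```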
“We have known since the 1970s that in certain situations, the brain seems to choose visual cues over auditory cues, and even more so when the auditory signal is impeded, for example when there is ambient noise. We can now show that brainwaves are involved in this process. However, their exact role is still a mystery,” says Mégevand.
4 Nov 2020