MEG activity in visual and auditory cortices represents acoustic speech-related information during silent lip reading

Bröhl, F., Keitel, A. and Kayser, C. (2022) MEG activity in visual and auditory cortices represents acoustic speech-related information during silent lip reading. eNeuro, 9(3), ENEURO.0209-22.2022. (doi: 10.1523/ENEURO.0209-22.2022) (PMID:35728955) (PMCID:PMC9239847)

275798.pdf (Published Version, 1MB), available under a Creative Commons Attribution License.

Abstract

Speech is an intrinsically multisensory signal, and seeing the speaker’s lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension. Previous work debated whether lip signals are mainly processed along the auditory pathways or whether the visual system directly implements speech-related processes. To probe this, we systematically characterized dynamic representations of multiple acoustic and visual speech-derived features in source-localized MEG recordings that were obtained while participants listened to speech or viewed silent speech. Using a mutual-information framework, we provide a comprehensive assessment of how well temporal and occipital cortices reflect the physically presented signals and unique aspects of acoustic features that were physically absent but may be critical for comprehension. Our results demonstrate that both cortices feature a functionally specific form of multisensory restoration: during lip reading, they reflect unheard acoustic features, independent of co-existing representations of the visible lip movements. This restoration emphasizes the unheard pitch signature in occipital cortex and the speech envelope in temporal cortex, and is predictive of lip-reading performance. These findings suggest that when seeing the speaker’s lips, the brain engages both visual and auditory pathways to support comprehension by exploiting multisensory correspondences between lip movements and spectro-temporal acoustic cues.
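The abstract does not spell out the mutual-information estimator, but a common choice in this literature is the Gaussian-copula estimator (Ince et al., 2017, Human Brain Mapping). Below is a minimal, self-contained sketch of that idea, not the authors' pipeline: the function names (copnorm, gc_mi, gc_cmi) and the toy signals are illustrative assumptions, and the real analysis presumably adds steps such as feature extraction, lagged stimulus-response embeddings, and permutation statistics.

```python
# Sketch of Gaussian-copula mutual information (MI) between a speech feature
# (e.g., the acoustic envelope) and a single source-localized MEG time course.
# All names and the toy data below are illustrative assumptions.
import numpy as np
from scipy.stats import rankdata
from scipy.special import ndtri  # inverse standard-normal CDF

def copnorm(x):
    """Rank-transform a 1-D signal to standard-normal marginals."""
    u = rankdata(x) / (len(x) + 1.0)  # empirical CDF, strictly inside (0, 1)
    return ndtri(u)

def gc_mi(x, y):
    """Gaussian-copula MI (in bits) between two 1-D signals."""
    cx, cy = copnorm(x), copnorm(y)
    r = np.corrcoef(cx, cy)[0, 1]
    return -0.5 * np.log2(1.0 - r ** 2)

def gc_cmi(x, y, z):
    """Conditional MI I(x; y | z) in bits, via the Gaussian partial correlation."""
    cx, cy, cz = copnorm(x), copnorm(y), copnorm(z)
    rxy = np.corrcoef(cx, cy)[0, 1]
    rxz = np.corrcoef(cx, cz)[0, 1]
    ryz = np.corrcoef(cy, cz)[0, 1]
    r_partial = (rxy - rxz * ryz) / np.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
    return -0.5 * np.log2(1.0 - r_partial ** 2)

# Toy data: an MEG trace that tracks both the (unheard) acoustic envelope
# and the visible lip aperture, which are themselves correlated.
rng = np.random.default_rng(0)
lip = rng.standard_normal(2000)                    # lip-aperture time course
envelope = 0.6 * lip + rng.standard_normal(2000)   # correlated acoustic envelope
meg = 0.4 * envelope + 0.3 * lip + rng.standard_normal(2000)

print(f"MI(envelope; MEG)       ~ {gc_mi(envelope, meg):.3f} bits")
print(f"MI(envelope; MEG | lip) ~ {gc_cmi(envelope, meg, lip):.3f} bits")
```

The conditional estimate mirrors the abstract's claim that the restored acoustic representations are "independent of co-existing representations of the visible lip movements": conditioning on the lip signal removes the part of the MEG-envelope dependence that the visible lip movements already account for.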

Item Type: Articles
Additional Information: This work was supported by the United Kingdom Biotechnology and Biological Sciences Research Council (BBSRC) Grant BB/L027534/1 and the European Research Council ERC-2014-CoG Grant No. 646657.
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Kayser, Professor Christoph and Keitel, Dr Anne
Authors: Bröhl, F., Keitel, A., and Kayser, C.
College/School: College of Medical, Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name: eNeuro
Publisher: Society for Neuroscience
ISSN: 2373-2822
ISSN (Online): 2373-2822
Published Online: 21 June 2022
Copyright Holders: Copyright © 2022 Bröhl et al.
First Published: First published in eNeuro 9(3): ENEURO.0209-22.2022
Publisher Policy: Reproduced under a Creative Commons License


Project Code | Award No | Project Name | Principal Investigator | Funder's Name | Funder Ref | Lead Dept
170507 | | Pathways and mechanisms underlying the visual enhancement of hearing in challenging environments | Philippe Schyns | Biotechnology and Biological Sciences Research Council (BBSRC) | BB/L027534/1 | Centre for Cognitive Neuroimaging
172114 | | DynaSens | Christoph Kayser | European Research Council (ERC) | 646657 | Centre for Cognitive Neuroimaging