Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features

Keitel, A. , Gross, J. and Kayser, C. (2018) Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features. PLoS Biology, 16(3), e2004473. (doi: 10.1371/journal.pbio.2004473) (PMID:29529019) (PMCID:PMC5864086)

158910.pdf - Published Version
Available under License Creative Commons Attribution.



During online speech processing, our brain tracks the acoustic fluctuations in speech at different timescales. Previous research has focused on generic timescales (for example, delta or theta bands) that are assumed to map onto linguistic features such as prosody or syllables. However, given the high intersubject variability in speaking patterns, such a generic association between the timescales of brain activity and speech properties can be ambiguous. Here, we analyse speech tracking in source-localised magnetoencephalographic data by directly focusing on timescales extracted from statistical regularities in our speech material. This revealed widespread significant tracking at the timescales of phrases (0.6–1.3 Hz), words (1.8–3 Hz), syllables (2.8–4.8 Hz), and phonemes (8–12.4 Hz). Importantly, when examining its perceptual relevance, we found stronger tracking for correctly comprehended trials in the left premotor (PM) cortex at the phrasal scale as well as in left middle temporal cortex at the word scale. Control analyses using generic bands confirmed that these effects were specific to the speech regularities in our stimuli. Furthermore, we found that the phase at the phrasal timescale coupled to power at beta frequency (13–30 Hz) in motor areas. This cross-frequency coupling presumably reflects top-down temporal prediction in ongoing speech perception. Together, our results reveal specific functional and perceptually relevant roles of distinct tracking and cross-frequency processes along the auditory–motor pathway.

Item Type: Articles
Glasgow Author(s) Enlighten ID: Keitel, Dr Anne and Kayser, Professor Christoph and Gross, Professor Joachim
Creator Roles:
Keitel, A.: Conceptualization, Formal analysis, Investigation, Methodology, Validation, Visualization, Writing – original draft, Writing – review and editing
Gross, J.: Funding acquisition, Supervision, Writing – review and editing
Kayser, C.: Conceptualization, Formal analysis, Funding acquisition, Methodology, Project administration, Supervision, Writing – review and editing
Authors: Keitel, A., Gross, J., and Kayser, C.
College/School: College of Medical Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name: PLoS Biology
Publisher: Public Library of Science
ISSN (Online): 1545-7885
Copyright Holders: Copyright © 2018 Keitel et al.
First Published: First published in PLoS Biology 16(3): e2004473
Publisher Policy: Reproduced under a Creative Commons License
Data DOI: 10.5061/dryad.1qq7050


| Project Code | Award No | Project Name | Principal Investigator | Funder's Name | Funder Ref | Lead Dept |
| --- | --- | --- | --- | --- | --- | --- |
| 658701 | | Pathways and mechanisms underlying the visual enhancement of hearing in challenging environments. | Christoph Kayser | Biotechnology and Biological Sciences Research Council (BBSRC) | BB/L027534/1 | INP - CENTRE FOR COGNITIVE NEUROIMAGING |
| 670551 | | DynaSens | Christoph Kayser | European Research Council (ERC) | 646657 | INP - CENTRE FOR COGNITIVE NEUROIMAGING |
| 597051 | | Natural and modulated neural communication: State-dependent decoding and driving of human Brain Oscillations. | Joachim Gross | Wellcome Trust (WELLCOTR) | 098433/Z/12/Z | INP - CENTRE FOR COGNITIVE NEUROIMAGING |