Decoding sound and imagery content in early visual cortex

Vetter, P., Smith, F. W. and Muckli, L. (2014) Decoding sound and imagery content in early visual cortex. Current Biology, 24(11), pp. 1256-1262. (doi: 10.1016/j.cub.2014.04.020) (PMID:24856208) (PMCID:PMC4046224)

93955.pdf - Published Version (1MB)
Available under License Creative Commons Attribution.

Abstract

Human early visual cortex was traditionally thought to process simple visual features such as orientation, contrast, and spatial frequency via feedforward input from the lateral geniculate nucleus (e.g., [1]). However, the role of nonretinal influence on early visual cortex has so far been insufficiently investigated, despite much evidence that feedback connections greatly outnumber feedforward connections [2–5]. Here, we explored in five fMRI experiments how information originating from audition and imagery affects brain activity patterns in early visual cortex in the absence of any feedforward visual stimulation. We show that category-specific information from both complex natural sounds and imagery can be read out from early visual cortex activity in blindfolded participants. The coding of nonretinal information in the activity patterns of early visual cortex is common across actual auditory perception and imagery and may be mediated by higher-level multisensory areas. Furthermore, this coding is robust to mild manipulations of attention and working memory but is affected by orthogonal, cognitively demanding visuospatial processing. Crucially, the information fed down to early visual cortex is category specific and generalizes to sound exemplars of the same category, providing evidence for abstract information feedback rather than precise pictorial feedback. Our results suggest that early visual cortex receives nonretinal input from other brain areas when that input is generated by auditory perception and/or imagery, and that this input carries common abstract information. Our findings are compatible with feedback of predictive information to the earliest visual input level (e.g., [6]), in line with predictive coding models [7–10].
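The "read out" described above refers to multivoxel pattern analysis (MVPA): training a classifier on fMRI activity patterns and testing whether it can predict the stimulus category above chance. The following is a minimal illustrative sketch with synthetic data, not the authors' actual pipeline; the voxel count, trial counts, category labels, and classifier choice are all assumptions for demonstration only.

```python
# Hypothetical MVPA-style decoding sketch. All data are synthetic; the ROI
# size, trial numbers, sound categories, and classifier are illustrative
# assumptions, not the published analysis.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_voxels = 100          # voxels in an early-visual-cortex ROI (assumed)
n_trials_per_cat = 30   # trials per sound category (assumed)
categories = ["forest", "traffic", "people"]  # illustrative categories

# Give each category its own mean voxel pattern plus trial-by-trial noise.
X = np.vstack([
    rng.normal(loc=rng.normal(0.0, 1.0, n_voxels), scale=1.0,
               size=(n_trials_per_cat, n_voxels))
    for _ in categories
])
y = np.repeat(np.arange(len(categories)), n_trials_per_cat)

# Cross-validated linear classifier: accuracy above chance (1/3 here)
# indicates that category information is present in the voxel patterns.
scores = cross_val_score(LinearSVC(max_iter=5000), X, y, cv=5)
chance = 1.0 / len(categories)
print(f"mean decoding accuracy {scores.mean():.2f} vs chance {chance:.2f}")
```

In the study itself the logic is the same but the patterns come from BOLD responses in retinotopically defined V1–V3 while blindfolded participants listen to sounds or imagine scenes; cross-validation guards against the classifier merely memorizing individual trials.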

Item Type:Articles
Status:Published
Refereed:Yes
Glasgow Author(s) Enlighten ID:Muckli, Professor Lars and Vetter, Dr Petra and Smith, Dr Fraser
Authors: Vetter, P., Smith, F. W., and Muckli, L.
College/School:College of Medical, Veterinary and Life Sciences > Institute of Neuroscience and Psychology
Journal Name:Current Biology
Publisher:Elsevier
ISSN:0960-9822
ISSN (Online):1879-0445
Copyright Holders:Copyright © 2014 The Authors
First Published:First published in Current Biology 24(11):1256-1262
Publisher Policy:Reproduced under a Creative Commons License


Project Code:474481
Project Name:Brain processes predicting forthcoming perception - cortical feedback and visual predictions
Principal Investigator:Lars Muckli
Funder's Name:Biotechnology and Biological Sciences Research Council (BBSRC)
Funder Ref:BB/G005044/1
Lead Dept:INP - CENTRE FOR COGNITIVE NEUROIMAGING