Contextual modulation of primary visual cortex by auditory signals

Petro, L.S., Paton, A.T. and Muckli, L. (2017) Contextual modulation of primary visual cortex by auditory signals. Philosophical Transactions of the Royal Society B: Biological Sciences, 372(1714), 20160104. (doi:10.1098/rstb.2016.0104) (PMID:28044015) (PMCID:PMC5206272)

Abstract

Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue ‘Auditory and visual scene analysis’.

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: PATON, Angus and Petro, Dr Lucy and Muckli, Professor Lars
Authors: Petro, L.S., Paton, A.T., and Muckli, L.
College/School: College of Medical, Veterinary and Life Sciences > Institute of Neuroscience and Psychology
Journal Name: Philosophical Transactions of the Royal Society B: Biological Sciences
Publisher: The Royal Society
ISSN: 0962-8436
ISSN (Online): 1471-2970
Published Online: 02 January 2017
Copyright Holders: Copyright © 2017 The Authors
First Published: First published in Philosophical Transactions of the Royal Society B: Biological Sciences 372(1714): 20160104
Publisher Policy: Reproduced under a Creative Commons License


Project Code: 593891
Project Name: Brain reading of contextual feedback and predictions - BrainReadFBPredCode
Principal Investigator: Lars Muckli
Funder's Name: European Research Council (ERC)
Funder Ref: ERC-2012-Stg-311751
Lead Dept: INP - CENTRE FOR COGNITIVE NEUROIMAGING