Audio-visual synchrony and spatial attention enhance processing of dynamic visual stimulation independently and in parallel: A frequency-tagging study

Covic, A., Keitel, C., Porcu, E., Schröger, E. and Müller, M. M. (2017) Audio-visual synchrony and spatial attention enhance processing of dynamic visual stimulation independently and in parallel: A frequency-tagging study. NeuroImage, 161, pp. 32-42. (doi: 10.1016/j.neuroimage.2017.08.022) (PMID:28802870)

Abstract

The neural processing of a visual stimulus can be facilitated by attending to its position or by a co-occurring auditory tone. Using frequency-tagging, we investigated whether facilitation by spatial attention and audio-visual synchrony rely on similar neural processes. Participants attended to one of two flickering Gabor patches (14.17 and 17 Hz) located in opposite lower visual fields. Gabor patches further “pulsed” (i.e. showed smooth spatial frequency variations) at distinct rates (3.14 and 3.63 Hz). Frequency-modulating an auditory stimulus at the pulse rate of one of the visual stimuli established audio-visual synchrony. Flicker and pulsed stimulation elicited stimulus-locked rhythmic electrophysiological brain responses that allowed us to track the neural processing of simultaneously presented stimuli. These steady-state responses (SSRs) were quantified in the spectral domain to examine visual stimulus processing under conditions of synchronous vs. asynchronous tone presentation and when respective stimulus positions were attended vs. unattended. Strikingly, unique patterns of effects on pulse- and flicker-driven SSRs indicated that spatial attention and audio-visual synchrony facilitated early visual processing in parallel and via different cortical processes. We found attention effects to resemble the classical top-down gain effect, facilitating both flicker- and pulse-driven SSRs. Audio-visual synchrony, in turn, only amplified synchrony-producing stimulus aspects (i.e. pulse-driven SSRs), possibly highlighting the role of temporally co-occurring sights and sounds in bottom-up multisensory integration.
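The abstract describes quantifying steady-state responses (SSRs) in the spectral domain at the tagged stimulation frequencies. The sketch below illustrates the general idea with NumPy: an FFT of an epoch and readout of the amplitude at each tagging frequency. This is a minimal illustration, not the authors' analysis pipeline; the sampling rate, epoch length, and signal amplitudes are assumed values chosen so that the tagged frequencies fall on exact FFT bins.

```python
import numpy as np

def ssr_amplitude(signal, fs, freq):
    """Single-sided spectral amplitude at `freq` (Hz) via FFT.
    Accurate when the epoch length places `freq` on an exact FFT bin."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / n  # single-sided scaling
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

# Simulated EEG-like epoch: a flicker-driven and a pulse-driven SSR plus noise.
# All parameters below are illustrative assumptions.
rng = np.random.default_rng(0)
fs = 500.0                       # sampling rate (Hz)
t = np.arange(0, 100, 1.0 / fs)  # 100-s epoch -> 0.01 Hz resolution
eeg = (1.0 * np.sin(2 * np.pi * 14.17 * t)   # flicker-driven SSR (14.17 Hz)
       + 0.5 * np.sin(2 * np.pi * 3.63 * t)  # pulse-driven SSR (3.63 Hz)
       + 0.2 * rng.standard_normal(t.size))  # broadband noise

for f in (14.17, 3.63):
    print(f"SSR amplitude at {f} Hz: {ssr_amplitude(eeg, fs, f):.2f}")
```

With the tagging frequencies isolated on separate FFT bins, the amplitude at each bin indexes the processing of its corresponding stimulus, which is what allows concurrent tracking of several simultaneously presented stimuli.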

Item Type:Articles
Status:Published
Refereed:Yes
Glasgow Author(s) Enlighten ID:Keitel, Dr Christian
Authors: Covic, A., Keitel, C., Porcu, E., Schröger, E., and Müller, M. M.
College/School:College of Medical, Veterinary and Life Sciences > School of Psychology & Neuroscience
Journal Name:NeuroImage
Publisher:Elsevier
ISSN:1053-8119
ISSN (Online):1095-9572
Published Online:09 August 2017
Copyright Holders:Copyright © 2017 Elsevier
First Published:First published in NeuroImage 161: 32-42
Publisher Policy:Reproduced in accordance with the copyright policy of the publisher
