Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality

Vazquez-Alvarez, Y., Aylett, M. P., Brewster, S. A., von Jungenfeld, R. and Virolainen, A. (2016) Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality. ACM Transactions on Computer-Human Interaction, 23(1), 3. (doi:10.1145/2829944)

110776.pdf - Accepted Version (5MB)

Abstract

Auditory interfaces offer a solution to the problem of effective eyes-free mobile interaction. In this article, we investigate the use of multilevel auditory displays to enable eyes-free mobile interaction with indoor location-based information in non-guided audio-augmented environments. A top-level exocentric sonification layer advertises information in a gallery-like space. A secondary interactive layer is used to evaluate three conditions that varied in the presentation (sequential versus simultaneous) and spatialisation (non-spatialised versus egocentric/exocentric) of multiple auditory sources. Our findings show that (1) participants spent significantly more time interacting with spatialised displays; (2) using the same design for the primary and the interactive secondary display (simultaneous exocentric) had a negative impact on user experience, increased workload, and substantially increased participant movement; and (3) the other spatial interactive secondary display designs (simultaneous egocentric, sequential egocentric, and sequential exocentric) increased the time participants spent stationary but had no negative impact on user experience, suggesting a more exploratory experience. A follow-up qualitative and quantitative analysis of user behaviour supports these conclusions. These results provide practical guidelines for designing effective eyes-free interactions for far richer auditory soundscapes.
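The abstract's distinction between egocentric and exocentric spatialisation can be sketched in code. The snippet below is illustrative only and not taken from the paper: the function names, the flat 2-D geometry, and the convention of rendering a source by its azimuth relative to the listener's head are all assumptions. An exocentric source is anchored in the world, so its rendered angle changes as the listener moves or turns; an egocentric source is anchored to the listener's head, so its rendered angle is a fixed offset.

```python
import math


def world_bearing(listener_xy, source_xy):
    """Compass-style bearing (radians) from the listener to a world-anchored source."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    return math.atan2(dx, dy)


def exocentric_azimuth(listener_xy, heading, source_xy):
    """Exocentric spatialisation: the source stays fixed in the world, so the
    rendered azimuth shifts whenever the listener moves or changes heading."""
    rel = world_bearing(listener_xy, source_xy) - heading
    # Wrap the result into (-pi, pi] for a spatial-audio renderer.
    return math.atan2(math.sin(rel), math.cos(rel))


def egocentric_azimuth(head_offset):
    """Egocentric spatialisation: the source is fixed relative to the head, so
    the rendered azimuth is simply the constant offset, wrapped into (-pi, pi]."""
    return math.atan2(math.sin(head_offset), math.cos(head_offset))
```

Facing north (heading 0) towards a source due north, the exocentric azimuth is 0; after the listener turns 90° to the east, the same world-anchored source renders at -90°, whereas an egocentric source keeps its original offset regardless of listener movement.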

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Vazquez-Alvarez, Dr Yolanda and Brewster, Professor Stephen
Authors: Vazquez-Alvarez, Y., Aylett, M. P., Brewster, S. A., von Jungenfeld, R., and Virolainen, A.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: ACM Transactions on Computer-Human Interaction
Publisher: ACM Press
ISSN: 1073-0516
ISSN (Online): 1557-7325
Copyright Holders: Copyright © 2016 ACM
First Published: First published in ACM Transactions on Computer-Human Interaction 23(1):3
Publisher Policy: Reproduced in accordance with the copyright policy of the publisher


Project Code: 457653
Project Name: GAIME - Gestural and Audio Interactions for Mobile Environments
Principal Investigator: Stephen Brewster
Funder's Name: Engineering & Physical Sciences Research Council (EPSRC)
Funder Ref: EP/F023405/1
Lead Dept: COM - COMPUTING SCIENCE