Using non-speech sounds to improve access to 2D tabular numerical information for visually impaired users

Ramloll, R., Yu, W., Riedel, B. and Brewster, S.A. (2001) Using non-speech sounds to improve access to 2D tabular numerical information for visually impaired users. In: 15th Annual Conference of the British HCI Group, Lille, France, 10-14 September 2001, pp. 515-529. ISBN 1852335157




We investigated two solutions for discovering and gaining an overview of 2D numerical tabular data for visually impaired and blind users. In one condition, participants from this target group accessed information in tables (26 rows × 10 columns containing integers between 0 and 100 inclusive) using both speech and non-speech sounds. In the other, the same participants accessed similar tables of the same size through speech only. We found that the opportunity to access data through non-speech sounds resulted in a highly significant decrease in overall subjective workload, specifically in the mental, temporal, performance, and frustration workload categories. This subjective workload assessment was supported by our quantitative results, which showed a highly significant decrease in the average time taken to complete a given data comprehension task and a significant increase in the number of successfully completed tasks.
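The non-speech condition above relies on mapping each cell's numeric value to a sound. A minimal sketch of one common approach (the paper's exact mapping is not given here; the linear value-to-MIDI-pitch scheme and function names below are illustrative assumptions, not the authors' implementation):

```python
def value_to_midi_pitch(value, lo=0, hi=100, pitch_lo=36, pitch_hi=96):
    """Linearly map a cell value in [lo, hi] to a MIDI note number.

    Assumption: higher values map to higher pitches, a typical choice
    in sonification of numeric data (not necessarily the paper's).
    """
    if not lo <= value <= hi:
        raise ValueError(f"value {value} outside [{lo}, {hi}]")
    fraction = (value - lo) / (hi - lo)
    return pitch_lo + round(fraction * (pitch_hi - pitch_lo))


def sonify_row(row):
    """Convert one table row (list of ints 0-100) to a pitch sequence,
    which a synthesizer could then play left to right."""
    return [value_to_midi_pitch(v) for v in row]
```

Playing a row as a left-to-right pitch sequence lets a listener grasp its overall trend (rising, falling, flat) far faster than hearing each number spoken, which is consistent with the reduced task times the study reports.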

Item Type: Conference Proceedings
Keywords: data visualisation, sound graphs, subjective workload assessment, non-speech sounds, 2D tables, speech output
Glasgow Author(s) Enlighten ID: Brewster, Professor Stephen
Authors: Ramloll, R., Yu, W., Riedel, B., and Brewster, S.A.
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
B Philosophy. Psychology. Religion > BF Psychology
College/School: College of Science and Engineering > School of Computing Science
Publisher: Springer Verlag
Copyright Holders: Copyright © 2001 Springer Verlag
First Published: First published in People and computers XV: interactions without frontiers: joint proceedings of HCI 2001 and IHM 2001
Publisher Policy: Reproduced in accordance with the copyright policy of the publisher