Evaluating novice and expert users on handheld video retrieval systems

Scott, D., Hopfgartner, F., Guo, J. and Gurrin, C. (2013) Evaluating novice and expert users on handheld video retrieval systems. In: MMM'13: 19th International Conference on Multimedia Modeling, Huangshan, China, 7-9 Jan 2013, pp. 69-78. ISBN 9783642357275 (doi: 10.1007/978-3-642-35728-2_7)

Full text not currently available from Enlighten.

Publisher's URL: http://dx.doi.org/10.1007/978-3-642-35728-2_7

Abstract

Content-based video retrieval systems have been widely associated with desktop environments that are largely complex in nature, targeting expert users and often requiring complex queries. Due to this complexity, interaction with these systems can be a challenge for regular "novice" users. In recent years, a shift can be observed from this traditional desktop environment to that of handheld devices, which requires a different approach to interacting with the user. In this paper, we evaluate the performance of a handheld content-based video retrieval system with both expert and novice users. We show that on this type of device, a simple and intuitive interface, which incorporates the principles of content-based systems while hiding them from the user, attains the same accuracy for both novice and expert users when faced with complex information retrieval tasks. We describe an experiment which utilises the Apple iPad as our handheld medium, in which groups of both expert and novice users run the interactive experiments from the 2010 TRECVid Known-Item Search task. The results indicate that a carefully defined interface can equalise the performance of novice and expert users.

Item Type: Conference Proceedings
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Hopfgartner, Dr Frank
Authors: Scott, D., Hopfgartner, F., Guo, J., and Gurrin, C.
College/School: College of Arts & Humanities > School of Humanities > Information Studies
Publisher: Springer Verlag
ISBN: 9783642357275