Evaluation in the crowd: an introduction

Archambault, D., Purchase, H. C. and Hoßfeld, T. (2017) Evaluation in the crowd: an introduction. In: Archambault, D., Purchase, H. and Hoßfeld, T. (eds.) Evaluation in the Crowd: Crowdsourcing and Human-Centered Experiments. Series: Lecture notes in computer science (10264). Springer: Cham, pp. 1-5. ISBN 9783319664347 (doi: 10.1007/978-3-319-66435-4_1)



Human-centred empirical evaluations play important roles in the fields of human-computer interaction, visualisation, and graphics. The advent of crowdsourcing platforms, such as Amazon Mechanical Turk, has provided a revolutionary methodology for conducting human-centred experiments. Through such platforms, experiments can now collect data from hundreds, even thousands, of participants drawn from a diverse user community over a matter of weeks, greatly increasing the ease of data collection as well as the power and generalisability of experimental results. However, such platforms do not come without their problems: ensuring participant investment in the task, defining experimental controls, and understanding the ethics of deploying such experiments en masse. This book is intended as a primer for computer science researchers who plan to use crowdsourcing technology for human-centred experiments. It focuses on methodological considerations when using crowdsourcing platforms to run such experiments, particularly in the areas of visualisation and of quality of experience (QoE) for online video delivery.

Item Type: Book Section (Introduction)
Glasgow Author(s) Enlighten ID: Purchase, Dr Helen
Authors: Archambault, D., Purchase, H. C., and Hoßfeld, T.
College/School: College of Science and Engineering > School of Computing Science
Published Online: 28 September 2017
