Crowdsourcing Quality Concerns: An Examination of Amazon’s Mechanical Turk

Dupuis, M., Renaud, K. and Searle, R. (2022) Crowdsourcing Quality Concerns: An Examination of Amazon’s Mechanical Turk. In: 23rd Annual Conference on Information Technology Education, Chicago, IL, USA, 21-24 Sep 2022, pp. 127-129. ISBN 9781450393911 (doi: 10.1145/3537674.3555783)

Full text not currently available from Enlighten.

Abstract

Crowdsourcing platforms such as Amazon’s Mechanical Turk (MTurk) have become an effective and frequently used means for researchers to gather data from study participants. They provide a fast, efficient, and cost-effective method for acquiring large amounts of data for a variety of research projects, such as surveys conducted to assess the use of information technology or to better understand cybersecurity perceptions and behaviors. While such crowdsourcing platforms have gained both popularity and acceptance over the past several years, quality concerns remain a significant issue for researchers. This paper examines these issues.

Item Type: Conference Proceedings
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Renaud, Professor Karen and Searle, Professor Rosalind
Authors: Dupuis, M., Renaud, K., and Searle, R.
College/School: College of Science and Engineering > School of Computing Science; College of Social Sciences > Adam Smith Business School > Management
Journal Name: The 23rd Annual Conference on Information Technology Education
Publisher: ACM
ISBN: 9781450393911
Published Online: 21 September 2022
