Mccreadie, R., Macdonald, C. and Ounis, I. (2013) Identifying top news using crowdsourcing. Information Retrieval, 16(2), pp. 179-209. (doi: 10.1007/s10791-012-9186-z)
Abstract
The influential Text REtrieval Conference (TREC) has always relied upon specialist assessors, or occasionally participating groups, to create relevance judgements for the tracks that it runs. Recently, however, crowdsourcing has been championed as a cheap, fast and effective alternative to traditional TREC-like assessments. In 2010, TREC tracks experimented with crowdsourcing for the very first time. In this paper, we report our successful experience in creating relevance assessments for the TREC Blog track 2010 top news stories task using crowdsourcing. In particular, we crowdsourced both real-time newsworthiness assessments for news stories and traditional relevance assessments for blog posts. We conclude that crowdsourcing appears to be not only a feasible, but also a cheap and fast, means to generate relevance assessments. Furthermore, we detail our experiences running the crowdsourced evaluation of the TREC Blog track, discuss the lessons learned, and provide best practices.
| Item Type: | Articles |
| ---|--- |
| Status: | Published |
| Refereed: | Yes |
| Glasgow Author(s) Enlighten ID: | Mccreadie, Dr Richard and Macdonald, Professor Craig and Ounis, Professor Iadh |
| Authors: | Mccreadie, R., Macdonald, C., and Ounis, I. |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| College/School: | College of Science and Engineering > School of Computing Science |
| Journal Name: | Information Retrieval |
| ISSN: | 1386-4564 |
| ISSN (Online): | 1573-7659 |
| Published Online: | 17 February 2013 |