Zhou, K., Lalmas, M., Sakai, T., Cummins, R. and Jose, J. M. (2013) On the Reliability and Intuitiveness of Aggregated Search Metrics. In: 22nd ACM International Conference on Information and Knowledge Management, San Francisco, CA, USA, 27 Oct - 01 Nov 2013, pp. 689-698. ISBN 9781450322638 (doi: 10.1145/2505515.2505691)
Abstract
Aggregating search results from a variety of diverse verticals such as news, images, videos and Wikipedia into a single interface is a popular web search presentation paradigm. Although several aggregated search (AS) metrics have been proposed to evaluate AS result pages, their properties remain poorly understood. In this paper, we compare the properties of existing AS metrics under the assumptions that (1) queries may have multiple preferred verticals; (2) the likelihood of each vertical preference is available; and (3) the topical relevance assessments of results returned from each vertical are available. We compare a wide range of AS metrics on two test collections. Our main criteria of comparison are (1) discriminative power, which represents the reliability of a metric in comparing the performance of systems, and (2) intuitiveness, which represents how well a metric captures the key aspects to be measured (i.e., different aspects of a user's perception of AS result pages). Our study shows that the AS metrics that capture key AS components (e.g., vertical selection) have several advantages over other metrics. This work sheds new light on the further development and application of AS metrics.
| Item Type | Conference Proceedings |
|---|---|
| Status | Published |
| Refereed | Yes |
| Glasgow Author(s) Enlighten ID | Jose, Professor Joemon; Zhou, Mr Ke; Lalmas, Professor Mounia; Cummins, Dr Ronan |
| Authors | Zhou, K., Lalmas, M., Sakai, T., Cummins, R., and Jose, J. M. |
| College/School | College of Science and Engineering > School of Computing Science |
| ISBN | 9781450322638 |