Generative Relevance Feedback with Large Language Models

Mackie, I., Chatterjee, S. and Dalton, J. (2023) Generative Relevance Feedback with Large Language Models. In: SIGIR 2023: 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, Taipei, Taiwan, 23-27 Jul 2023, pp. 2026-2031. ISBN 9781450394086 (doi: 10.1145/3539618.3591992)


Abstract

Current query expansion models use pseudo-relevance feedback (PRF) to improve first-pass retrieval effectiveness; however, this fails when the initial results are not relevant. Instead of building a language model from retrieved results, we propose Generative Relevance Feedback (GRF), which builds probabilistic feedback models from long-form text generated by Large Language Models. We study effective methods for generating text by varying the zero-shot generation subtasks: queries, entities, facts, news articles, documents, and essays. We evaluate GRF on document retrieval benchmarks covering a diverse set of queries and document collections, and the results show that GRF methods significantly outperform previous PRF methods. Specifically, we improve MAP by 5-19% and NDCG@10 by 17-24% compared to RM3 expansion, and achieve state-of-the-art recall across all datasets.
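
As a reading aid, below is a minimal Python sketch of the GRF idea as described in the abstract: generate long-form text for each zero-shot subtask, estimate a unigram feedback model from the generated text, and interpolate it with the original query model, RM3-style. The generate_text() stub, the prompt wording, and the parameters n_terms and alpha are illustrative assumptions, not the authors' implementation.

# Minimal sketch of GRF: build the feedback model from LLM-generated text
# rather than from first-pass retrieved documents (as PRF does).
import re
from collections import Counter

SUBTASKS = ["queries", "entities", "facts", "news article", "document", "essay"]

def generate_text(query: str, subtask: str) -> str:
    """Placeholder for a zero-shot LLM call, e.g. 'Write a {subtask} about: {query}'."""
    return f"{query} {subtask}"  # replace with a real LLM call

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z]+", text.lower())

def feedback_model(texts: list[str]) -> dict[str, float]:
    """Unigram language model estimated over all generated texts."""
    counts = Counter(tok for t in texts for tok in tokenize(t))
    total = sum(counts.values())
    return {term: c / total for term, c in counts.items()}

def grf_expand(query: str, n_terms: int = 20, alpha: float = 0.5) -> dict[str, float]:
    """Interpolate the original query model with the generative feedback model (RM3-style)."""
    generated = [generate_text(query, s) for s in SUBTASKS]
    top = dict(sorted(feedback_model(generated).items(),
                      key=lambda kv: kv[1], reverse=True)[:n_terms])
    q_terms = tokenize(query)
    q_model = {t: 1 / len(q_terms) for t in q_terms}
    return {t: alpha * q_model.get(t, 0.0) + (1 - alpha) * top.get(t, 0.0)
            for t in set(q_model) | set(top)}

if __name__ == "__main__":
    # The weighted expansion terms would then be issued to a sparse retriever
    # such as BM25; that retrieval step is omitted from this sketch.
    print(grf_expand("generative relevance feedback"))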

Item Type: Conference Proceedings
Additional Information: This work is supported by the 2019 Bloomberg Data Science Research Grant and the Engineering and Physical Sciences Research Council grant EP/V025708/1.
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Chatterjee, Dr Shubham and Mackie, Iain and Dalton, Dr Jeff
Authors: Mackie, I., Chatterjee, S., and Dalton, J.
College/School: College of Science and Engineering > School of Computing Science
ISBN: 9781450394086
Copyright Holders: Copyright © 2023 The Authors
First Published: First published in Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023), pp. 2026–2031
Publisher Policy: Reproduced with the permission of the publisher

Project Code: 310549
Award No:
Project Name: Dalton-UKRI-Turing Fellow
Principal Investigator: Jeff Dalton
Funder's Name: Engineering and Physical Sciences Research Council (EPSRC)
Funder Ref: EP/V025708/1
Lead Dept: Computing Science