Effective and Efficient Training for Sequential Recommendation using Recency Sampling

Petrov, A. and Macdonald, C. (2022) Effective and Efficient Training for Sequential Recommendation using Recency Sampling. In: ACM Conference on Recommender Systems (RecSys 2022), Seattle, USA, 18-23 Sep 2022, pp. 81-91. ISBN 9781450392785 (doi: 10.1145/3523227.3546785)

273792.pdf - Accepted Version (884kB)

Abstract

Many modern sequential recommender systems use deep neural networks, which can effectively estimate the relevance of items but require a lot of time to train. Slow training increases expenses, slows product development, and prevents the model from being updated regularly to adapt to changing user preferences. Training such sequential models involves appropriately sampling past user interactions to create a realistic training objective. Existing training objectives have limitations. For instance, next-item prediction never uses the beginning of the sequence as a learning target, thereby potentially discarding valuable data. On the other hand, the item masking used by BERT4Rec is only weakly related to the goal of sequential recommendation, so it requires much more training time to obtain an effective model. Hence, we propose a novel Recency-based Sampling of Sequences training objective that addresses both limitations. We apply our method to various recent and state-of-the-art model architectures, such as GRU4Rec, Caser, and SASRec, and show that the models enhanced with our method can achieve performance exceeding or very close to that of state-of-the-art BERT4Rec, but with much less training time.
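To make the general idea concrete, the minimal sketch below illustrates recency-biased target sampling for one training sequence; the function name, the exponential weighting scheme, and the alpha parameter are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def recency_sampled_example(sequence, alpha=0.8, rng=None):
    # Split one interaction sequence (oldest item first) into an
    # (input, target) training pair. The target position is drawn with
    # probability that decays exponentially towards the start of the
    # sequence; alpha is a hypothetical decay parameter, not a value
    # taken from the paper.
    if rng is None:
        rng = np.random.default_rng()
    n = len(sequence)
    positions = np.arange(1, n)  # position 0 has no preceding input
    weights = alpha ** (n - 1 - positions)
    probs = weights / weights.sum()
    t = rng.choice(positions, p=probs)
    # Unlike plain next-item prediction, earlier items can also serve as
    # targets, so the beginning of the sequence is not discarded.
    return sequence[:t], sequence[t]

# Example: one user's interaction history, oldest item first.
model_input, target = recency_sampled_example([10, 42, 7, 99, 3])

Under this reading of the abstract, every position can act as a target while recent items remain the most likely choices, so each epoch exposes the model to more of the available interaction data than strict next-item prediction does.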

Item Type: Conference Proceedings
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Macdonald, Professor Craig and Petrov, Aleksandr
Authors: Petrov, A., and Macdonald, C.
College/School: College of Science and Engineering > School of Computing Science
ISBN: 9781450392785
Copyright Holders: Copyright © 2022 Association for Computing Machinery
Publisher Policy: Reproduced in accordance with the copyright policy of the publisher
