One Person, One Model, One World: Learning Continual User Representation Without Forgetting

Yuan, F., Zhang, G., Karatzoglou, A., Jose, J., Kong, B. and Li, Y. (2021) One Person, One Model, One World: Learning Continual User Representation Without Forgetting. In: 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21), 11-15 July 2021, pp. 696-705. ISBN 9781450380379 (doi: 10.1145/3404835.3462884)

248187.pdf - Accepted Version (1MB)

Abstract

Learning user representations is a vital technique for effective user modeling and personalized recommender systems. Existing approaches often derive an individual set of model parameters for each task by training on separate data. However, the representation of the same user potentially has some commonalities, such as preference and personality, even across different tasks. As such, these separately trained representations can be suboptimal in performance as well as inefficient in terms of parameter sharing. In this paper, we study how to continually learn user representations task by task, whereby new tasks are learned while reusing part of the parameters from old ones. A new problem arises: when new tasks are trained, previously learned parameters are very likely to be modified, and as a result, an artificial neural network (ANN)-based model may permanently lose its capacity to serve well-trained previous tasks. This issue is termed catastrophic forgetting. To address this issue, we present Conure, the first continual, or lifelong, user representation learner --- i.e., learning new tasks over time without forgetting old ones. Specifically, we propose iteratively removing less important weights of old tasks in a deep user representation model, motivated by the fact that neural network models are usually over-parameterized. In this way, we can learn many tasks with a single model by reusing the important weights and modifying the less important weights to adapt to new tasks. We conduct extensive experiments on two real-world datasets with nine tasks and show that Conure substantially outperforms the standard model that does not purposely preserve such old "knowledge", and performs competitively with, or sometimes better than, models that are trained either individually for each task or simultaneously by merging all task data.
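The abstract's core mechanism is parameter-isolation continual learning: after each task, the most important weights are frozen and reserved for that task, while the remaining over-parameterized capacity is freed for later tasks. The sketch below illustrates this general prune-and-reuse idea in PyTorch, assuming magnitude-based importance and a fixed keep-ratio per task; the function names (`assign_task_weights`, `mask_gradients`) and the keep-ratio are illustrative assumptions, not the exact procedure used in the Conure paper.

```python
# Minimal sketch of prune-and-reuse continual learning for one layer.
# Assumption: weight magnitude is used as the importance score.
import torch

def assign_task_weights(weight: torch.Tensor,
                        owner: torch.Tensor,
                        task_id: int,
                        keep_ratio: float = 0.3) -> torch.Tensor:
    """After training task `task_id`, claim the largest-magnitude free weights
    for it and zero the remaining free weights so later tasks can relearn them.

    weight : trained weight tensor of one layer
    owner  : int tensor of the same shape; 0 = free, t > 0 = owned by task t
    """
    free = owner == 0
    n_keep = int(keep_ratio * free.sum().item())
    if n_keep > 0:
        # Rank the free weights by magnitude and claim the top ones for this task.
        free_magnitudes = weight.abs() * free
        threshold = torch.topk(free_magnitudes.flatten(), n_keep).values[-1]
        owner[free & (weight.abs() >= threshold)] = task_id
    # Weights that remain free are reset; the next task trains them from scratch.
    weight.data[owner == 0] = 0.0
    return owner

def mask_gradients(weight: torch.Tensor, owner: torch.Tensor, task_id: int) -> None:
    """While training task `task_id`, block gradient updates to weights owned by
    earlier tasks; freezing them is what prevents catastrophic forgetting here."""
    if weight.grad is not None:
        weight.grad[(owner > 0) & (owner != task_id)] = 0.0
```

In use, `mask_gradients` would be called after `loss.backward()` and before `optimizer.step()` for every layer, and `assign_task_weights` once a task's training has converged, so each task ends up with its own frozen subset of the shared parameters while old weights can still be read (and reused) by new tasks.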

Item Type:Conference Proceedings
Additional Information:Published in Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’21), July 11–15, 2021, Virtual Event, Canada. ACM, New York, NY, USA, pp. 696-705.
Status:Published
Refereed:Yes
Glasgow Author(s) Enlighten ID:Jose, Professor Joemon and Yuan, Fajie
Authors: Yuan, F., Zhang, G., Karatzoglou, A., Jose, J., Kong, B., and Li, Y.
College/School:College of Science and Engineering > School of Computing Science
Journal Name:Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval
Publisher:ACM
ISBN:9781450380379
Copyright Holders:Copyright © 2021 Association for Computing Machinery
First Published:First published in 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21): 696-705
Publisher Policy:Reproduced in accordance with the publisher copyright policy
