Contrastive graph prompt-tuning for cross-domain recommendation

Yi, Z., Ounis, I. and Macdonald, C. (2024) Contrastive graph prompt-tuning for cross-domain recommendation. ACM Transactions on Information Systems, 42(2), 60. (doi: 10.1145/3618298)


Abstract

Recommender systems commonly suffer from the long-standing data sparsity problem, where insufficient user-item interaction data limits the systems' ability to make accurate recommendations. This problem can be alleviated using cross-domain recommendation techniques. In particular, in a cross-domain setting, knowledge sharing between domains permits improved effectiveness on the target domain. While recent cross-domain recommendation techniques have used a pre-training configuration, we argue that such techniques lead to low fine-tuning efficiency, especially when using large neural models. In recent language models, prompts have been used for parameter-efficient and time-efficient tuning of the models on downstream tasks: these prompts are tunable latent vectors that permit freezing the rest of the language model's parameters. To address the cross-domain recommendation task in an efficient manner, we propose a novel Personalised Graph Prompt-based Recommendation (PGPRec) framework, which leverages the efficiency benefits of prompt-tuning. In this framework, we develop personalised, item-wise graph prompts based on items relevant to those the user has interacted with. In particular, we apply Contrastive Learning (CL) to generate the pre-trained embeddings, to increase generalisability in the pre-training stage and to ensure an effective prompt-tuning stage. To evaluate the effectiveness of our PGPRec framework in a cross-domain setting, we conduct an extensive evaluation on the top-k recommendation task and perform a cold-start analysis. The obtained empirical results on four Amazon Review datasets show that our proposed PGPRec framework can reduce the number of tuned parameters by up to 74% while maintaining competitive performance, and achieves an 11.41% performance improvement over the strongest baseline in a cold-start scenario.
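The parameter-efficiency argument in the abstract rests on the prompt-tuning mechanism: the pre-trained embeddings stay frozen, and only a small prompt vector is updated for the downstream task. The following is a minimal sketch of that idea in plain Python; the vectors, sizes, learning rate, and objective are illustrative assumptions, not the paper's actual PGPRec model.

```python
# Hedged sketch of prompt-tuning: the pre-trained item embedding is frozen
# and never updated; only the small prompt vector is tuned by gradient
# descent so that the prompted embedding (item + prompt) moves toward a
# target user vector. All values here are hypothetical illustrations.

frozen_item = [0.5, -0.2, 0.1]   # pre-trained embedding, kept frozen
target_user = [0.9, 0.1, 0.3]    # downstream objective: item + prompt ~ user
prompt = [0.0, 0.0, 0.0]         # the ONLY tunable parameters

lr = 0.1
for _ in range(200):
    # squared-error loss; the gradient is taken w.r.t. the prompt only,
    # so the frozen embedding receives no updates
    grads = [2 * (frozen_item[i] + prompt[i] - target_user[i])
             for i in range(3)]
    prompt = [prompt[i] - lr * grads[i] for i in range(3)]

prompted = [frozen_item[i] + prompt[i] for i in range(3)]
```

Because only the prompt's few parameters are touched, the number of tuned parameters is a small fraction of the full model's, which is the source of the efficiency gain the abstract reports.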

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Macdonald, Professor Craig and Ounis, Professor Iadh and Yi, Zixuan
Authors: Yi, Z., Ounis, I., and Macdonald, C.
College/School: College of Science and Engineering
College of Science and Engineering > School of Computing Science
Journal Name: ACM Transactions on Information Systems
Publisher: ACM
ISSN: 1046-8188
ISSN (Online): 1558-2868
