Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/164144
Title: Memory bank augmented long-tail sequential recommendation
Authors: Hu, Yidan
Liu, Yong
Miao, Chunyan
Miao, Yuan
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Source: Hu, Y., Liu, Y., Miao, C. & Miao, Y. (2022). Memory bank augmented long-tail sequential recommendation. 31st ACM International Conference on Information and Knowledge Management (CIKM 2022), 791-801. https://dx.doi.org/10.1145/3511808.3557391
Project: AISG-GC-2019-003 
Abstract: The goal of sequential recommendation is to predict the next item that a user would like to interact with, by capturing her dynamic historical behaviors. However, most existing sequential recommendation methods do not focus on solving the long-tail item recommendation problem that is caused by the imbalanced distribution of item data. To solve this problem, we propose a novel sequential recommendation framework, named MASR (i.e., Memory bank Augmented long-tail Sequential Recommendation). MASR is an "open-book" model that combines novel types of memory banks and a retriever-copy network to alleviate the long-tail problem. During inference, the designed retriever-copy network retrieves related sequences from the training samples and copies the useful information as a cue to improve the recommendation performance on tail items. Two designed memory banks provide reference samples to the retriever-copy network by memorizing the historical samples appearing in the training phase. Extensive experiments have been performed on five real-world datasets to demonstrate the effectiveness of the proposed MASR model. The experimental results indicate that MASR consistently outperforms baseline methods in terms of recommendation performance on tail items.
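Note: the following is a minimal conceptual sketch of the "retrieve from a memory bank, then copy" idea described in the abstract, not the authors' implementation (the full text is unavailable here). All names (MemoryBank, retrieve_and_copy, alpha, top_k) are hypothetical, and the actual MASR model uses learned retriever and copy networks rather than the simple cosine-similarity blend shown below.

```python
# Conceptual sketch only -- NOT the MASR implementation from the paper.
# It illustrates retrieving similar training sequences from a memory bank
# and "copying" their next items to boost tail-item scores.
import numpy as np

class MemoryBank:
    """Stores embeddings of training sequences and the item each one led to."""
    def __init__(self):
        self.keys = []     # sequence embeddings memorized during training
        self.values = []   # the next (target) item id for each stored sequence

    def add(self, seq_embedding, next_item):
        self.keys.append(seq_embedding)
        self.values.append(next_item)

    def retrieve(self, query, top_k=3):
        """Return the top-k most similar stored sequences (cosine similarity)."""
        keys = np.stack(self.keys)
        sims = keys @ query / (np.linalg.norm(keys, axis=1) * np.linalg.norm(query) + 1e-8)
        idx = np.argsort(-sims)[:top_k]
        return [(self.values[i], sims[i]) for i in idx]

def retrieve_and_copy(base_scores, query_embedding, bank, alpha=0.3, top_k=3):
    """Blend the base recommender's scores with copied evidence: items that
    followed similar training sequences receive an extra boost."""
    copy_scores = np.zeros_like(base_scores)
    for item_id, sim in bank.retrieve(query_embedding, top_k):
        copy_scores[item_id] += max(sim, 0.0)
    if copy_scores.sum() > 0:
        copy_scores /= copy_scores.sum()
    return (1 - alpha) * base_scores + alpha * copy_scores

# Tiny usage example with random data.
rng = np.random.default_rng(0)
n_items, dim = 10, 8
bank = MemoryBank()
for _ in range(50):                      # "memorize" training sequences
    bank.add(rng.normal(size=dim), rng.integers(n_items))

user_seq_embedding = rng.normal(size=dim)
base_scores = rng.random(n_items)
base_scores /= base_scores.sum()
final_scores = retrieve_and_copy(base_scores, user_seq_embedding, bank)
print("recommended item:", int(np.argmax(final_scores)))
```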
URI: https://hdl.handle.net/10356/164144
ISBN: 9781450392365
DOI: 10.1145/3511808.3557391
Rights: © 2022 Association for Computing Machinery. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: SCSE Conference Papers
