Paper Title


MAMO: Memory-Augmented Meta-Optimization for Cold-start Recommendation

Paper Authors

Manqing Dong, Feng Yuan, Lina Yao, Xiwei Xu, Liming Zhu

Paper Abstract


A common challenge for most current recommender systems is the cold-start problem: due to the lack of user-item interactions, a fine-tuned recommender system cannot handle new users or new items. Recently, some works have introduced the meta-optimization idea into recommendation scenarios, i.e., predicting a user's preference from only a few past interacted items. The core idea is to learn a globally shared initialization of the parameters for all users and then learn local parameters for each user separately. However, most meta-learning-based recommendation approaches adopt model-agnostic meta-learning (MAML) for parameter initialization, where the globally shared parameters may lead the model into a local optimum for some users. In this paper, we design two memory matrices that store task-specific memories and feature-specific memories. Specifically, the feature-specific memories are used to guide the model with personalized parameter initialization, while the task-specific memories are used to guide the model in quickly predicting the user preference. We adopt a meta-optimization approach to optimize the proposed method. We test the model on two widely used recommendation datasets under four cold-start scenarios. The experimental results show the effectiveness of the proposed method.
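The core mechanism the abstract describes (biasing a globally shared MAML-style initialization with a feature-specific memory read-out, so different user profiles start adaptation from different parameters) can be sketched as follows. This is a minimal illustration only: all names, shapes, and the soft-attention read-out are assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch of memory-augmented parameter initialization.
# M_u holds profile keys and M_feat holds feature-specific memories;
# both are illustrative stand-ins, not the paper's exact design.
rng = np.random.default_rng(0)
n_profiles, dim = 4, 8                         # memory slots, parameter size
theta = rng.normal(size=dim)                   # globally shared init (MAML-style)
M_u = rng.normal(size=(n_profiles, dim))       # profile keys for addressing
M_feat = rng.normal(size=(n_profiles, dim))    # feature-specific memory matrix

def attention(profile, keys):
    """Soft attention of a user-profile embedding over the memory keys."""
    scores = keys @ profile
    e = np.exp(scores - scores.max())          # numerically stable softmax
    return e / e.sum()

def personalized_init(profile):
    """Shift the shared initialization by a weighted read-out of the
    feature-specific memory, yielding a per-user starting point."""
    a = attention(profile, M_u)                # (n_profiles,) attention weights
    return theta - a @ M_feat                  # personalized parameters

# Two different user profiles get two different starting parameters,
# which local (per-user) adaptation would then fine-tune.
theta_u1 = personalized_init(rng.normal(size=dim))
theta_u2 = personalized_init(rng.normal(size=dim))
```

Per-user adaptation would then proceed from `theta_u1` or `theta_u2` with a few gradient steps on that user's interactions, rather than from the single shared `theta`.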
