Abstract
Matrix factorization is widely used in personalized recommendation because of its strong rating-prediction ability, and many models have been built on top of matrix factorization to improve recommendation performance. However, these models' limited ability to capture users' preference information leads to unsatisfactory recommendations. To fully mine users' preference information, a deep hierarchical attention matrix factorization (DeepHAMF) recommendation model is proposed. First, besides feeding the raw input into a multi-layer perceptron, the input is also encoded with a self-attention mechanism and then fed into the multi-layer perceptron, with the aim of capturing explicit preference information; this part is called the self-attention layer. Second, the results of the original matrix factorization and of the matrix factorization after attention encoding are each fused with the output of the multi-layer perceptron through an attention mechanism, so that users' latent preference information can be fully mined; this part is called the hierarchical attention module. Finally, a residual network fits together the information from the hierarchical attention module and the self-attention layer; this part is called the residual fusion layer. Experimental results on public rating datasets show that DeepHAMF outperforms existing rating-prediction models.
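The abstract only outlines the architecture, so a minimal PyTorch sketch of how the three named components might fit together is given below. Everything in it is an assumption inferred from the abstract (the module wiring, the use of nn.MultiheadAttention for the self-attention layer, the embedding sizes, and the additive residual fusion); it is an illustration, not the authors' implementation.

# Hypothetical sketch of DeepHAMF as described in the abstract; all names,
# dimensions, and wiring details are assumptions, not the paper's code.
import torch
import torch.nn as nn

class DeepHAMF(nn.Module):
    def __init__(self, n_users, n_items, embed_dim=64, n_heads=4):
        super().__init__()
        # Matrix-factorization embeddings for users and items.
        self.user_emb = nn.Embedding(n_users, embed_dim)
        self.item_emb = nn.Embedding(n_items, embed_dim)
        # Self-attention layer: encodes the (user, item) pair before the MLP.
        self.self_attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)
        # Shared MLP applied to both the raw and the attention-encoded input.
        self.mlp = nn.Sequential(
            nn.Linear(2 * embed_dim, embed_dim),
            nn.ReLU(),
            nn.Linear(embed_dim, embed_dim),
        )
        # Hierarchical attention module: learns weights for fusing the MF
        # interaction vectors with the MLP outputs.
        self.fuse_attn = nn.Linear(embed_dim, 1)
        # Prediction head of the residual fusion layer.
        self.out = nn.Linear(embed_dim, 1)

    def _attend(self, branches):
        # branches: (batch, n_branches, embed_dim); softmax-weighted fusion.
        scores = torch.softmax(self.fuse_attn(branches), dim=1)
        return (scores * branches).sum(dim=1)

    def forward(self, user_ids, item_ids):
        u = self.user_emb(user_ids)         # (batch, embed_dim)
        v = self.item_emb(item_ids)         # (batch, embed_dim)
        pair = torch.stack([u, v], dim=1)   # (batch, 2, embed_dim)

        # Self-attention layer: encode the pair, then flatten for the MLP.
        enc, _ = self.self_attn(pair, pair, pair)
        mlp_raw = self.mlp(pair.flatten(1))  # MLP on the raw input
        mlp_enc = self.mlp(enc.flatten(1))   # MLP on the encoded input

        # Matrix-factorization interactions, raw and attention-encoded.
        mf_raw = u * v
        mf_enc = enc[:, 0] * enc[:, 1]

        # Hierarchical attention module: fuse each MF result with the
        # corresponding MLP output via the attention weights.
        h_raw = self._attend(torch.stack([mf_raw, mlp_raw], dim=1))
        h_enc = self._attend(torch.stack([mf_enc, mlp_enc], dim=1))

        # Residual fusion layer: additive residual combination of the
        # hierarchical-attention branches with the self-attention branch.
        fused = h_raw + h_enc + mlp_enc
        return self.out(fused).squeeze(-1)   # predicted rating

A forward pass takes batches of user and item indices and returns predicted ratings, e.g. model(torch.tensor([0, 5]), torch.tensor([10, 42])) for a DeepHAMF(n_users=1000, n_items=1700) instance.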
Authors
LI Jian-hong, SU Xiao-qian, WU Cai-hong (School of Artificial Intelligence, Anhui University of Science and Technology, Huainan 232001, China; School of Safety Science and Engineering, Anhui University of Science and Technology, Huainan 232001, China)
Source
Computer Engineering & Science
Indexed in: CSCD; Peking University Core Journals
2023, No. 1, pp. 28-36 (9 pages)
Funding
Huainan Municipal Guiding Science and Technology Plan Project (2021003)
Key Project of Anhui University of Science and Technology (xjzd2020-15)
Keywords
hierarchical attention
self-attention network
residual fusion
matrix factorization