Abstract
A new recommendation model is proposed to address two problems that arise when rating information alone drives recommendation: data sparsity and the inability to learn deep semantic information. An implicit-feedback rating matrix serves as the raw input to a deep autoencoder, whose encoding and decoding operations learn rating features. A user-movie genre matrix is the input to the model's embedding layer; after a flatten layer and fully connected layers, genre-text features are learned. Meanwhile, a BERT+BiLSTM structure extracts and learns contextual features from movie-title text. The three feature sets are fused and passed through an autoencoder to obtain the predicted rating. Experiments use MovieLens 1M and MovieLens 100K as datasets, mean absolute error (MAE) and mean squared error (MSE) as evaluation metrics, and SVD, PMF, PMMMF, SCC, RMbDn, and Hern as comparison models. The results show that the proposed model reduces MAE to 0.0458 and 0.0460 and MSE to 0.0273 and 0.0390 on the two datasets, respectively, outperforming the comparison algorithms and delivering a clear performance improvement.
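The three-branch pipeline the abstract describes (rating autoencoder, genre embedding + flatten + dense, title encoder, then fusion and decoding) can be sketched numerically. The sketch below is illustrative only: all dimensions and weights are assumed, the weights are random and untrained, and a precomputed vector stands in for the BERT+BiLSTM title encoder; it shows the data flow, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_genres, d, d_title = 100, 18, 32, 768  # assumed illustrative sizes

# Branch 1: autoencoder-style encoder over a user's implicit-feedback rating vector.
def encode_ratings(r, W_enc):
    return np.tanh(r @ W_enc)

# Branch 2: genre indicator -> embedding lookup -> flatten -> fully connected layer.
def encode_genres(g, E, W_fc):
    return np.tanh((g @ E) @ W_fc)

# Branch 3: stand-in for the BERT+BiLSTM title encoder (a precomputed vector here).
def encode_title(t, W_t):
    return np.tanh(t @ W_t)

# Fuse the three feature vectors and decode them into predicted ratings.
def predict(r, g, t, params):
    W_enc, E, W_fc, W_t, W_dec = params
    z = np.concatenate([encode_ratings(r, W_enc),
                        encode_genres(g, E, W_fc),
                        encode_title(t, W_t)])
    return z @ W_dec

# Random untrained parameters, shaped to match the branches above.
params = (0.1 * rng.normal(size=(n_items, d)),
          0.1 * rng.normal(size=(n_genres, d)),
          0.1 * rng.normal(size=(d, d)),
          0.1 * rng.normal(size=(d_title, d)),
          0.1 * rng.normal(size=(3 * d, n_items)))

r = (rng.random(n_items) > 0.9).astype(float)   # sparse implicit feedback
g = np.zeros(n_genres); g[[2, 5]] = 1.0         # a movie tagged with two genres
t = rng.normal(size=d_title)                    # stand-in title embedding

pred = predict(r, g, t, params)
print(pred.shape)
```

The fusion step is plain concatenation followed by a linear decode; in the paper the fused vector instead passes through a trained autoencoder, and each branch is learned jointly rather than fixed.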
Authors
CHEN Jinguang
XU Xinyi
FAN Ganglong
CHEN Jinguang; XU Xinyi; FAN Ganglong (School of Computer Science, Xi'an Polytechnic University, Xi'an 710048, China; Henan Key Laboratory for Big Data Processing & Analytics of Electronic Commerce, Luoyang 471934, Henan, China; Electronic Commerce College, Luoyang Normal University, Luoyang 471934, Henan, China)
Source
《西安工程大学学报》
2021, No. 5, pp. 100-106 (7 pages)
Journal of Xi’an Polytechnic University
Funding
Scientific Research Program of the Shaanxi Provincial Department of Education (21JP049)
Open Fund of the Henan Key Laboratory for Big Data Processing & Analytics of Electronic Commerce (2020-KF-7)