Journal Article

Sequence Recommendation Method Based on RoBERTa and Graph-Enhanced Transformer (cited by 2)
Abstract: Since the emergence of recommender systems, the further development of recommendation algorithms has been constrained by limited data. To reduce the impact of data sparsity and improve the utilization of non-rating data, text-based recommendation models built on neural networks have been proposed in succession; however, mainstream convolutional and recurrent neural networks have clear disadvantages in text semantic understanding and in capturing long-distance relationships. To better mine the deep latent features between users and items and further improve recommendation quality, a sequence recommendation model based on RoBERTa and a Graph-enhanced Transformer (RGT) is proposed. The model incorporates review text data: it first uses a pre-trained RoBERTa model to capture word-level semantic features in the review text and preliminarily model the user's personalized interests. Then, based on the historical interactions between users and items, it constructs an item-association graph attention network with temporal characteristics. Using the graph-enhanced Transformer approach, the item feature representations learned by the graph model are fed as a sequence into the Transformer encoding layer. Finally, the resulting output vectors, together with the previously captured semantic representations and the computed whole-graph representation of the item association graph, are fed into a fully connected layer to capture the user's global interest preferences and produce the predicted rating of an item. Experimental results on three real public Amazon datasets show that, compared with classical text recommendation models such as DeepFM and ConvMF, the RGT model achieves significant improvements in both Root Mean Square Error (RMSE) and Mean Absolute Error (MAE), with maximum improvements of 4.7% and 5.3%, respectively, over the best baseline.
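The abstract evaluates rating prediction with RMSE and MAE. For reference only (this is not code from the paper), these two metrics can be computed from predicted and true ratings as follows:

```python
import math

def rmse(preds, targets):
    """Root Mean Square Error between predicted and true ratings."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds))

def mae(preds, targets):
    """Mean Absolute Error between predicted and true ratings."""
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)
```

Both metrics are error measures, so the reported "improvements" of 4.7% and 5.3% correspond to reductions relative to the best baseline model.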
Authors: 王明虎 (WANG Minghu); 石智奎 (SHI Zhikui); 苏佳 (SU Jia); 张新生 (ZHANG Xinsheng) — College of Management, Xi'an University of Architecture and Technology, Xi'an 710055, Shaanxi, China
Source: Computer Engineering (《计算机工程》), 2024, Issue 4, pp. 121-131 (11 pages); indexed in CAS, CSCD, and the Peking University Core Journal list
Funding: Key Industrial Innovation Chain (Cluster) Project in the Industrial Field, Science and Technology Department of Shaanxi Province (2022ZDLGY16-04); Key Philosophy and Social Science Project, Education Department of Shaanxi Province (21JZ035); Natural Science Special Project, Xi'an University of Architecture and Technology (1609720032)
Keywords: recommendation algorithm; review text; RoBERTa model; graph attention mechanism; Transformer mechanism
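The keywords list a graph attention mechanism over the item-association graph. As a hedged illustration only (not the authors' implementation; the function and all parameter names here are hypothetical), a minimal single-head GAT-style aggregation in NumPy looks like:

```python
import numpy as np

def graph_attention(h, adj, W, a):
    """Single-head graph-attention aggregation (GAT-style sketch).
    h:   (N, F) node (item) features
    adj: (N, N) 0/1 adjacency matrix, self-loops included
    W:   (F, Fp) shared linear projection
    a:   (2*Fp,) attention weight vector
    Returns (N, Fp) aggregated node representations."""
    z = h @ W                                    # project node features
    n = z.shape[0]
    e = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # attention logit e_ij = LeakyReLU(a^T [z_i || z_j])
            s = np.concatenate([z[i], z[j]]) @ a
            e[i, j] = s if s > 0 else 0.2 * s    # LeakyReLU, slope 0.2
    e = np.where(adj > 0, e, -1e9)               # mask non-neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)    # softmax over neighbors
    return alpha @ z                             # weighted neighbor sum
```

In the model described by the abstract, representations like these would be produced per item and then fed as a sequence into the Transformer encoding layer; the sequential wiring itself is not shown here.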