
News Recommendation Model Based on Transformer and Heterogeneous Graph Neural Network (cited by 3)
Abstract: Most existing news recommendation models assume strong temporal dependence among the news items a user browses. However, because news updates rapidly and users read freely, temporal modeling may introduce noise. To address this problem, a news recommendation model based on Transformer and heterogeneous graph neural network is proposed. Unlike time-series neural models, a Transformer is employed to model users' short-term interests from their recent reading history. Heterogeneous graph neural networks capture the high-order relations between users and news to model users' long-term interests and candidate news representations. Meanwhile, a long- and short-term interest-aware click prediction mechanism is designed to adaptively adjust the importance of users' long-term and short-term interests in user modeling. Comparative experiments on a real-world dataset demonstrate the effectiveness of the proposed model.
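The paper's code is not reproduced here; as an illustration only, the long- and short-term interest-aware click prediction described in the abstract could be sketched as a learned sigmoid gate that fuses the two interest vectors before a dot-product score. All variable names, the gating form, and the random placeholder embeddings below are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical pre-computed representations:
# u_short — short-term interest (e.g., output of a Transformer over recent clicks)
# u_long  — long-term interest (e.g., output of a heterogeneous GNN)
# v_news  — candidate news representation
u_short = rng.standard_normal(d)
u_long = rng.standard_normal(d)
v_news = rng.standard_normal(d)

# Gate parameters (learned in practice; random here for illustration)
W_g = rng.standard_normal(2 * d)
b_g = 0.0

# Adaptive gate: how much weight the long-term interest receives for this user
g = sigmoid(W_g @ np.concatenate([u_long, u_short]) + b_g)

# Fused user representation and dot-product click probability
u = g * u_long + (1.0 - g) * u_short
score = sigmoid(u @ v_news)  # click probability in (0, 1)
```

In a trained model the gate would let users whose behavior is dominated by recent sessions lean on `u_short`, while users with stable preferences lean on `u_long`.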
Authors: ZHANG Yupeng; LI Xiangju; LI Chao; ZHAO Zhongying (College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590; College of Electronic and Information Engineering, Shandong University of Science and Technology, Qingdao 266590)
Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》), EI / CSCD / Peking University Core, 2022, No. 9, pp. 839-848 (10 pages)
Funding: Supported by the National Natural Science Foundation of China (Nos. 62072288, 61702306).
Keywords: News Recommendation; Heterogeneous Graph Neural Network; Attention Mechanism; Transformer
