Abstract
A Contrastive Hypergraph Transformer (CHT) model for session-based recommendation was proposed to address the noise interference and sample sparsity inherent in session-based recommendation. Firstly, the session sequence was modeled as a hypergraph. Secondly, the global and local context information of items was constructed by the hypergraph transformer. Finally, for global relationship learning, an Item-Level (I-L) encoder and a Session-Level (S-L) encoder were used to capture item embeddings at different levels, an information fusion module fused the item embeddings with reverse position embeddings, and a soft attention module produced the global session representation; for local relationship learning, a weighted line graph convolutional network generated the local session representation. In addition, a contrastive learning paradigm was introduced to maximize the mutual information between the global and local session representations, thereby improving recommendation performance. Experimental results on several real-world datasets show that the recommendation performance of the CHT model is better than that of current mainstream models. Compared with the suboptimal model S2-DHCN (Self-Supervised Hypergraph Convolutional Networks), the proposed model achieves a P@20 of up to 35.61% and an MRR@20 of up to 17.11% on the Tmall dataset, improvements of 13.34% and 13.69% respectively; on the Diginetica dataset, it achieves a P@20 of up to 54.07% and an MRR@20 of up to 18.59%, improvements of 0.76% and 0.43% respectively, verifying the effectiveness of the proposed model.
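The core training signal described above is a contrastive objective that maximizes the mutual information between the global session representation (soft attention over position-aware item embeddings) and the local session representation (from the weighted line graph convolution). Below is a minimal illustrative sketch, not the authors' implementation: it assumes PyTorch, a hidden size dim, an InfoNCE-style loss as the mutual-information surrogate, and simple linear layers for the fusion and attention modules; the names SoftAttentionReadout and contrastive_loss are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftAttentionReadout(nn.Module):
    """Illustrative sketch (not the paper's code) of the global readout: item embeddings
    are fused with reverse position embeddings and aggregated into a global session
    representation by soft attention."""

    def __init__(self, dim: int, max_len: int = 50):
        super().__init__()
        self.pos_emb = nn.Embedding(max_len, dim)   # reverse position embeddings (assumption: learned)
        self.fuse = nn.Linear(2 * dim, dim)         # information fusion module (assumption: linear fusion)
        self.w1 = nn.Linear(dim, dim)
        self.w2 = nn.Linear(dim, dim)
        self.q = nn.Linear(dim, 1, bias=False)      # soft attention scorer

    def forward(self, item_emb: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # item_emb: (batch, seq_len, dim); mask: (batch, seq_len), 1 for real items, 0 for padding
        seq_len = item_emb.size(1)
        maskf = mask.float()
        lengths = maskf.sum(dim=1, keepdim=True)                        # (batch, 1)
        positions = torch.arange(seq_len, device=item_emb.device).unsqueeze(0)
        rev_pos = (lengths.long() - 1 - positions).clamp(min=0)         # last click gets position 0
        fused = torch.tanh(self.fuse(torch.cat([item_emb, self.pos_emb(rev_pos)], dim=-1)))
        mean = (item_emb * maskf.unsqueeze(-1)).sum(1) / lengths.clamp(min=1.0)     # session context
        alpha = self.q(torch.sigmoid(self.w1(fused) + self.w2(mean).unsqueeze(1)))  # (batch, seq_len, 1)
        alpha = alpha * maskf.unsqueeze(-1)                             # zero out padded positions
        return (alpha * fused).sum(dim=1)                               # global session representation


def contrastive_loss(global_repr: torch.Tensor, local_repr: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
    """InfoNCE-style loss: the global and local representations of the same session form a
    positive pair, other sessions in the batch serve as negatives; minimizing this loss
    maximizes a lower bound on their mutual information."""
    z_g = F.normalize(global_repr, dim=-1)
    z_l = F.normalize(local_repr, dim=-1)
    logits = z_g @ z_l.t() / tau                           # (batch, batch) cosine-similarity logits
    labels = torch.arange(z_g.size(0), device=z_g.device)  # diagonal entries are the positives
    return F.cross_entropy(logits, labels)
```

The sketch uses in-batch negatives for the contrastive term; the paper's actual negative sampling strategy, hypergraph transformer layers, and weighted line graph convolution are not reproduced here.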
Authors
DANG Weichao; CHENG Bingyang; GAO Gaimei; LIU Chunxia (College of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan, Shanxi 030024, China)
Source
《计算机应用》 (Journal of Computer Applications)
CSCD
Peking University Core Journal (北大核心)
2023, No. 12, pp. 3683-3688 (6 pages)
Funding
Doctoral Scientific Research Start-up Fund of Taiyuan University of Science and Technology (20202063)
Postgraduate Education Innovation Project of Taiyuan University of Science and Technology (SY2022063).
Keywords
session-based recommendation
hypergraph transformer
contrastive learning
attention mechanism