Abstract
To address mismatched response details and semantic confusion in the retrieval-based multi-turn dialogue Deep Attention Matching Network (DAM), this paper proposes a Chinese question-answer matching method that improves the DAM model with multi-head attention and a Bi-directional Long Short-Term Memory network (BiLSTM). The multi-head attention mechanism enables the model to handle longer multi-turn dialogues and to better capture the matching relationship between a candidate response and its context. In addition, a BiLSTM is applied in the feature-fusion stage to capture sequential dependencies across dialogue turns, further improving the accuracy of candidate-response selection. Experiments on two open datasets, Douban and E-commerce, show that the proposed model outperforms the DAM baseline, improving R10@1 by 1.5% with word-vector enhancement.
To effectively match response details and avoid semantic confusion, this paper proposes to improve the Deep Attention Matching Network (DAM) via multi-head attention and a Bi-directional Long Short-Term Memory (BiLSTM) network. This method can model longer multi-turn dialogues and handle the matching relationship between the candidate response and the context. In addition, the BiLSTM applied in the feature-fusion process improves the accuracy of multi-turn response selection by capturing sequential dependencies. Tested on two public multi-turn response selection datasets, the Douban Conversation Corpus and the E-commerce Dialogue Corpus, our model outperforms the baseline model by 1.5% in R10@1 with word-vector enhancement.
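For readers unfamiliar with the multi-head attention mechanism the abstract builds on, the following is a minimal NumPy sketch of scaled dot-product attention split across heads. It is an illustrative sketch of the general technique, not the paper's implementation: the dimensions are arbitrary and the learned linear projections of Q, K, and V are omitted.

```python
import numpy as np

def multi_head_attention(Q, K, V, num_heads):
    """Scaled dot-product attention over num_heads subspaces.

    Q, K, V: arrays of shape (seq_len, d_model), with d_model
    divisible by num_heads. Learned projections are omitted
    for brevity; this only shows the head-splitting mechanics.
    """
    seq_len, d_model = Q.shape
    d_k = d_model // num_heads

    def split_heads(X):
        # (seq_len, d_model) -> (num_heads, seq_len, d_k)
        return X.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)

    Qh, Kh, Vh = split_heads(Q), split_heads(K), split_heads(V)

    # Attention scores per head: (num_heads, seq_len, seq_len)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_k)

    # Numerically stable softmax over the last axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    # Weighted sum of values, then concatenate the heads back
    out = weights @ Vh                     # (num_heads, seq_len, d_k)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)
```

Splitting the model dimension into several heads lets each head attend to different positions of the (possibly long) dialogue context independently, which is the property the abstract credits for handling longer multi-turn conversations.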
Authors
秦汉忠
于重重
姜伟杰
赵霞
QIN Hanzhong; YU Chongchong; JIANG Weijie; ZHAO Xia (School of Artificial Intelligence, Beijing Technology and Business University, Beijing 100048, China)
Source
《中文信息学报》
CSCD
Peking University Core Journals (北大核心)
2021, No. 11, pp. 118-126 (9 pages)
Journal of Chinese Information Processing
Funding
Humanities and Social Sciences Research Planning Fund of the Ministry of Education of China (16YJAZH072)
National Social Science Fund of China (14ZDB156)
Keywords
retrieval-based multi-turn dialogue
multi-turn response selection
deep attention matching (DAM)
multi-head attention
bi-directional long short-term memory (BiLSTM)