
Machine reading comprehension model based on multi-attention mechanism
Abstract: To address the problems of long-distance dependency, single-source feature extraction, and information redundancy in previous machine reading comprehension models, an improved model is proposed based on the bi-directional attention flow network (BiDAF). A self-attention mechanism is introduced to capture the internal features of a sequence and to further fuse question and document information, resolving the long-distance dependency problem. Positional information is introduced to extract document features from multiple aspects. A cosine-similarity method is used to adjust the document embedding vectors, alleviating information redundancy. The model is validated on the SQuAD dataset. Experimental results show that, compared with the original baseline BiDAF, the improved model raises both the exact-match and fuzzy-match metrics, verifying its effectiveness.
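The paper's implementation is not reproduced on this page, but the two additions named in the abstract can be illustrated roughly. Below is a minimal NumPy sketch (all shapes and the exact weighting scheme are assumptions, not the authors' code): scaled dot-product self-attention over a document encoding, plus a hypothetical cosine-similarity reweighting of document token embeddings against a question vector.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H):
    """Scaled dot-product self-attention over a (seq_len, d) encoding.

    Every position attends directly to every other position, so distant
    tokens interact in one step (the long-distance-dependency fix)."""
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)          # (seq_len, seq_len)
    return softmax(scores, axis=-1) @ H    # (seq_len, d)

def cosine_reweight(D, q):
    """Scale each document token embedding by its cosine similarity to the
    question vector q, damping redundant or irrelevant tokens.
    (Hypothetical scheme; the paper's exact adjustment may differ.)"""
    sim = (D @ q) / (np.linalg.norm(D, axis=1) * np.linalg.norm(q) + 1e-8)
    return D * sim[:, None]

# Toy shapes: 5-token document, embedding dimension 4.
rng = np.random.default_rng(0)
D = rng.normal(size=(5, 4))
q = rng.normal(size=4)
out = self_attention(cosine_reweight(D, q))
print(out.shape)  # (5, 4)
```

In BiDAF-style models these two operations would feed the fused question-document representation into the modeling and answer-prediction layers; the sketch only shows the attention and reweighting steps themselves.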
Authors: LIANG Yan; ZHANG Wen-pu; LIU Chao; ZHU Qing (School of Communication and Information Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; Key Laboratory of Signal and Information Processing of Chongqing, Chongqing University of Posts and Telecommunications, Chongqing 400065, China)
Source: Computer Engineering and Design (《计算机工程与设计》, Peking University Core Journal), 2023, No. 6, pp. 1907-1913 (7 pages)
Funding: National Natural Science Foundation of China (61702066); Key Science and Technology Research Project of the Chongqing Municipal Education Commission (KJZD-M201900601).
Keywords: deep learning; machine reading comprehension; recurrent neural network; feature extraction; attention mechanism; cosine similarity; answer prediction