Abstract
Semantic matching is a core task in question answering, providing technical support for QA systems, information retrieval, and related fields. For this particular classification problem, neural networks currently rely mainly on cross-entropy or contrastive loss functions, whose loose classification constraints lead to errors near the decision boundary. To address this problem, this paper builds on an existing Siamese neural network and introduces the AM-Softmax loss function to improve model accuracy. In addition, on top of the existing word vectors and character vectors used as network input, an Attention mechanism is introduced so that the model captures more textual information. Experimental results show that, compared with previous deep learning models, the performance of the proposed model is further improved.
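The AM-Softmax loss named in the abstract subtracts an additive margin m from the target-class cosine similarity before scaling by a factor s, which tightens the decision boundary compared with plain softmax cross-entropy. A minimal single-sample sketch follows; the function name and the default values s = 30, m = 0.35 are illustrative assumptions, not taken from the paper:

```python
import math

def am_softmax_loss(cosines, label, scale=30.0, margin=0.35):
    """AM-Softmax (additive margin softmax) loss for one sample.

    cosines: cosine similarities between the input embedding and each
             class weight vector (values in [-1, 1]).
    label:   index of the true class.
    The margin is subtracted from the target cosine *before* scaling,
    forcing the target similarity to exceed the others by at least m.
    """
    # Apply the additive margin to the target class only, then scale.
    logits = [scale * (c - margin) if j == label else scale * c
              for j, c in enumerate(cosines)]
    # Numerically stable log-sum-exp for softmax cross-entropy.
    top = max(logits)
    log_sum = top + math.log(sum(math.exp(z - top) for z in logits))
    return log_sum - logits[label]
```

With margin = 0 this reduces to ordinary scaled softmax cross-entropy; a positive margin yields a strictly larger loss for the same cosines, which is what pushes same-class pairs closer and different-class pairs apart during training.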
Authors
于碧辉
王加存
YU Bi-hui; WANG Jia-cun (University of Chinese Academy of Sciences, Beijing 100049, China; Shenyang Institute of Computing Technology, Chinese Academy of Sciences, Shenyang 110168, China)
Source
《小型微型计算机系统》
CSCD
Peking University Core Journals (北大核心)
2021, No. 2, pp. 231-234 (4 pages)
Journal of Chinese Computer Systems