Abstract
In recent years, end-to-end models have been widely used in machine comprehension (MC) and question answering (QA). These models typically combine a recurrent neural network (RNN) or a convolutional neural network (CNN) with an attention mechanism to improve accuracy. However, a single attention mechanism does not fully capture the meaning of the text. In this paper, the recurrent neural network is replaced with a convolutional neural network for text processing, and a superimposed attention mechanism is proposed. A model is constructed by combining the convolutional neural network with the superimposed attention mechanism, and experiments show that it achieves good results on the Stanford Question Answering Dataset (SQuAD).
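The abstract describes encoding text with a CNN (rather than an RNN) and then applying several attention layers on top of one another. The following is a minimal, illustrative sketch of that general idea, not the authors' actual model; all module names, layer sizes, and the fusion scheme are assumptions made for illustration.

```python
# Minimal sketch (assumed, not the paper's implementation): a 1-D CNN encodes
# the passage and question, and attention layers are stacked ("superimposed")
# over the CNN features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNEncoder(nn.Module):
    """Encode a token-embedding sequence with a 1-D convolution instead of an RNN."""
    def __init__(self, emb_dim, hidden_dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, hidden_dim, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                  # x: (batch, seq_len, emb_dim)
        h = self.conv(x.transpose(1, 2))   # (batch, hidden_dim, seq_len)
        return F.relu(h).transpose(1, 2)   # (batch, seq_len, hidden_dim)

class StackedAttention(nn.Module):
    """Apply passage-to-question attention repeatedly, feeding each result back in."""
    def __init__(self, hidden_dim, num_layers=2):
        super().__init__()
        self.fuse = nn.ModuleList(nn.Linear(2 * hidden_dim, hidden_dim) for _ in range(num_layers))

    def forward(self, passage, question):  # passage: (B, P, H), question: (B, Q, H)
        p = passage
        for fuse in self.fuse:
            scores = torch.bmm(p, question.transpose(1, 2))        # (B, P, Q) similarity
            attn = F.softmax(scores, dim=-1)                       # attend over question words
            context = torch.bmm(attn, question)                    # (B, P, H) attended question
            p = torch.tanh(fuse(torch.cat([p, context], dim=-1)))  # fuse and stack again
        return p

# Tiny usage example with random tensors standing in for embedded SQuAD text.
if __name__ == "__main__":
    B, P, Q, E, H = 2, 30, 10, 50, 64
    enc = CNNEncoder(E, H)
    att = StackedAttention(H, num_layers=2)
    passage = enc(torch.randn(B, P, E))
    question = enc(torch.randn(B, Q, E))
    print(att(passage, question).shape)    # torch.Size([2, 30, 64])
```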
Source
Proceedings of the International Conference of Pioneering Computer Scientists, Engineers and Educators (ICPCSEE)
2019, Issue 2, pp. 35-37 (3 pages)