Abstract: In recent years, end-to-end models have been widely used in machine comprehension (MC) and question answering (QA). These models typically combine a recurrent neural network (RNN) or convolutional neural network (CNN) with an attention mechanism to improve accuracy. However, a single attention mechanism does not fully capture the meaning of the text. In this paper, the recurrent neural network is replaced with a convolutional neural network to process the text, and a superimposed attention mechanism is proposed; the model is constructed by combining the two. The resulting model achieves good results on the Stanford Question Answering Dataset (SQuAD).
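The idea described above, a CNN encoder in place of an RNN with attention applied more than once, can be sketched roughly as follows. This is a minimal hypothetical illustration, not the paper's actual model: the convolution weights, dimensions, and the reading of "superimposed" as stacked attention passes are all assumptions for the sake of the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conv1d_encode(x, w):
    # toy 1-D convolution with same padding; x: (seq_len, d_in), w: (k, d_in, d_out)
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    return np.stack([np.tensordot(xp[i:i + k], w, axes=([0, 1], [0, 1]))
                     for i in range(x.shape[0])])

def attention(query, keys):
    # scaled dot-product attention of query tokens over key tokens
    scores = query @ keys.T / np.sqrt(keys.shape[1])
    return softmax(scores, axis=-1) @ keys

rng = np.random.default_rng(0)
context = rng.normal(size=(8, 16))    # toy context token embeddings
question = rng.normal(size=(4, 16))   # toy question token embeddings
w = rng.normal(size=(3, 16, 16)) * 0.1

c = conv1d_encode(context, w)         # CNN encoder replaces the RNN
q = conv1d_encode(question, w)

a1 = attention(c, q)                  # first attention pass: context attends to question
a2 = attention(a1, q)                 # second, superimposed attention pass
print(a2.shape)                       # → (8, 16)
```

Stacking the second attention pass over the output of the first is one plausible way to let the model refine what a single attention layer captures, which is the limitation the abstract points to.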