Abstract
Recurrent neural networks (RNNs) are widely used in natural language processing tasks involving time sequences. Owing to the sequential nature of an RNN, all inputs must be read by the network, so inference slows down linearly with input length. To address this problem, this work proposes a long short-term memory (LSTM) model with a skimming module that reads answer-relevant content quickly at inference time, skipping irrelevant words within important sentences as well as unimportant parts of the text. The model applies to a range of text tasks, including text classification, sentiment analysis, and reading comprehension. To verify that it matches the accuracy of a standard LSTM, five fast-reading models of the same type were chosen for experimental comparison. The results show that the proposed model performs fewer floating-point operations (FLOPs) and completes the reading and inference process faster.
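The skim-or-read mechanism described in the abstract can be sketched roughly as follows. This is a minimal illustration in the spirit of skim-RNN-style models, not the authors' actual implementation; all names, sizes, and the hard argmax decision rule are assumptions. The idea is that a small gating network decides per token whether to run the full LSTM cell or a much cheaper "skim" update that touches only a small slice of the hidden state, which is where the FLOP savings come from.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SkimLSTM:
    """Hypothetical skim-style LSTM: a tiny decision network chooses,
    token by token, between a full LSTM update and a cheap small update."""

    def __init__(self, d_in, d_hid, d_skim):
        self.d_hid, self.d_skim = d_hid, d_skim
        # Full LSTM parameters (the 4 gates stacked row-wise).
        self.W = rng.standard_normal((4 * d_hid, d_in + d_hid)) * 0.1
        self.b = np.zeros(4 * d_hid)
        # Small LSTM used when skimming: updates only the first d_skim units.
        self.Ws = rng.standard_normal((4 * d_skim, d_in + d_skim)) * 0.1
        self.bs = np.zeros(4 * d_skim)
        # Binary skim/read decision network (2 logits: read vs skim).
        self.Wg = rng.standard_normal((2, d_in + d_hid)) * 0.1

    def _cell(self, W, b, x, h, c):
        """One LSTM cell step; works for both the full and the small cell."""
        d = c.shape[0]
        z = W @ np.concatenate([x, h]) + b
        i = sigmoid(z[:d])            # input gate
        f = sigmoid(z[d:2 * d])       # forget gate
        g = np.tanh(z[2 * d:3 * d])   # candidate state
        o = sigmoid(z[3 * d:])        # output gate
        c_new = f * c + i * g
        return o * np.tanh(c_new), c_new

    def forward(self, xs):
        h = np.zeros(self.d_hid)
        c = np.zeros(self.d_hid)
        decisions = []
        for x in xs:
            # Decide: 0 = full read, 1 = skim (hard argmax for illustration).
            logits = self.Wg @ np.concatenate([x, h])
            skim = int(np.argmax(logits))
            decisions.append(skim)
            if skim:
                # Cheap update: only the first d_skim units of h and c change.
                hs, cs = self._cell(self.Ws, self.bs, x,
                                    h[:self.d_skim], c[:self.d_skim])
                h = np.concatenate([hs, h[self.d_skim:]])
                c = np.concatenate([cs, c[self.d_skim:]])
            else:
                h, c = self._cell(self.W, self.b, x, h, c)
        return h, decisions
```

At inference, the per-token `decisions` trace records which tokens were skimmed; in a trained model the gating network would learn to skim unimportant tokens, so most steps run the small cell and the FLOP count drops well below that of a standard LSTM over the same sequence.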
Authors
HOU Lei; MENG Huiming; LI Xu; LAN Zhenping (School of Information Science and Engineering, Dalian Polytechnic University, Dalian 116034, China)
Source
Journal of Dalian Polytechnic University (indexed in CAS; Peking University Core Journals), 2022, No. 2, pp. 136-141 (6 pages)
Funding
Scientific Research Project of the Department of Education of Liaoning Province (J2020113).