Abstract
For the marketing news classification and recognition task, the long short-term memory network (LSTM) and convolutional neural network (CNN) used by traditional methods suffer from low classification recognition rates. Therefore, a model that fuses CNN with an attention-enhanced LSTM (LSTMAttention) is proposed to improve the ability to identify and classify marketing news. First, word2vec is used to obtain the matrix formed by the word vectors of the marketing news text, which is fed separately into the traditional machine learning classification models. On this basis, model fusion is applied to combine the single models with the better classification performance, and the classification results of the fused model and the single models are obtained and compared. The experimental results show that after the attention mechanism is introduced into the basic LSTM model, the accuracy, recall and F1 value reach 67.01%, 66.07% and 0.680 respectively, and after CNN and LSTMAttention are fused, they further reach 68.29%, 71.27% and 0.692. This shows that the neural network model fusing CNN and LSTMAttention achieves a better final classification effect than a single model and can improve the classification and recognition of marketing news text.
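The abstract outlines a two-branch architecture: word2vec word-vector matrices are processed by a CNN branch and by an attention-equipped LSTM branch, and the two are fused for classification. Below is a minimal PyTorch sketch of such a model; it is not the authors' implementation, the paper's exact fusion strategy is not specified in the abstract, and all hyper-parameters (embedding size, filter widths, hidden units, number of classes) are assumptions chosen for illustration.

```python
# Hedged sketch of a CNN + attention-LSTM fusion text classifier.
# Not the authors' code; the feature-level fusion and all hyper-parameters
# are assumptions made for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_classes=2,
                 num_filters=100, kernel_sizes=(3, 4, 5), hidden_dim=128):
        super().__init__()
        # In the paper the word vectors come from word2vec, so this layer
        # would normally be initialised with the pretrained vectors.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # CNN branch: parallel 1-D convolutions over the word-vector matrix.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes])
        # LSTM branch with a simple additive attention over the time steps.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)
        fused_dim = num_filters * len(kernel_sizes) + 2 * hidden_dim
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, x):                         # x: (batch, seq_len) token ids
        emb = self.embedding(x)                   # (batch, seq_len, embed_dim)
        # CNN branch: convolution + max-over-time pooling per kernel size.
        c = emb.transpose(1, 2)                   # (batch, embed_dim, seq_len)
        cnn_feats = torch.cat(
            [F.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1)
        # LSTM branch: attention-weighted sum of the hidden states.
        h, _ = self.lstm(emb)                     # (batch, seq_len, 2*hidden_dim)
        weights = F.softmax(self.attn(h), dim=1)  # attention over time steps
        lstm_feats = (weights * h).sum(dim=1)     # (batch, 2*hidden_dim)
        # Fuse the two branches and classify.
        fused = torch.cat([cnn_feats, lstm_feats], dim=1)
        return self.classifier(fused)

# Smoke test with random token ids: 8 documents of 50 tokens each.
model = CNNLSTMAttention(vocab_size=5000)
logits = model(torch.randint(0, 5000, (8, 50)))
print(logits.shape)  # torch.Size([8, 2])
```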
Authors
LIU Gao-jun; WANG Xiao-bin (School of Information, North China University of Technology, Beijing 100144, China)
Source
Computer Technology and Development
2020, No. 11, pp. 59-63 (5 pages)
Fund
National Natural Science Foundation of China (61672040)
Key Laboratory of Science, Technology and Standards in Press and Publication Industry project (4020548418X8).
Keywords
marketing news
text classification
convolutional neural network
attention mechanism
long short-term memory neural network