
Text classification method based on the cycle structured convolutional neural network (cited by 14)
Abstract: The text classification performance of existing convolutional neural networks is limited by the length of the word-vector window. Building on the study of convolutional classification methods, this paper proposes a text classification method based on a convolutional neural network with a cycle (recurrent) structure. The method requires only a single forward and a single backward scan of the text, so it captures as much contextual information as possible when learning word representations, and the overall algorithm runs in O(n), i.e. linear, time. The text semantic model it constructs can capture long-distance dependencies, so the word-vector window length no longer affects classification performance and the context is modelled more effectively. Experimental results show that the model achieves an accuracy of 96.86%, a recall of 96.15% and an F1 value of 96.5%, outperforming traditional text classification algorithms and convolutional neural network methods.
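
For concreteness, the sketch below (not the paper's code; all layer names and sizes are illustrative assumptions) shows one way a cycle-structure text classifier of the kind the abstract describes could be put together in PyTorch: a single forward and a single backward scan supply left and right context for every word, the contexts are concatenated with the word embedding, and max-pooling over time replaces a fixed convolution window, keeping the whole pass linear in the sequence length.

```python
# Minimal sketch of a recurrent-structure text classifier in the spirit of the
# abstract; layer sizes and names are assumptions, not the author's settings.
import torch
import torch.nn as nn


class RecurrentConvTextClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, context_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One bidirectional RNN = one forward scan plus one backward scan.
        self.context_rnn = nn.RNN(embed_dim, context_dim,
                                  batch_first=True, bidirectional=True)
        # Combine [left context; right context; word embedding] into a latent
        # semantic representation for each position.
        self.latent = nn.Linear(embed_dim + 2 * context_dim, context_dim)
        self.classifier = nn.Linear(context_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        emb = self.embedding(token_ids)                # (batch, seq_len, embed_dim)
        context, _ = self.context_rnn(emb)             # (batch, seq_len, 2*context_dim)
        combined = torch.cat([context, emb], dim=-1)   # word plus both contexts
        latent = torch.tanh(self.latent(combined))     # (batch, seq_len, context_dim)
        # Max-pooling over time removes any dependence on a window length.
        pooled, _ = latent.max(dim=1)                  # (batch, context_dim)
        return self.classifier(pooled)                 # unnormalised class scores


# Usage example with a toy batch of padded token-id sequences.
model = RecurrentConvTextClassifier(vocab_size=10_000, num_classes=4)
batch = torch.randint(0, 10_000, (8, 50))
logits = model(batch)
print(logits.shape)  # torch.Size([8, 4])
```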
Author: CHEN Bo (陈波), School of Mathematics and Computer Science, Shaanxi University of Technology, Hanzhong 723001, P.R. China
Source: Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition), 2018, No. 5, pp. 705-710 (6 pages); indexed in CSCD and the Peking University core journal list
Fund: National Natural Science Foundation of China (61471133)
Keywords: convolutional neural network; cycle structure; text semantic model; text classification
