
BERT-CNN model for text classification
(Original title: 面向文本分类的BERT-CNN模型; cited by 2)
Abstract: In deep learning, Word2Vec, GloVe, and FastText have become the main ways to obtain word representations for text classification tasks, but the word vectors they produce are static and cannot fully capture semantic information. To address this problem and improve text classification accuracy, a BERT-CNN model is proposed, combining BERT (bidirectional encoder representations from transformers) with a convolutional neural network (CNN). First, the self-attention mechanism in BERT captures the semantic relationships between words; then a CNN extracts text features; finally, a fully connected layer performs binary classification. Experimental results show that, compared with Word2Vec-CNN and GloVe-CNN, BERT-CNN improves accuracy by 10.07% and 7.07% respectively, a significant improvement.
Authors: QIN Quan (秦全), YI Junkai (易军凯) — School of Automation, Beijing Information Science & Technology University, Beijing 100192, China
Source: Journal of Beijing Information Science and Technology University (Natural Science Edition), 2023, No. 2, pp. 69-74 (6 pages)
Funding: National Natural Science Foundation of China (U1636208)
Keywords: text classification; BERT model; convolutional neural network; word vector
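The pipeline described in the abstract (BERT contextual embeddings → convolution over the token sequence → max-over-time pooling → fully connected layer with sigmoid for binary classification) can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the random `embeddings` stand in for the output of a pretrained BERT encoder, and all dimensions, weights, and function names here are assumed for demonstration.

```python
import numpy as np

def conv_max_pool(embeddings, filters):
    """1D convolution over token embeddings followed by max-over-time pooling.

    embeddings: (seq_len, hidden) matrix, standing in for BERT's token outputs.
    filters:    (n_filters, k, hidden) convolution kernels of width k.
    Returns a (n_filters,) feature vector.
    """
    n_filters, k, hidden = filters.shape
    seq_len = embeddings.shape[0]
    feats = np.empty(n_filters)
    for f in range(n_filters):
        # Slide the width-k filter along the sequence, apply ReLU,
        # then keep the maximum activation (max-over-time pooling).
        acts = [np.sum(embeddings[i:i + k] * filters[f])
                for i in range(seq_len - k + 1)]
        feats[f] = max(0.0, max(acts))
    return feats

def bert_cnn_head(embeddings, filters, w, b):
    """Fully connected layer + sigmoid over the pooled CNN features,
    giving the probability of the positive class (binary classification)."""
    feats = conv_max_pool(embeddings, filters)
    logit = feats @ w + b
    return 1.0 / (1.0 + np.exp(-logit))

# Demo with random stand-ins for the BERT output and learned weights.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(16, 8))   # 16 tokens, hidden size 8
filters = rng.normal(size=(4, 3, 8))    # 4 filters of width 3
w, b = rng.normal(size=4), 0.0
prob = bert_cnn_head(embeddings, filters, w, b)
print(prob)  # a probability in (0, 1)
```

In a real reproduction, `embeddings` would come from the last hidden states of a pretrained BERT model, and `filters`, `w`, `b` would be trained jointly with (or on top of) the encoder rather than drawn at random.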