Abstract
When extracting contextual semantic features from the word vectors of a text, the BERT-TCN model ignores the sentiment weight of each word at the sentence level and does not account for the differing contributions of individual words to the sentence's overall sentiment. To address this problem, a text sentiment analysis model combining BERT-TCN with an attention mechanism is proposed to further improve classification performance. The model first obtains contextual word vectors for the input text through Bidirectional Encoder Representations from Transformers (BERT), then uses a Temporal Convolutional Network (TCN) to further extract contextual semantic features from these word vectors, introduces an attention mechanism to focus on the sentiment features that matter most in the context, and finally performs sentiment classification with a Softmax classifier. Experiments show that, compared with the BERT-TCN model, the proposed model improves accuracy, precision, recall, and F1-score.
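The following is a minimal sketch of the pipeline described in the abstract (BERT word vectors → TCN → word-level attention → Softmax classifier), assuming PyTorch and the Hugging Face transformers library. The class names, hyperparameters, and pretrained checkpoint are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of BERT + TCN + attention for sentiment classification.
# Assumptions: PyTorch, Hugging Face transformers, "bert-base-chinese" checkpoint.
import torch
import torch.nn as nn
from transformers import BertModel

class TCNBlock(nn.Module):
    """One dilated causal 1-D convolution block with a residual connection."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation        # causal padding amount
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=self.pad, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x):                              # x: (batch, channels, seq_len)
        out = self.conv(x)
        if self.pad:
            out = out[:, :, :-self.pad]                # trim right side to keep length
        return self.relu(out + x)                      # residual connection

class BertTCNAttention(nn.Module):
    def __init__(self, num_classes=2, hidden=768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.tcn = nn.Sequential(TCNBlock(hidden, dilation=1),
                                 TCNBlock(hidden, dilation=2))
        self.attn = nn.Linear(hidden, 1)               # word-level attention scores
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # 1. Contextual word vectors from BERT.
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # 2. Further contextual features from the TCN (Conv1d expects channels first).
        h = self.tcn(h.transpose(1, 2)).transpose(1, 2)
        # 3. Attention pooling: weight each word by its importance to sentiment.
        scores = self.attn(h).squeeze(-1)                        # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e9)   # ignore padding
        alpha = torch.softmax(scores, dim=-1).unsqueeze(-1)      # (batch, seq_len, 1)
        sentence = (alpha * h).sum(dim=1)                        # (batch, hidden)
        # 4. Classification head (logits; softmax is applied inside the loss).
        return self.classifier(sentence)
```

In this sketch the attention layer produces a normalized weight per token, so the sentence representation fed to the classifier emphasizes the words that contribute most to the sentiment, which is the behavior the abstract attributes to the added attention mechanism.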
Author
ZHANG Jian (School of Mathematics & Physics, Anhui Jianzhu University, Hefei, Anhui 230601, China)
Source
Information & Computer (《信息与电脑》), 2022, No. 22, pp. 77-82 (6 pages)
Keywords
deep learning
sentiment analysis
attention mechanism