4 articles found
A Text Keyword Extraction Method Based on BERT and LightGBM (Cited: 5)
Authors: He Chuanpeng, Yin Ling, Huang Bo, Wang Mingsheng, Guo Ruyan, Zhang Shuai, Ju Jiaji. 《电子科技》 (Electronic Science and Technology), 2023, Issue 3, pp. 7-13.
Traditional text keyword extraction methods ignore contextual semantic information and cannot resolve polysemy, so their extraction performance is unsatisfactory. Building on LDA and BERT, this paper proposes the LDA-BERT-LightGBM (LB-LightGBM) model. The method uses the LDA topic model to obtain each review's topics and their word distributions, filters out candidate keywords by a threshold, concatenates the filtered words with the original review text, and feeds the result into BERT for word-vector training to obtain topic-aware text word vectors, thereby recasting keyword extraction as a binary classification problem solved with the LightGBM algorithm. Experiments compare the precision (P), recall (R), and F1 of TextRank, LDA, LightGBM, and the proposed LB-LightGBM model on keyword extraction. The results show that for Top-N values of 3 to 6, the average F1 improves by 3.5% over the best baseline; overall, the proposed method outperforms the compared methods and identifies text keywords more accurately.
Keywords: topic model, word vector, BERT, LightGBM, candidate keywords, keyword extraction, text topic
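The LB-LightGBM pipeline above recasts keyword extraction as binary classification over candidate words. A rough sketch of that framing follows; this is not the paper's implementation — the BERT embeddings and the trained LightGBM classifier are replaced here by toy vectors and a fixed linear scorer, both assumptions for illustration only:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def keyword_candidates(doc_vec, cand_vecs, topic_words, threshold=0.5):
    """Binary-classification framing of keyword extraction: each candidate
    word gets a feature-based score, and words scoring above `threshold`
    are labelled keywords (1), the rest non-keywords (0). In the paper
    this decision is made by a trained LightGBM model over BERT vectors;
    the fixed linear scorer below merely stands in for it."""
    labels = {}
    for word, vec in cand_vecs.items():
        sim = cosine(doc_vec, vec)                       # closeness to the document vector
        in_topic = 1.0 if word in topic_words else 0.0   # LDA topic membership feature
        score = 0.7 * sim + 0.3 * in_topic
        labels[word] = 1 if score >= threshold else 0
    return labels

# Toy usage: one content word close to the document vector, one stopword.
doc = [1.0, 0.0, 1.0]
cands = {"bert": [1.0, 0.0, 0.9], "the": [0.0, 1.0, 0.0]}
result = keyword_candidates(doc, cands, topic_words={"bert"})
```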
A Text Sentiment Analysis Model Based on BERT and Loc-Attention
Authors: He Chuanpeng, Huang Bo, Zhou Keliang, Yin Ling, Wang Mingsheng, Li Peipei. 《传感器与微系统》 (Transducer and Microsystem Technologies), CSCD, Peking University Core, 2023, Issue 12, pp. 146-150.
Traditional sentiment analysis methods do not consider the position (Loc) of text relative to topic words, so their classification performance is unsatisfactory. This paper proposes BL-LABL, a text sentiment analysis method based on BERT and an LDA-driven Loc-attention bidirectional long short-term memory (Bi-LSTM) model. The LDA topic model yields each review's topics and their word distributions; the filtered topic words are concatenated with the original text and fed into BERT for word-vector training, producing text word vectors enriched with topic information and topic-word vectors enriched with text information. A Bi-LSTM network then incorporates the text's positional weights, and the final text feature representation is the weighted sum of the positional and attention weights; finally, a SoftMax classifier outputs the sentiment class. Experiments on two datasets show that, compared with traditional attention-based sentiment classification models, the model effectively improves classification performance.
Keywords: sentiment analysis, topic model, BERT model, text features, position weight, attention
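The position-weight idea above can be sketched as a distance decay around the topic word. The abstract does not spell out the exact weighting function, so the linear form below is an assumption chosen purely for illustration:

```python
def position_weights(tokens, topic_word):
    """Linear distance decay: tokens adjacent to the topic word get a
    weight near 1, tokens far away decay toward 0. This is one common
    positional weighting scheme, not necessarily the paper's exact one."""
    n = len(tokens)
    p = tokens.index(topic_word)          # position of the topic word
    return [1.0 - abs(i - p) / n for i in range(n)]

# Toy usage: "great" is the topic word, so it carries the largest weight.
weights = position_weights(["the", "food", "was", "great"], "great")
```

In the full model these weights would multiply the Bi-LSTM hidden states before the attention-weighted sum, so opinion words near the topic word contribute more to the final representation.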
A Short Text Classification Model for Electrical Equipment Defects Based on Contextual Features (Cited: 1)
Authors: LI Peipei, ZENG Guohui, HUANG Bo, YIN Ling, SHI Zhicai, HE Chuanpeng, LIU Wei, CHEN Yu. Wuhan University Journal of Natural Sciences, CAS, CSCD, 2022, Issue 6, pp. 465-475.
The defect information of substation equipment is usually recorded in the form of text. Because equipment inspectors use irregular spoken expressions, the defect records lack sufficient contextual information and are often ambiguous. To address the problem of sparse data lacking semantic features during classification, a short text classification model for electrical equipment defects that fuses contextual features is proposed. The model uses bi-directional long short-term memory to capture the contextual semantics of short text data, and an attention mechanism is introduced to assign weights to different information in the context. In addition, a genetic algorithm optimizes the convolutional neural network parameters used to extract salient features. The experimental results show that the model effectively classifies power equipment defect texts. The model was also tested on an automotive parts repair dataset provided by the project partners, demonstrating that the method can be applied in specific industrial scenarios.
Keywords: short text classification, genetic algorithm, convolutional neural network, attention mechanism
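The genetic-algorithm step above searches for good CNN parameters. A minimal sketch of such a search follows, assuming a discrete space over kernel size and filter count (hypothetical names) and a stand-in fitness function; in practice the fitness would be validation accuracy from actually training the network:

```python
import random

def evolve(fitness, space, pop_size=8, generations=20, seed=0):
    """Minimal elitist genetic algorithm over discrete hyperparameter
    choices: keep the top half each generation (selection), build
    children by mixing two parents' genes (crossover), and occasionally
    resample one gene (mutation)."""
    rng = random.Random(seed)
    keys = list(space)
    pop = [{k: rng.choice(space[k]) for k in keys} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # selection: keep the fittest half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = {k: rng.choice((a[k], b[k])) for k in keys}  # crossover
            if rng.random() < 0.2:            # mutation
                k = rng.choice(keys)
                child[k] = rng.choice(space[k])
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy usage: the stand-in fitness peaks at kernel=3, filters=64.
space = {"kernel": [2, 3, 5], "filters": [32, 64, 128]}
fitness = lambda h: -abs(h["kernel"] - 3) - abs(h["filters"] - 64) / 32
best = evolve(fitness, space, seed=1)
```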
A Fault Diagnosis Model for Complex Industrial Process Based on Improved TCN and 1D CNN
Authors: WANG Mingsheng, HUANG Bo, HE Chuanpeng, LI Peipei, ZHANG Jiahao, CHEN Yu, TONG Jie. Wuhan University Journal of Natural Sciences, CAS, CSCD, 2022, Issue 6, pp. 453-464.
Fast and accurate fault diagnosis of strongly coupled, time-varying, multivariable complex industrial processes remains a challenging problem. We propose an industrial fault diagnosis model built on the temporal convolutional network (TCN) and the one-dimensional convolutional neural network (1D CNN). A batch normalization layer is added before the TCN layer, and the TCN activation function is changed from ReLU to LeakyReLU. To extract local correlations of features, a 1D convolution layer is added after the TCN layer, followed by a multi-head self-attention mechanism before the fully connected layer to strengthen the model's diagnostic ability. The extended Tennessee Eastman Process (TEP) dataset is used to evaluate the model's performance. The experimental results show high fault-recognition accuracy and good generalization, demonstrating the model's effectiveness. Its application to a partner project's diesel engine failure dataset further validates its effectiveness in industrial scenarios.
Keywords: fault diagnosis, temporal convolutional network, self-attention mechanism, convolutional neural network
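The TCN layer mentioned above is built from causal dilated convolutions: each output depends only on the current and past inputs, with the dilation widening the receptive field. A minimal single-channel sketch in plain Python (one kernel, no residual connection, normalization, or activation):

```python
def causal_dilated_conv1d(x, weights, dilation):
    """One causal dilated convolution, the core TCN operation.
    out[t] combines x[t], x[t-d], x[t-2d], ... with zero padding on the
    left, so no future sample leaks into the output at time t.
    `weights` are the kernel taps, newest sample last."""
    k = len(weights)
    out = []
    for t in range(len(x)):
        s = 0.0
        for j in range(k):
            idx = t - (k - 1 - j) * dilation   # tap j looks back (k-1-j)*d steps
            if idx >= 0:                        # implicit zero padding before t=0
                s += weights[j] * x[idx]
        out.append(s)
    return out

# Toy usage: a kernel that only reads two steps back with dilation 2,
# i.e. a pure delay of the input signal.
y = causal_dilated_conv1d([1, 2, 3, 4, 5], weights=[1.0, 0.0], dilation=2)
```

Stacking such layers with growing dilations (1, 2, 4, ...) is what lets a TCN cover long process histories with few layers, which is why it suits the slow dynamics of industrial time series.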