Journal Articles
4 articles found
1. Aspect-Level Sentiment Analysis Based on Deep Learning
Authors: Mengqi Zhang, Jiazhao Chai, Jianxiang Cao, Jialing Ji, Tong Yi. Computers, Materials & Continua (SCIE, EI), 2024, No. 3, pp. 3743-3762 (20 pages)
Abstract: In recent years, deep learning methods have developed rapidly and found application in many fields, including natural language processing. In the field of aspect-level sentiment analysis, deep learning methods can also greatly improve the performance of models. However, previous studies did not take into account the relationship between user feature extraction and contextual terms. To address this issue, we combine data feature extraction and deep learning to develop an aspect-level sentiment analysis method. Specifically, we design user comment feature extraction (UCFE) to distill salient features from users' historical comments and transform them into representative user feature vectors. Then, the aspect-sentence graph convolutional neural network (ASGCN) is used to incorporate innovative techniques for calculating adjacency matrices; meanwhile, ASGCN emphasizes capturing nuanced semantics within relationships among aspect words and syntactic dependency types. Afterward, three embedding methods are devised to embed the user feature vector into the ASGCN model. The empirical validations verify the effectiveness of these models, which consistently surpass conventional benchmarks and reaffirm the indispensable role of deep learning in advancing sentiment analysis methodologies.
Keywords: aspect-level sentiment analysis; deep learning; graph convolutional neural network; user features; syntactic dependency tree
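The abstract does not give the model's equations, but the core idea of graph convolution over a dependency-based adjacency matrix with an injected user feature vector can be sketched as below. This is a minimal illustration, assuming PyTorch; the names `dependency_adjacency`, `AspectGCNLayer`, and all dimensions are hypothetical stand-ins, not the paper's actual UCFE or ASGCN components.

```python
# Hypothetical sketch: a dependency-based adjacency matrix and one GCN layer,
# with a user feature vector concatenated to each token state before projection.
import torch
import torch.nn as nn

def dependency_adjacency(edges, n, self_loops=True):
    """edges: list of (head, dependent) index pairs from a dependency parse."""
    adj = torch.zeros(n, n)
    for h, d in edges:
        adj[h, d] = 1.0
        adj[d, h] = 1.0          # treat the tree as undirected
    if self_loops:
        adj += torch.eye(n)
    # row-normalise so each node averages over its neighbours
    return adj / adj.sum(dim=-1, keepdim=True)

class AspectGCNLayer(nn.Module):
    def __init__(self, hidden_dim, user_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim + user_dim, hidden_dim)

    def forward(self, h, adj, user_vec):
        # h: (n, hidden_dim) token states; user_vec: (user_dim,) user feature vector
        u = user_vec.unsqueeze(0).expand(h.size(0), -1)
        h = torch.cat([h, u], dim=-1)            # one possible way to inject user features
        return torch.relu(adj @ self.proj(h))    # neighbourhood aggregation

# toy usage: 5 tokens, edges from a parse of "the battery life is great"
edges = [(2, 0), (2, 1), (4, 2), (4, 3)]
adj = dependency_adjacency(edges, 5)
layer = AspectGCNLayer(hidden_dim=8, user_dim=4)
out = layer(torch.randn(5, 8), adj, torch.randn(4))
print(out.shape)  # torch.Size([5, 8])
```

Concatenation is only one of several plausible ways to embed the user vector; the paper reports devising three such embedding methods.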
2. Aspect-Level Sentiment Analysis Incorporating Semantic and Syntactic Information
Authors: Jiachen Yang, Yegang Li, Hao Zhang, Junpeng Hu, Rujiang Bai. Journal of Computer and Communications, 2024, No. 1, pp. 191-207 (17 pages)
Abstract: Aiming at the problem that existing models in aspect-level sentiment analysis cannot fully and effectively utilize sentence semantic and syntactic structure information, this paper proposes a graph neural network-based aspect-level sentiment classification model. Self-attention, aspect-word multi-head attention, and dependency syntactic relations are fused, and the node representations are enhanced with graph convolutional networks so that the model can fully learn the global semantic and syntactic structural information of sentences. Experimental results show that the model performs well on three public benchmark datasets, Rest14, Lap14, and Twitter, improving the accuracy of sentiment classification.
Keywords: aspect-level sentiment analysis; attention mechanisms; dependency syntactic trees; graph convolutional neural networks
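As a rough illustration of the fusion described in the abstract, the sketch below combines a self-attention view, an aspect-conditioned attention view, and a GCN pass over a dependency adjacency matrix. It is a hypothetical PyTorch sketch, not the paper's architecture; `FusionBlock`, the merge-by-concatenation step, and the toy shapes are all assumptions.

```python
# Hypothetical sketch of the fusion idea: token states are refined by multi-head
# self-attention, aspect-conditioned attention, and a GCN pass over the
# dependency adjacency matrix, then the three views are merged.
import torch
import torch.nn as nn

class FusionBlock(nn.Module):
    def __init__(self, dim, heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.aspect_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.gcn = nn.Linear(dim, dim)
        self.merge = nn.Linear(3 * dim, dim)

    def forward(self, h, aspect, adj):
        # h: (1, n, dim) token states; aspect: (1, m, dim) aspect-word states;
        # adj: (n, n) normalised dependency adjacency
        sa, _ = self.self_attn(h, h, h)          # global semantic view
        aa, _ = self.aspect_attn(aspect, h, h)   # aspect-conditioned view
        aa = aa.mean(dim=1, keepdim=True).expand_as(h)
        gc = torch.relu(adj @ self.gcn(h.squeeze(0))).unsqueeze(0)  # syntactic view
        return self.merge(torch.cat([sa, aa, gc], dim=-1))

block = FusionBlock(dim=16)
h, aspect = torch.randn(1, 6, 16), torch.randn(1, 2, 16)
adj = torch.eye(6)
print(block(h, aspect, adj).shape)  # torch.Size([1, 6, 16])
```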
3. Fast Chinese syntactic parsing method based on conditional random fields
Authors: 韩磊, 罗森林, 陈倩柔, 潘丽敏. Journal of Beijing Institute of Technology (EI, CAS), 2015, No. 4, pp. 519-525 (7 pages)
Abstract: A fast method for phrase structure grammar analysis is proposed based on conditional random fields (CRF). The method trains several CRF classifiers to recognize the phrase nodes at different levels, and uses a bottom-up procedure to connect the recognized phrase nodes into the syntactic tree. On the basis of the Beijing Forest Studio Chinese tagged corpus, two experiments are designed to select the training parameters and verify the validity of the method. The results show that the method takes 78.98 ms and 4.63 ms, respectively, to train on and parse a Chinese sentence of 17.9 words. The method offers a new way to parse phrase structure grammar for Chinese, with good generalization ability and fast speed.
Keywords: phrase structure grammar; syntactic tree; syntactic parsing; conditional random field
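The abstract's bottom-up assembly step (connecting phrase nodes recognized level by level into a tree) can be illustrated as follows. This is a minimal Python sketch under the assumption that each level's CRF classifier emits BIO-style bracket labels; the `Node` class, the label scheme, and the toy sentence are illustrative, not the paper's actual representation.

```python
# Hypothetical sketch of the bottom-up assembly step: given phrase-node labels
# predicted level by level (stubbed here where the per-level CRF classifiers
# would run), adjacent children carrying the same bracket are merged into a
# parent node until a single root remains.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)
    word: str = ""

def assemble(words, level_labels):
    """level_labels: one label list per level; 'B-XP'/'I-XP' open and extend a
    phrase, 'O' copies the node up unchanged (stand-in for the CRF outputs)."""
    nodes = [Node(label=t, word=w) for w, t in words]
    for labels in level_labels:
        merged, i = [], 0
        while i < len(nodes):
            if labels[i].startswith("B-"):
                phrase = Node(label=labels[i][2:], children=[nodes[i]])
                i += 1
                while i < len(nodes) and labels[i] == "I-" + phrase.label:
                    phrase.children.append(nodes[i])
                    i += 1
                merged.append(phrase)
            else:                      # 'O': pass the node through to the next level
                merged.append(nodes[i])
                i += 1
        nodes = merged
    return nodes

# toy run for "我 喜欢 自然 语言 处理" (POS tags stand in for word-level nodes)
words = [("我", "PN"), ("喜欢", "VV"), ("自然", "NN"), ("语言", "NN"), ("处理", "NN")]
levels = [["O", "O", "B-NP", "I-NP", "I-NP"],
          ["O", "B-VP", "I-VP"],
          ["B-IP", "I-IP"]]
print(assemble(words, levels)[0].label)   # IP
```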
4. Neural Attentional Relation Extraction with Dual Dependency Trees (cited 1 time)
Authors: Dong Li, Zhi-Lei Lei, Bao-Yan Song, Wan-Ting Ji, Yue Kou. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2022, No. 6, pp. 1369-1381 (13 pages)
Abstract: Relation extraction has been widely used to find semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency tree based models adopt pruning strategies that are either too aggressive or too conservative, leading to insufficient semantic information or excessive noise in relation extraction models. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (DDT-REM), which takes advantage of both the syntactic dependency tree and the semantic dependency tree to capture syntactic features and semantic features, respectively. Specifically, we first propose a novel representation learning scheme to capture the dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to address semantic deficits. We design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves extraction accuracy. We conduct experimental studies on three real-world datasets. Compared with traditional methods, our method improves the F1 scores by 0.3, 0.1, and 1.6 on the three datasets, respectively.
Keywords: relation extraction; graph convolutional network (GCN); syntactic dependency tree; semantic dependency tree
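The dual-tree idea in the abstract (separate GCN branches over the syntactic and semantic dependency trees, later combined for entity-pair classification) might look roughly like the sketch below. It assumes PyTorch; `DualTreeGCN`, the gated fusion, and the single-layer branches are hypothetical simplifications rather than the DDT-REM architecture or its local-global attention mechanism.

```python
# Hypothetical sketch of the dual-tree idea: two GCN branches, one over the
# syntactic dependency adjacency and one over the semantic dependency adjacency,
# combined with a learned gate before the entity pair is scored against relations.
import torch
import torch.nn as nn

class DualTreeGCN(nn.Module):
    def __init__(self, dim, n_relations):
        super().__init__()
        self.syn_gcn = nn.Linear(dim, dim)
        self.sem_gcn = nn.Linear(dim, dim)
        self.gate = nn.Linear(2 * dim, dim)
        self.classifier = nn.Linear(2 * dim, n_relations)

    def forward(self, h, adj_syn, adj_sem, head_idx, tail_idx):
        # h: (n, dim) token states; adj_*: (n, n) normalised adjacency matrices
        hs = torch.relu(adj_syn @ self.syn_gcn(h))   # syntactic branch
        hm = torch.relu(adj_sem @ self.sem_gcn(h))   # semantic branch
        g = torch.sigmoid(self.gate(torch.cat([hs, hm], dim=-1)))
        fused = g * hs + (1 - g) * hm                # gated fusion of the two trees
        pair = torch.cat([fused[head_idx], fused[tail_idx]], dim=-1)
        return self.classifier(pair)                 # relation logits

model = DualTreeGCN(dim=16, n_relations=5)
h = torch.randn(7, 16)
adj = torch.eye(7)
print(model(h, adj, adj, head_idx=1, tail_idx=4).shape)  # torch.Size([5])
```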