Fund: This work is partly supported by the Fundamental Research Funds for the Central Universities (CUC230A013), by the Natural Science Foundation of Beijing Municipality (No. 4222038), and by the National Natural Science Foundation of China (Grant No. 62176240).
Abstract: In recent years, deep learning methods have developed rapidly and found application in many fields, including natural language processing. In aspect-level sentiment analysis, deep learning methods can also greatly improve model performance. However, previous studies did not take into account the relationship between user feature extraction and contextual terms. To address this issue, we combine data feature extraction with deep learning to develop an aspect-level sentiment analysis method. Specifically, we design user comment feature extraction (UCFE) to distill salient features from users' historical comments and transform them into representative user feature vectors. Then, the aspect-sentence graph convolutional neural network (ASGCN) is used to incorporate innovative techniques for calculating adjacency matrices; meanwhile, ASGCN emphasizes capturing the nuanced semantics of the relationships among aspect words and syntactic dependency types. Afterward, three embedding methods are devised to embed the user feature vector into the ASGCN model. Empirical validations verify the effectiveness of these models, which consistently surpass conventional benchmarks and reaffirm the indispensable role of deep learning in advancing sentiment analysis methodologies.
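To make the above concrete, the following is a minimal PyTorch sketch of one graph-convolution step over a dependency-based adjacency matrix in which a user feature vector is concatenated to every token state before propagation. The module name UserAwareASGCNLayer, the concatenation-based embedding scheme, and all dimensions are illustrative assumptions for this listing, not the authors' implementation.

# Illustrative sketch only: a single GCN layer over a dependency adjacency
# matrix, with a user feature vector concatenated to every token state.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UserAwareASGCNLayer(nn.Module):
    def __init__(self, hidden_dim, user_dim):
        super().__init__()
        # Linear map applied after mixing token states with the user vector.
        self.proj = nn.Linear(hidden_dim + user_dim, hidden_dim)

    def forward(self, h, adj, user_vec):
        # h:        (batch, seq_len, hidden_dim) contextual token states
        # adj:      (batch, seq_len, seq_len) dependency-based adjacency matrix
        # user_vec: (batch, user_dim) vector distilled from historical comments
        seq_len = h.size(1)
        u = user_vec.unsqueeze(1).expand(-1, seq_len, -1)   # broadcast user features
        h = torch.cat([h, u], dim=-1)                       # one possible embedding scheme
        # Normalised neighbourhood aggregation (GCN-style propagation).
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        h = torch.bmm(adj, self.proj(h)) / deg
        return F.relu(h)

# Toy usage with random tensors.
layer = UserAwareASGCNLayer(hidden_dim=64, user_dim=16)
h = torch.randn(2, 10, 64)
adj = (torch.rand(2, 10, 10) > 0.7).float()
user_vec = torch.randn(2, 16)
out = layer(h, adj, user_vec)     # (2, 10, 64)

The abstract mentions three embedding methods for the user feature vector; concatenation at the GCN input, as above, is only one plausible choice.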
Abstract: To address the problem that existing aspect-level sentiment analysis models cannot fully and effectively utilize sentence semantics and syntactic structure information, this paper proposes a graph neural network-based aspect-level sentiment classification model. Self-attention, aspect-word multi-head attention, and syntactic dependency relations are fused, and the node representations are enhanced with graph convolutional networks so that the model can fully learn the global semantic and syntactic structural information of sentences. Experimental results show that the model performs well on three public benchmark datasets, Rest14, Lap14, and Twitter, improving sentiment classification accuracy.
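As a rough reading of "fusing self-attention, aspect-word multi-head attention and dependency relations, then enhancing node representations with a GCN", the sketch below combines the three signals by summation before a single graph-convolution step. The fusion-by-summation, the mean pooling, and all hyperparameters are assumptions made for brevity rather than the paper's architecture.

# Illustrative sketch: self-attention over the sentence, multi-head attention
# queried against the aspect words, and a GCN step over the dependency
# adjacency matrix, fused by simple summation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusedAspectGCN(nn.Module):
    def __init__(self, dim, heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.aspect_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.gcn = nn.Linear(dim, dim)

    def forward(self, tokens, aspect, adj):
        # tokens: (batch, seq_len, dim), aspect: (batch, asp_len, dim)
        # adj:    (batch, seq_len, seq_len) syntactic dependency adjacency
        sa, _ = self.self_attn(tokens, tokens, tokens)       # global semantics
        aa, _ = self.aspect_attn(tokens, aspect, aspect)     # aspect-focused view
        fused = tokens + sa + aa                             # one simple fusion choice
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        nodes = F.relu(torch.bmm(adj, self.gcn(fused)) / deg)  # syntax-enhanced nodes
        return nodes.mean(dim=1)                             # sentence representation

model = FusedAspectGCN(dim=64)
tokens = torch.randn(2, 12, 64)
aspect = torch.randn(2, 2, 64)
adj = torch.eye(12).repeat(2, 1, 1)
sent_repr = model(tokens, aspect, adj)   # (2, 64)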
Fund: Supported by the Science and Technology Innovation Plan of Beijing Institute of Technology (2013).
Abstract: A fast method for phrase structure grammar analysis is proposed based on conditional random fields (CRF). The method trains several CRF classifiers to recognize phrase nodes at different levels and connects the recognized phrase nodes bottom-up to construct the syntactic tree. On the basis of the Beijing Forest Studio Chinese tagged corpus, two experiments are designed to select the training parameters and verify the validity of the method. The results show that the method takes 78.98 ms and 4.63 ms, respectively, to train and test on a Chinese sentence of 17.9 words. The method offers a new way to parse phrase structure grammar for Chinese, with good generalization ability and high speed.
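The bottom-up procedure can be read as: train one sequence labeller per phrase level, tag the current level, merge the recognised phrases into single nodes, and pass them to the next level. Below is a small control-flow sketch using the sklearn-crfsuite package; the feature function, the BIO merging rule, and the training data format are invented placeholders, not the paper's feature set or the Beijing Forest Studio corpus.

# Illustrative sketch of a level-by-level CRF cascade for phrase recognition.
# Each level is an independent CRF sequence labeller; recognised phrases are
# merged into single nodes before the next level is tagged.
import sklearn_crfsuite

def node_features(nodes):
    # Placeholder feature extractor: one feature dict per node in the sequence.
    return [{"form": n, "is_upper": n.isupper()} for n in nodes]

def merge_phrases(nodes, labels):
    # Collapse B-/I- chunks into single higher-level nodes (simplified BIO merge).
    merged, current = [], []
    for n, y in zip(nodes, labels):
        if y.startswith("B-") or y == "O":
            if current:
                merged.append("_".join(current))
                current = []
            (current if y.startswith("B-") else merged).append(n)
        else:                       # "I-" continues the open phrase
            current.append(n)
    if current:
        merged.append("_".join(current))
    return merged

def train_cascade(levels):
    # levels: list over phrase levels; each level is (node_sequences, label_sequences)
    classifiers = []
    for X_nodes, y in levels:
        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=100)
        crf.fit([node_features(seq) for seq in X_nodes], y)
        classifiers.append(crf)
    return classifiers

def parse(sentence_nodes, classifiers):
    # Bottom-up pass: tag, merge, and hand the merged nodes to the next level.
    nodes = sentence_nodes
    for crf in classifiers:
        labels = crf.predict([node_features(nodes)])[0]
        nodes = merge_phrases(nodes, labels)
    return nodes      # remaining nodes after the final level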
Fund: Supported by the National Science and Technology Major Project of the Ministry of Science and Technology of China (Secret 501).
Abstract: Relation extraction has been widely used to find semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency tree based models adopt pruning strategies that are either too aggressive or too conservative, leading to insufficient semantic information or excessive noise in relation extraction models. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (DDT-REM), which takes advantage of both the syntactic dependency tree and the semantic dependency tree to capture syntactic and semantic features, respectively. Specifically, we first propose novel representation learning to capture the dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to address semantic deficits. We design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves extraction accuracy. We conduct experimental studies on three real-world datasets. Compared with traditional methods, our method improves the F1 scores by 0.3, 0.1, and 1.6 on the three datasets, respectively.
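To illustrate the dual-tree idea, the sketch below runs one GCN step over a syntactic adjacency matrix and one over a semantic adjacency matrix, fuses the two views with a learned gate, and attention-pools the result into a relation representation. The gating and pooling choices stand in for DDT-REM's local-global attention and GCN extension and are assumptions for illustration only.

# Illustrative sketch: parallel GCN passes over a syntactic dependency tree and
# a semantic dependency tree, fused with a learned gate and attention-pooled
# into a relation representation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualTreeGCN(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.syn_gcn = nn.Linear(dim, dim)
        self.sem_gcn = nn.Linear(dim, dim)
        self.gate = nn.Linear(2 * dim, dim)
        self.attn = nn.Linear(dim, 1)        # scores for attention pooling

    @staticmethod
    def propagate(h, adj, linear):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        return F.relu(torch.bmm(adj, linear(h)) / deg)

    def forward(self, h, syn_adj, sem_adj):
        # h: (batch, seq_len, dim); syn_adj / sem_adj: (batch, seq_len, seq_len)
        h_syn = self.propagate(h, syn_adj, self.syn_gcn)     # syntactic view
        h_sem = self.propagate(h, sem_adj, self.sem_gcn)     # semantic view
        g = torch.sigmoid(self.gate(torch.cat([h_syn, h_sem], dim=-1)))
        fused = g * h_syn + (1 - g) * h_sem                  # gated fusion of both trees
        weights = torch.softmax(self.attn(fused), dim=1)     # (batch, seq_len, 1)
        return (weights * fused).sum(dim=1)                  # relation representation

model = DualTreeGCN(dim=64)
h = torch.randn(2, 15, 64)
syn_adj = (torch.rand(2, 15, 15) > 0.8).float()
sem_adj = (torch.rand(2, 15, 15) > 0.8).float()
rel_repr = model(h, syn_adj, sem_adj)    # (2, 64)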