Journal Articles
4 articles found
1. Neural Attentional Relation Extraction with Dual Dependency Trees
Authors: Li Dong, Lei Zhilei, Song Baoyan, Ji Wanting, Kou Yue. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2022, Issue 6, pp. 1369-1381 (13 pages).
Relation extraction has been widely used to find semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency tree based models adopt pruning strategies that are too aggressive or too conservative, leading to insufficient semantic information or excessive noise in relation extraction models. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (DDT-REM), which takes advantage of both the syntactic dependency tree and the semantic dependency tree to capture syntactic features and semantic features, respectively. Specifically, we first propose novel representation learning to capture the dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to solve semantic deficits. We design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves the extraction accuracy. We conduct experimental studies based on three real-world datasets. Compared with traditional methods, our method improves the F1 scores by 0.3, 0.1 and 1.6 on the three datasets, respectively.
Keywords: relation extraction, graph convolutional network (GCN), syntactic dependency tree, semantic dependency tree
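For orientation, the sketch below shows the building block this paper extends: a graph convolutional layer applied over a dependency-tree adjacency matrix. The class name, mean-aggregation rule, and toy tensors are illustrative assumptions, not the DDT-REM implementation.

```python
import torch
import torch.nn as nn

class DependencyGCNLayer(nn.Module):
    """One GCN layer that aggregates token features over dependency-tree neighbours."""
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, seq_len, dim) token representations
        # adj: (batch, seq_len, seq_len) dependency adjacency with self-loops
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)   # node degrees
        agg = torch.bmm(adj, self.linear(h)) / deg          # mean over neighbours
        return torch.relu(agg)

# Toy usage: 5 tokens, hidden size 16, chain-shaped dependency edges.
h = torch.randn(1, 5, 16)
adj = torch.eye(5).unsqueeze(0)
for i in range(4):
    adj[0, i, i + 1] = adj[0, i + 1, i] = 1.0
print(DependencyGCNLayer(16)(h, adj).shape)  # torch.Size([1, 5, 16])
```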
2. Aspect-Level Sentiment Analysis Based on Deep Learning
Authors: Mengqi Zhang, Jiazhao Chai, Jianxiang Cao, Jialing Ji, Tong Yi. Computers, Materials & Continua (SCIE, EI), 2024, Issue 3, pp. 3743-3762 (20 pages).
In recent years, deep learning methods have developed rapidly and found application in many fields, including natural language processing. In aspect-level sentiment analysis, deep learning methods can also greatly improve model performance. However, previous studies did not take into account the relationship between user feature extraction and contextual terms. To address this issue, we combine data feature extraction and deep learning to develop an aspect-level sentiment analysis method. Specifically, we design user comment feature extraction (UCFE) to distill salient features from users' historical comments and transform them into representative user feature vectors. Then, the aspect-sentence graph convolutional neural network (ASGCN) is used to incorporate innovative techniques for calculating adjacency matrices; meanwhile, ASGCN emphasizes capturing nuanced semantics within relationships among aspect words and syntactic dependency types. Afterward, three embedding methods are devised to embed the user feature vector into the ASGCN model. Empirical validations verify the effectiveness of these models, which consistently surpass conventional benchmarks and reaffirm the indispensable role of deep learning in advancing sentiment analysis methodologies.
Keywords: aspect-level sentiment analysis, deep learning, graph convolutional neural network, user features, syntactic dependency tree
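The abstract mentions embedding a user feature vector into the ASGCN model. One plausible variant, concatenating the user vector onto every token representation before the graph encoder, is sketched below; the module name and this particular fusion rule are assumptions for illustration only, not one of the paper's three embedding methods.

```python
import torch
import torch.nn as nn

class UserAwareFusion(nn.Module):
    """Concatenate a per-user vector onto every token, then project back to token_dim."""
    def __init__(self, token_dim: int, user_dim: int):
        super().__init__()
        self.proj = nn.Linear(token_dim + user_dim, token_dim)

    def forward(self, tokens: torch.Tensor, user_vec: torch.Tensor) -> torch.Tensor:
        # tokens:   (batch, seq_len, token_dim) sentence representations
        # user_vec: (batch, user_dim) summary of a user's historical comments
        expanded = user_vec.unsqueeze(1).expand(-1, tokens.size(1), -1)
        return torch.relu(self.proj(torch.cat([tokens, expanded], dim=-1)))

# Toy usage: batch of 2 sentences, 10 tokens each, 64-dim tokens, 32-dim user vectors.
tokens, user_vec = torch.randn(2, 10, 64), torch.randn(2, 32)
print(UserAwareFusion(64, 32)(tokens, user_vec).shape)  # torch.Size([2, 10, 64])
```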
3. Graph Convolutional Networks Embedding Textual Structure Information for Relation Extraction
Authors: Chuyuan Wei, Jinzhe Li, Zhiyuan Wang, Shanshan Wan, Maozu Guo. Computers, Materials & Continua (SCIE, EI), 2024, Issue 5, pp. 3299-3314 (16 pages).
Deep neural network-based relation extraction research has made significant progress in recent years, and it provides data support for many natural language processing downstream tasks such as building knowledge graphs, sentiment analysis and question-answering systems. However, previous studies ignored much unused structural information in sentences that could enhance the performance of the relation extraction task. Moreover, most existing dependency-based models utilize self-attention to distinguish the importance of context, which hardly deals with multiple-structure information. To efficiently leverage multiple structure information, this paper proposes a dynamic structure attention mechanism model based on textual structure information, which deeply integrates word embedding, named entity recognition labels, part of speech, dependency tree and dependency type into a graph convolutional network. Specifically, our model extracts text features of different structures from the input sentence. The Textual Structure information Graph Convolutional Network employs the dynamic structure attention mechanism to learn multi-structure attention, effectively distinguishing important contextual features in the various structural information. In addition, multi-structure weights are carefully designed as a merging mechanism over the different structure attentions to dynamically adjust the final attention. This paper combines these features and trains a graph convolutional network for relation extraction. We experiment on supervised relation extraction datasets including SemEval 2010 Task 8, TACRED, TACREV, and Re-TACRED; the results significantly outperform previous methods.
Keywords: relation extraction, graph convolutional neural networks, dependency tree, dynamic structure attention
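A minimal sketch of the general idea of merging attention maps computed from several textual structures (for example dependency tree, part of speech, and NER labels) with learnable weights. The softmax-weighted sum and all names here are assumptions, not the paper's exact dynamic structure attention design.

```python
import torch
import torch.nn as nn

class MultiStructureAttentionMerge(nn.Module):
    """Merge per-structure attention maps with learnable softmax-normalized weights."""
    def __init__(self, num_structures: int):
        super().__init__()
        self.weights = nn.Parameter(torch.zeros(num_structures))  # one weight per structure

    def forward(self, attn_maps: torch.Tensor) -> torch.Tensor:
        # attn_maps: (num_structures, batch, seq_len, seq_len) attention per structure
        w = torch.softmax(self.weights, dim=0).view(-1, 1, 1, 1)
        return (w * attn_maps).sum(dim=0)   # merged (batch, seq_len, seq_len) attention

# Toy usage: 3 structures, batch of 2 sentences, 8 tokens each.
attn = torch.softmax(torch.randn(3, 2, 8, 8), dim=-1)
print(MultiStructureAttentionMerge(3)(attn).shape)  # torch.Size([2, 8, 8])
```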
4. Aspect-Level Sentiment Analysis Incorporating Semantic and Syntactic Information
Authors: Jiachen Yang, Yegang Li, Hao Zhang, Junpeng Hu, Rujiang Bai. Journal of Computer and Communications, 2024, Issue 1, pp. 191-207 (17 pages).
To address the problem that existing aspect-level sentiment analysis models cannot fully and effectively utilize sentence semantic and syntactic structure information, this paper proposes a graph neural network-based aspect-level sentiment classification model. Self-attention, aspect-word multi-head attention and dependency syntactic relations are fused, and the node representations are enhanced with graph convolutional networks so that the model can fully learn the global semantic and syntactic structural information of sentences. Experimental results show that the model performs well on three public benchmark datasets, Rest14, Lap14, and Twitter, improving the accuracy of sentiment classification.
Keywords: aspect-level sentiment analysis, attention mechanisms, dependency syntactic trees, graph convolutional neural networks
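As an illustration of one component the abstract names, the snippet below applies multi-head attention with aspect-word representations as the query over the sentence tokens; the dimensions and module choice are assumptions rather than the authors' configuration.

```python
import torch
import torch.nn as nn

# Toy inputs: 12 context tokens and 2 aspect tokens, hidden size 64.
sentence = torch.randn(1, 12, 64)
aspect = torch.randn(1, 2, 64)

# Aspect-word multi-head attention: the aspect tokens query the sentence context.
mha = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
aspect_context, attn_weights = mha(query=aspect, key=sentence, value=sentence)

print(aspect_context.shape)  # torch.Size([1, 2, 64]) aspect-aware representations
print(attn_weights.shape)    # torch.Size([1, 2, 12]) attention over context tokens
```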