Funding: supported by the project of the Ministry of Education (research on the construction and application of a quantum encryption cloud service system based on big data analysis of web content, 2019JB328L06) and the scientific research planning project of the Jilin Provincial Education Department (construction and application of a medical knowledge graph for chronic diseases of the elderly, JKH20210614KJ).
Abstract: To alleviate the under-utilization of features in sentence-level relation extraction, which limits the performance of the pre-trained language model and leaves the feature vectors underexploited, a sentence-level relation extraction method based on added prompt information and feature reuse is proposed. First, in addition to the pair of nominals and the sentence information, a piece of prompt information is added, so that the overall input consists of sentence information, entity-pair information, and prompt information; these features are then encoded by the pre-trained language model RoBERTa. Moreover, a BiGRU is introduced into the neural network built on the pre-trained language model to extract information, and the encoded features are passed through this network to form several sets of feature vectors. These feature vectors are then reused in different combinations to form multiple outputs, which are aggregated by ensemble-learning soft voting to perform relation classification. In addition, the sum of cross-entropy, KL divergence, and negative log-likelihood losses is used as the final loss function. In comparison experiments, the model based on added prompt information and feature reuse achieves higher results on the SemEval-2010 Task 8 relation dataset.
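The soft-voting aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each reused feature combination yields one output head producing a normalized probability distribution over the relation classes, and the function name and example numbers are hypothetical.

```python
import numpy as np

def soft_vote(head_probs):
    """Average the class-probability distributions produced by several
    output heads (soft voting) and return the winning relation class
    together with the averaged distribution."""
    avg = np.mean(np.stack(head_probs), axis=0)
    return int(np.argmax(avg)), avg

# Three hypothetical output heads scoring four relation classes
heads = [
    np.array([0.6, 0.2, 0.1, 0.1]),
    np.array([0.3, 0.4, 0.2, 0.1]),
    np.array([0.5, 0.3, 0.1, 0.1]),
]
label, avg_probs = soft_vote(heads)  # label is the argmax of the mean
```

Because each head sees a different combination of the reused feature vectors, averaging their distributions lets the ensemble smooth over errors that any single combination makes.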