Abstract: There is a growing amount of data uploaded to the internet every day, and it is important to understand the volume of that data in order to find better schemes for processing it. However, the volume of internet data is beyond the processing capabilities of the current internet infrastructure. Therefore, engineering work that uses technology to organize and analyze information and to extract useful information is of interest in both industry and academia. The goal of this paper is to explore entity relationship extraction based on deep learning, introduce semantic knowledge through a pretrained language model, and develop an advanced entity relationship information extraction method that combines the Robustly Optimized BERT Approach (RoBERTa) with multi-task learning and incorporates intelligent characteristics from the field of linguistics, called Robustly Optimized BERT Approach + Multi-Task Learning (RoBERTa+MTL). To improve the effectiveness of the model, multi-task learning is used to exploit the supervision information of auxiliary tasks. Experimental results show that our method achieves an accuracy of 88.95% on entity relationship extraction, and 86.35% accuracy after being further combined with multi-task learning.
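The RoBERTa+MTL approach described above can be pictured as a shared RoBERTa encoder feeding a main relation-classification head plus an auxiliary head whose loss is added during training. The sketch below is a minimal illustration of that pattern using the Hugging Face transformers library; the auxiliary entity-tagging task, head sizes, and loss weight are hypothetical choices for illustration, not the paper's exact configuration.

```python
# Minimal sketch of a RoBERTa + multi-task learning (RoBERTa+MTL) setup for
# relation extraction. The auxiliary task (token-level entity tagging), head
# dimensions, and loss weighting are illustrative assumptions, not the
# authors' exact architecture.
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizerFast


class RobertaMTLRelationExtractor(nn.Module):
    def __init__(self, num_relations: int, num_entity_tags: int,
                 model_name: str = "roberta-base", aux_weight: float = 0.5):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained(model_name)  # shared encoder
        hidden = self.encoder.config.hidden_size
        self.relation_head = nn.Linear(hidden, num_relations)    # main task: relation classification
        self.tagging_head = nn.Linear(hidden, num_entity_tags)   # auxiliary task: entity tagging
        self.aux_weight = aux_weight                              # weight of the auxiliary loss

    def forward(self, input_ids, attention_mask,
                relation_labels=None, tag_labels=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state        # (batch, seq_len, hidden)
        sentence_state = token_states[:, 0]         # <s> token as the sentence representation
        relation_logits = self.relation_head(sentence_state)
        tag_logits = self.tagging_head(token_states)

        loss = None
        if relation_labels is not None and tag_labels is not None:
            ce = nn.CrossEntropyLoss(ignore_index=-100)
            main_loss = ce(relation_logits, relation_labels)
            aux_loss = ce(tag_logits.view(-1, tag_logits.size(-1)), tag_labels.view(-1))
            loss = main_loss + self.aux_weight * aux_loss  # joint multi-task objective
        return relation_logits, tag_logits, loss


if __name__ == "__main__":
    tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
    model = RobertaMTLRelationExtractor(num_relations=10, num_entity_tags=5)
    batch = tokenizer(["Marie Curie was born in Warsaw."],
                      return_tensors="pt", padding=True)
    relation_logits, tag_logits, _ = model(**batch)
    print(relation_logits.shape, tag_logits.shape)
```

Sharing the encoder lets the auxiliary supervision shape the representations used by the main relation head, which is the usual rationale for adding multi-task learning in this kind of extraction setting.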