
Relation Recognition of Unmarked Complex Sentences Based on Feature Fusion
Abstract: Because unmarked complex sentences lack the assistance of relation words, recognizing the relations they express is a relatively difficult task in natural language processing. Part-of-speech features are integrated into the word vectors, and word vector representations containing this external feature are obtained through training. By combining the BERT model and the BiLSTM model, character vectors, word vectors, and part-of-speech vectors are trained jointly, and polarity feature information captured by a BiLSTM and dependency-syntax feature information captured by a CNN are added at the feature-fusion layer. Experimental results show that adding these features and combining multiple deep learning models achieves good performance on Chinese complex-sentence classification: compared with the baseline models, both macro-F1 and micro-F1 are improved, reaching a micro-F1 of 83.67% on the top-level classification and 68.28% on the second-level classification.
Authors: YANG Jincai; MA Chen; XIAO Ming (School of Computer Science, Central China Normal University, Wuhan 430079, China; Research Center for Language and Language Education, Central China Normal University, Wuhan 430079, China)
Source: Computer Science (《计算机科学》), CSCD / Peking University core journal, 2023, Issue S02, pp. 57-62 (6 pages)
Funding: National Social Science Fund of China (19BYY092).
Keywords: Unmarked complex sentence; BERT; Feature fusion; Deep learning
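
For illustration, the sketch below shows one way the feature-fusion architecture described in the abstract could be wired up in PyTorch. It is a minimal sketch, not the paper's implementation: the class name, vocabulary sizes, embedding dimensions, pooling choices, and the simple concatenation-based fusion are all assumptions made for this example, and the token-level inputs (word segmentation, POS tags, polarity labels, dependency relations) are assumed to come from external tools and to be aligned to a common sequence length.

```python
# Minimal, illustrative sketch of a feature-fusion classifier for
# complex-sentence relation recognition. Dimensions and fusion strategy
# are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn
from transformers import BertModel


class FeatureFusionClassifier(nn.Module):
    def __init__(self, num_classes, word_vocab=50000, pos_vocab=60,
                 polarity_vocab=5, dep_vocab=50,
                 word_dim=300, feat_dim=50, hidden=128):
        super().__init__()
        # Character-level contextual representation from Chinese BERT.
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # Word embeddings with part-of-speech embeddings fused in by concatenation,
        # then encoded by a BiLSTM.
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.pos_emb = nn.Embedding(pos_vocab, feat_dim, padding_idx=0)
        self.word_lstm = nn.LSTM(word_dim + feat_dim, hidden,
                                 batch_first=True, bidirectional=True)
        # Polarity features captured by a separate BiLSTM.
        self.pol_emb = nn.Embedding(polarity_vocab, feat_dim, padding_idx=0)
        self.pol_lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        # Dependency-syntax features captured by a 1-D CNN over relation labels.
        self.dep_emb = nn.Embedding(dep_vocab, feat_dim, padding_idx=0)
        self.dep_cnn = nn.Conv1d(feat_dim, hidden, kernel_size=3, padding=1)
        # Feature-fusion layer: concatenate all sentence-level vectors, then classify.
        fused = self.bert.config.hidden_size + 2 * hidden + 2 * hidden + hidden
        self.classifier = nn.Linear(fused, num_classes)

    def forward(self, char_ids, char_mask, word_ids, pos_ids, pol_ids, dep_ids):
        # [CLS] vector as the character-level sentence representation.
        char_vec = self.bert(input_ids=char_ids,
                             attention_mask=char_mask).last_hidden_state[:, 0]
        # Word + POS stream through the BiLSTM, mean-pooled over time.
        wp = torch.cat([self.word_emb(word_ids), self.pos_emb(pos_ids)], dim=-1)
        word_vec = self.word_lstm(wp)[0].mean(dim=1)
        # Polarity stream through its own BiLSTM, mean-pooled.
        pol_vec = self.pol_lstm(self.pol_emb(pol_ids))[0].mean(dim=1)
        # Dependency stream through the CNN (Conv1d expects [batch, channels, length]),
        # max-pooled over time.
        dep_vec = self.dep_cnn(self.dep_emb(dep_ids).transpose(1, 2)).amax(dim=-1)
        # Fuse by concatenation and classify.
        return self.classifier(torch.cat([char_vec, word_vec, pol_vec, dep_vec], dim=-1))


if __name__ == "__main__":
    # Dummy inputs only; the number of relation classes is arbitrary here.
    B, T = 2, 20
    model = FeatureFusionClassifier(num_classes=7)
    logits = model(
        char_ids=torch.randint(1, 21128, (B, T)),
        char_mask=torch.ones(B, T, dtype=torch.long),
        word_ids=torch.randint(1, 50000, (B, T)),
        pos_ids=torch.randint(1, 60, (B, T)),
        pol_ids=torch.randint(1, 5, (B, T)),
        dep_ids=torch.randint(1, 50, (B, T)),
    )
    print(logits.shape)  # torch.Size([2, 7])
```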

