
Knowledge triple extraction in cybersecurity with adversarial active learning (cited by: 6)
Abstract: Pipeline approaches to knowledge acquisition in the cybersecurity domain propagate entity recognition errors, ignore the dependency between the entity recognition and relation extraction tasks, and suffer from a shortage of labeled corpora for model training. To address these problems, an end-to-end cybersecurity knowledge triple extraction method with adversarial active learning was proposed. First, entity recognition and relation extraction were modeled as a single sequence labeling task through a joint tagging strategy. Then, a BiLSTM-LSTM model with a dynamic attention mechanism was designed to jointly extract entities and relations and form triples. Finally, within an adversarial learning framework, a discriminator was trained to incrementally select high-quality unlabeled samples for annotation, and the performance of the joint extraction model was continuously improved through iterative retraining. Experiments show that the proposed joint entity-relation extraction model outperforms existing cybersecurity knowledge extraction schemes and demonstrate the effectiveness of the adversarial active learning method.
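The abstract states that entity recognition and relation extraction are cast as one sequence labeling task via a joint tagging strategy. As a rough illustration only, the sketch below decodes such a jointly tagged sequence into triples. The tag format (`B`/`I` boundaries combined with a relation type and a head/tail role, e.g. `B-exploit-1`) and the nearest-pairing heuristic are assumptions modeled on common joint-tagging schemes, not the paper's exact design.

```python
# Sketch: decode a jointly tagged token sequence into (head, relation, tail)
# triples. Assumed tag format (relation names must not contain "-"):
#   "O"              -> token outside any entity
#   "B-<rel>-<role>" -> begin entity span; <role> is 1 (head) or 2 (tail)
#   "I-<rel>-<role>" -> continue the current entity span
from collections import defaultdict

def decode_triples(tokens, tags):
    """Group B/I spans into entities, then pair head (role 1) and
    tail (role 2) entities of the same relation type into triples."""
    entities = []  # list of (relation, role, entity_text)
    cur_words, cur_rel, cur_role = [], None, None

    def flush():
        if cur_words:
            entities.append((cur_rel, cur_role, " ".join(cur_words)))

    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            flush()
            _, rel, role = tag.split("-")
            cur_words, cur_rel, cur_role = [tok], rel, role
        elif tag.startswith("I-") and cur_words:
            cur_words.append(tok)
        else:  # "O" (or a stray tag) closes the current span
            flush()
            cur_words, cur_rel, cur_role = [], None, None
    flush()

    # Pair heads with tails of the same relation type, in order.
    heads, tails = defaultdict(list), defaultdict(list)
    for rel, role, text in entities:
        (heads if role == "1" else tails)[rel].append(text)
    return [(h, rel, t)
            for rel in heads
            for h, t in zip(heads[rel], tails[rel])]
```

For example, tagging "Attacker exploits CVE-2017-0144 in SMB" as `["B-exploit-1", "O", "B-exploit-2", "O", "O"]` decodes to the single triple `("Attacker", "exploit", "CVE-2017-0144")`. In the paper's setting the tags would come from the BiLSTM-LSTM model rather than being given by hand.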
Authors: LI Tao; GUO Yuanbo; JU Ankang (Department of Cryptogram Engineering, Information Engineering University, Zhengzhou 450001, China)
Source: Journal on Communications (《通信学报》, EI, CSCD, Peking University Core), 2020, No. 10, pp. 80-91 (12 pages)
Funding: National Natural Science Foundation of China (No. 61501515).
Keywords: knowledge triple; cybersecurity; joint extraction; adversarial network; active learning
