A DGA Domain Name Detection Method Based on Transformer and Multi-Feature Fusion (cited by: 4)
Abstract: To address the high concealment of malicious domain names generated by domain generation algorithms (DGAs) and the low accuracy of existing methods in detecting malicious domain names, a DGA domain name detection method based on Transformer and multi-feature fusion is proposed. The method uses a Transformer encoder to capture global information about domain name characters, and obtains long-distance contextual features at different granularities through parallel deep convolutional neural networks (DCNNs). At the same time, a bidirectional long short-term memory network (BiLSTM) and a self-attention mechanism are combined with a shallow CNN to obtain shallow spatio-temporal features. Finally, the long-distance contextual features and the shallow spatio-temporal features are fused for DGA domain name detection. Experimental results show that the proposed method performs better in malicious domain name detection: compared with CNN, LSTM, L-PCAL, and SW-DRN, it improves accuracy by 1.72%, 1.10%, 0.75%, and 0.34% in the binary classification experiment, and by 1.75%, 1.29%, 0.88%, and 0.83% in the multi-classification experiment.
Authors: YU Zi-cheng (余子丞) and LING Jie (凌捷), School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China
Source: Computer Engineering & Science (《计算机工程与科学》, CSCD, Peking University Core Journal), 2023, No. 8, pp. 1416-1423 (8 pages)
Funding: Key-Area Research and Development Program of Guangdong Province (2019B010139002); Key-Area Research and Development Program of Guangzhou (202007010004)
Keywords: domain generation algorithm (DGA); Transformer model; deep convolutional neural network (DCNN); bidirectional long short-term memory network (BiLSTM); self-attention mechanism
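
To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of such an architecture: a Transformer encoder over character embeddings, parallel multi-kernel CNN branches for multi-granularity long-distance context features, and a shallow CNN + BiLSTM + self-attention branch for shallow spatio-temporal features, with the two feature groups fused for classification. This is an illustrative reconstruction under stated assumptions, not the authors' released code; the class name DGADetector, layer sizes, kernel widths, and pooling choices are assumptions.

# Hypothetical sketch of the abstract's architecture (not the authors' code).
# All dimensions (embed_dim=64, kernel sizes 3/5/7, etc.) are illustrative.
import torch
import torch.nn as nn


class DGADetector(nn.Module):
    def __init__(self, vocab_size=40, embed_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)

        # Transformer encoder: global dependencies among domain-name characters.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=4, dim_feedforward=128, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=2)

        # Parallel deep CNN branches with different kernel sizes:
        # long-distance context features at several granularities.
        self.parallel_cnns = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(embed_dim, 64, kernel_size=k, padding=k // 2),
                nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=k, padding=k // 2),
                nn.ReLU(),
                nn.AdaptiveMaxPool1d(1),
            )
            for k in (3, 5, 7)
        ])

        # Shallow branch: one CNN layer, then BiLSTM, then self-attention,
        # yielding the "shallow spatio-temporal" features.
        self.shallow_cnn = nn.Sequential(
            nn.Conv1d(embed_dim, 64, kernel_size=3, padding=1), nn.ReLU())
        self.bilstm = nn.LSTM(64, 32, batch_first=True, bidirectional=True)
        self.self_attn = nn.MultiheadAttention(64, num_heads=4, batch_first=True)

        # Fusion of 3 * 64 (parallel CNN branches) + 64 (shallow branch) features.
        self.classifier = nn.Linear(3 * 64 + 64, num_classes)

    def forward(self, x):                      # x: (batch, seq_len) character ids
        e = self.embed(x)                      # (batch, seq_len, embed_dim)
        g = self.transformer(e)                # global character information
        gc = g.transpose(1, 2)                 # (batch, embed_dim, seq_len) for Conv1d

        # Multi-granularity long-distance context features.
        ctx = torch.cat([branch(gc).squeeze(-1) for branch in self.parallel_cnns], dim=1)

        # Shallow spatio-temporal features.
        s = self.shallow_cnn(e.transpose(1, 2)).transpose(1, 2)   # (batch, seq_len, 64)
        s, _ = self.bilstm(s)                                     # (batch, seq_len, 64)
        s, _ = self.self_attn(s, s, s)
        s = s.mean(dim=1)                                         # (batch, 64)

        return self.classifier(torch.cat([ctx, s], dim=1))        # class logits


if __name__ == "__main__":
    model = DGADetector()
    fake_batch = torch.randint(1, 40, (8, 64))   # 8 random "domain names" of length 64
    print(model(fake_batch).shape)               # torch.Size([8, 2])

For binary DGA detection the model would be trained with cross-entropy on the two logits; switching num_classes to the number of DGA families gives the multi-classification setting reported in the abstract.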
