
Novelty Detection Method Based on Knowledge Distillation and Efficient Channel Attention (cited by: 1)
Abstract: Knowledge distillation based novelty detection methods usually use a pre-trained network as the teacher network, and a network with the same model structure and size as the teacher as the student network. For test data, the difference between the outputs of the teacher and student networks is used to discriminate normal data from novel data. However, because the two networks share the same structure and size, such methods suffer from two problems. On the one hand, the difference produced on novel data may be too small; on the other hand, because the teacher network's pre-training data set is far larger in scale than the student network's training set, the student network may acquire a large amount of redundant information. To solve these problems, the efficient channel attention (ECA) module is introduced into the knowledge distillation based novelty detection method. Using ECA's cross-channel interaction strategy, a student network with a simpler structure and smaller size than the teacher network is designed. This design both effectively captures the features of normal data while removing redundant information, and enlarges the difference between the teacher and student networks, thereby improving novelty detection performance. Experimental results on 6 image data sets demonstrate that, in comparison with 5 related methods, the proposed method achieves better detection performance.
Authors: ZHOU Shijin (周士金); XING Hongjie (邢红杰) (Hebei Key Laboratory of Machine Learning and Computational Intelligence, College of Mathematics and Information Science, Hebei University, Baoding, Hebei 071002, China)
Source: Computer Science (《计算机科学》, CSCD, Peking University core journal), 2023, No. S02, pp. 577-586 (10 pages)
Funding: National Natural Science Foundation of China (61672205); Natural Science Foundation of Hebei Province (F2017201020); High-Level Talent Research Start-up Project of Hebei University (521100222002); Foundation of the Affiliated Hospital of Hebei University (2019Q003); Open Fund of the Engineering Research Center of Intelligent Computing for Complex Energy Systems, Ministry of Education (ESIC202101)
Keywords: Novelty detection; Knowledge distillation; Attention mechanism; Teacher network; Student network