期刊文献+

Traffic sign classification based on knowledge distillation with augmentation supervision (cited by: 4)
Abstract: To address the problem that neural networks make insufficient use of the hidden knowledge in their training data, a model optimization method based on knowledge distillation with augmented supervision is proposed. A new loss function is designed so that the output of an existing teacher network guides the training of the student network while the teacher network's errors are corrected. The proposed method mines hidden knowledge through retraining and can further improve the accuracy of traffic sign classification. Its effectiveness is verified on the GTSRB dataset: comparative experiments with the deep convolutional network ResNet-56 show that the network optimized by the augmented-supervision knowledge distillation method improves classification accuracy by 1.27% over the original network.
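The abstract describes the standard knowledge-distillation setup (a student trained against temperature-softened teacher outputs plus a hard-label term) extended with a correction for samples where the teacher is wrong. The exact loss in the paper is not given in the abstract, so the sketch below is only a minimal, hypothetical illustration: it uses the classic soft-target/hard-target combination and, as a stand-in for "correcting the teacher's errors", simply drops the teacher term when the teacher's top prediction disagrees with the ground truth. All names and the `alpha`/`T` values are assumptions, not the authors' method.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.7):
    """Hypothetical augmented-supervision KD loss for one sample:
    alpha * soft-target cross-entropy (teacher at temperature T)
    + (1 - alpha) * hard-label cross-entropy (ground truth),
    with the teacher term zeroed out when the teacher misclassifies
    this sample (a simple stand-in for the paper's error correction).
    """
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # Soft-target cross-entropy, scaled by T^2 as in standard distillation
    soft = -sum(t * math.log(s) for t, s in zip(p_t, p_s)) * T * T
    # Hard-label cross-entropy at T = 1
    q_s = softmax(student_logits, 1.0)
    hard = -math.log(q_s[true_label])
    # Do not trust the teacher on samples it gets wrong
    if max(range(len(p_t)), key=p_t.__getitem__) != true_label:
        alpha = 0.0
    return alpha * soft + (1.0 - alpha) * hard
```

When the teacher's prediction matches the label, both terms contribute; when it does not, the loss reduces to the ground-truth cross-entropy, so an erroneous teacher cannot pull the student toward the wrong class.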
Authors: ZHAO Shengwei; GE Shiming; YE Qiting; LUO Zhao; LI Qiang (Institute of Information Engineering, Chinese Academy of Sciences, Beijing 100095, China; School of Cyber Security, University of Chinese Academy of Sciences, Beijing 100019, China; School of Information Engineering, Southwest University of Science and Technology, Mianyang, Sichuan 621010, China)
Source: China Sciencepaper (《中国科技论文》), PKU Core Journal, 2017, No. 20, pp. 2355-2360 (6 pages)
Funding: National Key Research and Development Program of China (2016YFC0801005); National Natural Science Foundation of China (61402463); Open Fund of the Sichuan Provincial Key Laboratory of Robot Technology for Special Environments (16KFTK01)
Keywords: deep learning; knowledge distillation; model optimization; supervised learning; traffic sign classification