The rapid performance gains of convolutional neural networks come at the cost of ever-deeper layer stacks and exponentially growing parameter counts and storage requirements. This not only causes problems such as overfitting during training, but also hinders deployment on resource-constrained embedded devices. Model compression techniques have been proposed to address these issues; this work focuses on feature distillation. To address the problem that guiding the student network with the teacher network's feature maps does not effectively train the student's feature-fitting ability, a feature-distribution distillation algorithm is proposed. The algorithm uses the concept of conditional mutual information to construct a probability distribution over the model's feature space, and introduces maximum mean discrepancy (MMD) to design a loss function that minimizes the distance between the teacher's and student's feature distributions. On top of knowledge distillation, a Toeplitz matrix is used to share weights within the student network, further reducing the model's storage footprint. To verify the feature-fitting ability of the student network trained with the proposed algorithm, experiments were conducted on three image-processing tasks: image classification, object detection, and semantic segmentation. The results show that the proposed algorithm outperforms the compared methods on all three tasks and enables distillation across different network architectures.
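The MMD-based distribution-matching loss described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian kernel, flattened (batch, dim) feature batches, and the standard biased squared-MMD estimator; all function names and the bandwidth parameter are illustrative.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian kernel values between rows of x and y.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(f_teacher, f_student, sigma=1.0):
    # Biased estimate of squared MMD between two feature batches
    # of shape (batch, dim); zero when the two batches coincide.
    return (gaussian_kernel(f_teacher, f_teacher, sigma).mean()
            + gaussian_kernel(f_student, f_student, sigma).mean()
            - 2 * gaussian_kernel(f_teacher, f_student, sigma).mean())
```

In a distillation setup, `mmd2` would be evaluated on teacher and student features for the same mini-batch and added to the task loss, so minimizing it pulls the student's feature distribution toward the teacher's.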
In recent years, anomaly detection has attracted much attention in industrial production. Traditional anomaly detection methods usually rely on direct comparison of samples and therefore often ignore the intrinsic relationships between samples, resulting in poor accuracy in recognizing anomalous samples. To address this problem, a knowledge-distillation anomaly detection method based on feature reconstruction is proposed in this study. Knowledge distillation is performed after inverting the structure of the teacher-student network, so that the teacher and student networks avoid sharing the same inputs and similar structures. Representational power is improved by using feature splicing to unify features at different levels, and the merged features are processed and reconstructed using an improved Transformer. The experimental results show that the proposed method achieves better performance on the MVTec dataset, verifying its effectiveness and feasibility in anomaly detection tasks. This study provides a new idea for improving the accuracy and efficiency of anomaly detection.
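In distillation-based anomaly detection of this kind, an anomaly map is typically derived from the disagreement between teacher and student features at each spatial location: where the student fails to reconstruct the teacher's features, the input is likely anomalous. The sketch below illustrates that scoring step only, using per-location cosine distance; the (C, H, W) feature shapes and function names are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def anomaly_map(f_teacher, f_student, eps=1e-8):
    # Per-location anomaly score: 1 minus cosine similarity between
    # teacher and student feature vectors at each spatial position.
    # Inputs are feature maps of shape (C, H, W).
    t = f_teacher.reshape(f_teacher.shape[0], -1)
    s = f_student.reshape(f_student.shape[0], -1)
    num = (t * s).sum(axis=0)
    den = np.linalg.norm(t, axis=0) * np.linalg.norm(s, axis=0) + eps
    return (1.0 - num / den).reshape(f_teacher.shape[1:])
```

A scalar image-level score can then be taken as the maximum of the map, and a pixel-level anomaly mask obtained by thresholding it.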