Compression Algorithm of Face Recognition Model Based on Unlabeled Knowledge Distillation (Cited by: 5)
Abstract: Deploying face recognition on mobile devices typically requires acceleration techniques such as model compression. Knowledge distillation is a widely used and easily trained compression method, but existing distillation algorithms require large amounts of labeled face data, which raises security concerns such as identity privacy leakage. Moreover, collecting labeled face data at scale is costly, while the vast amount of unlabeled face data that can be collected or generated goes unused. To address these problems, this paper analyzes the characteristics of knowledge distillation in face recognition tasks and proposes an indirectly supervised training method based on unlabeled knowledge distillation. The method exploits massive unlabeled face data while avoiding security risks such as privacy leakage. However, the distribution of an unlabeled face dataset is unknown and may be imbalanced, which limits the performance of the indirect-supervision algorithm. The paper further proposes a face-content-replacement data augmentation method that balances the face data distribution by replacing part of each face's content, while also increasing the diversity of the face data. Experimental results show that when the face recognition model is heavily compressed, the proposed algorithm achieves state-of-the-art performance and surpasses large networks on the LFW dataset.
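The abstract gives no implementation details, so the following is a minimal PyTorch sketch of how label-free ("indirectly supervised") distillation combined with a content-replacement augmentation could look. The teacher and student networks, the swap_face_content helper, the patch size, and the cosine-embedding loss are all illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch: unlabeled knowledge distillation for face recognition.
# The student regresses the frozen teacher's embeddings, so no identity
# labels are needed. All names and hyperparameters here are assumptions.
import torch
import torch.nn.functional as F

def swap_face_content(batch: torch.Tensor, region: int = 56) -> torch.Tensor:
    """Hypothetical content-replacement augmentation: paste one random
    square region from a shuffled copy of the batch into each image,
    increasing data diversity without requiring identity labels."""
    b, _, h, w = batch.shape
    shuffled = batch[torch.randperm(b)]
    top = torch.randint(0, h - region + 1, (1,)).item()
    left = torch.randint(0, w - region + 1, (1,)).item()
    mixed = batch.clone()
    mixed[:, :, top:top + region, left:left + region] = \
        shuffled[:, :, top:top + region, left:left + region]
    return mixed

def distill_step(teacher, student, images, optimizer):
    """One indirect-supervision step: pull the student's face embedding
    toward the teacher's embedding on unlabeled images."""
    images = swap_face_content(images)
    with torch.no_grad():
        t_emb = F.normalize(teacher(images), dim=1)  # frozen teacher
    s_emb = F.normalize(student(images), dim=1)
    # Cosine-embedding style loss: 1 - cos(teacher, student), averaged.
    loss = (1.0 - (t_emb * s_emb).sum(dim=1)).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the teacher supplies the supervisory signal in embedding space, which is what makes the training indirect: no classification head or identity labels are involved, only the teacher's feature geometry.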
Authors: CHENG Xiang-ming; DENG Chun-hua (College of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430065, China; Institute of Big Data Science and Engineering, Wuhan University of Science and Technology, Wuhan 430065, China; Hubei Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan University of Science and Technology, Wuhan 430065, China)
Source: Computer Science (《计算机科学》), CSCD, Peking University Core Journal, 2022, Issue 6, pp. 245-253 (9 pages)
Funding: National Natural Science Foundation of China (61806150).
Keywords: Face recognition; Knowledge distillation; Model compression; Indirect supervision; Content replacement