
Deep Face Representation Algorithm Based on Dual Attention Mechanism (Cited by: 5)
Abstract  To address the problem that existing models rarely design face features explicitly and that the resulting face features are weakly discriminative, a deep face representation algorithm based on a dual attention mechanism is proposed. The algorithm adopts a dual-attention network structure: a detail attention mechanism designs the low-level features, learning hierarchical features automatically and adaptively and attending to local features, while a semantic attention mechanism designs the high-level features, performing adaptive semantic grouping and attending to semantic features. Experimental results on the LFW, YTF, MegaFace, IJB-B and IJB-C datasets show that the dual attention mechanism method reaches recognition accuracies of 99.87%, 97.9%, 98.91%, 95.02% and 96.28% respectively, which are 0.02%, 0.1%, 0.2%, 1% and 1% higher than those of the comparable algorithm GroupFace, demonstrating the advantages of the dual attention network.
Authors  SUN Jun; CAI Hua; ZHU Xinli; HU Hao; LI Yingchao (School of Electronic Information Engineering, Changchun University of Science and Technology, Changchun 130022, China; Changchun China Optics Science and Technology Museum, Changchun 130117, China; School of Opto-Electronic Engineering, Changchun University of Science and Technology, Changchun 130022, China)
Source  Journal of Jilin University (Science Edition), 2021, No. 4, pp. 883-890 (8 pages); indexed by CAS and the Peking University Core Journal list.
Funding  National Natural Science Foundation of China (Grant No. 11275046); Science and Technology Development Plan Project of Jilin Province (Grant No. 20170203005GX).
Keywords  machine vision; face recognition; feature representation; attention mechanism
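
The abstract only outlines the two attention stages (detail attention on low-level feature maps, semantic attention with adaptive grouping on the high-level embedding) and gives no implementation details, so the following PyTorch sketch is an illustrative interpretation rather than the authors' network: the module names DetailAttention, SemanticAttention and DualAttentionFace, the channel sizes, and the number of semantic groups are all assumptions.

```python
# Illustrative sketch only: the paper's actual architecture, losses and
# hyperparameters are not specified in the abstract; every name and size
# below is an assumption made for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DetailAttention(nn.Module):
    """Spatial attention over low-level feature maps (assumed reading of 'detail attention')."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = torch.sigmoid(self.conv(x))  # per-location weight in [0, 1]
        return x * attn                     # emphasize local detail responses


class SemanticAttention(nn.Module):
    """Soft grouping of the high-level embedding (assumed reading of 'adaptive semantic grouping')."""

    def __init__(self, feat_dim: int, num_groups: int = 4):
        super().__init__()
        self.group_fc = nn.ModuleList([nn.Linear(feat_dim, feat_dim) for _ in range(num_groups)])
        self.gate = nn.Linear(feat_dim, num_groups)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.gate(x), dim=-1)                     # (N, G) soft group assignment
        groups = torch.stack([fc(x) for fc in self.group_fc], dim=1)  # (N, G, D) group projections
        return (weights.unsqueeze(-1) * groups).sum(dim=1)            # (N, D) reweighted embedding


class DualAttentionFace(nn.Module):
    """Toy backbone wiring the two attention stages together."""

    def __init__(self, embed_dim: int = 512):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU())
        self.detail = DetailAttention(64)             # low level: attend to local features
        self.body = nn.Sequential(
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1)
        )
        self.fc = nn.Linear(128, embed_dim)
        self.semantic = SemanticAttention(embed_dim)  # high level: attend to semantic groups

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.detail(self.stem(x))
        x = self.body(x).flatten(1)
        return self.semantic(self.fc(x))


if __name__ == "__main__":
    model = DualAttentionFace()
    emb = model(torch.randn(2, 3, 112, 112))  # 112x112 crops are common in face recognition
    print(emb.shape)                          # torch.Size([2, 512])
```

In this reading, the detail stage reweights spatial locations of early convolutional features, while the semantic stage softly assigns the embedding to learned groups and recombines the group projections, which is one plausible way to realize "adaptive semantic grouping"; the paper's actual grouping strategy may differ.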