
Expression recognition algorithm introducing attention mechanism and center loss
Abstract: To address the problem of low recognition accuracy in the field of facial expression recognition, an expression recognition algorithm based on a convolutional neural network is proposed, which introduces a feature-channel attention mechanism and a center loss function. With VGG11 as the main network framework, the feature-channel attention mechanism is introduced by adding a squeeze-and-excitation (SE) block after each convolutional layer, so that the model learns the importance of different feature channels; this improves the model's sensitivity to feature channels and further enhances its feature representation ability. At the same time, the network is trained under the joint supervision of the center loss function and the Softmax loss function, which effectively increases the inter-class distance, ensures intra-class compactness, and improves the recognition accuracy of the model. Experiments on the CK+ dataset show that the improved model reaches a recognition accuracy of 97.07%, higher than other typical algorithms.
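To make the described method concrete, the following is a minimal PyTorch sketch (an illustrative assumption, not the authors' released code) of the two components the abstract names: an SE block of the kind inserted after each VGG11 convolutional layer, and a center loss combined with the Softmax (cross-entropy) loss for joint supervision. The class names, the reduction ratio of 16, and the loss weight lam are hypothetical choices.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SEBlock(nn.Module):
    # Squeeze-and-excitation: learns a weight per feature channel and re-scales the channels.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                                  # x: (N, C, H, W)
        s = F.adaptive_avg_pool2d(x, 1).flatten(1)         # squeeze: global average pooling -> (N, C)
        s = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))   # excitation: channel weights in (0, 1)
        return x * s.view(x.size(0), -1, 1, 1)             # re-weight each feature channel

class CenterLoss(nn.Module):
    # Center loss: pulls each deep feature toward its class center to keep classes compact.
    def __init__(self, num_classes, feat_dim):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features, labels):                   # features: (N, D), labels: (N,)
        return ((features - self.centers[labels]) ** 2).sum(dim=1).mean() / 2

def joint_loss(logits, features, labels, center_loss, lam=0.01):
    # Joint supervision: cross-entropy (Softmax loss) plus a weighted center loss term.
    return F.cross_entropy(logits, labels) + lam * center_loss(features, labels)

In this sketch the SE block would be appended after each convolutional layer of a VGG11 backbone, and joint_loss would replace the plain cross-entropy objective during training; lam trades off intra-class compactness against the classification term.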
Authors: ZHANG Xiang (张翔), SHI Zhicai (史志才), CHEN Liang (陈良) (College of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China)
Source: Transducer and Microsystem Technologies (CSCD), 2020, No. 11, pp. 148-151 (4 pages)
Funding: National Natural Science Foundation of China (61802252).
Keywords: convolutional neural network; feature channel attention mechanism; center loss; expression recognition