
Capture of microexpressions based on the entropy of oriented optical flow (基于光流方向信息熵统计的微表情捕捉)

Cited by: 2
Abstract: Based on the optical-flow method, this paper proposes an approach for capturing micro-expression key frames by means of entropy-of-oriented-optical-flow (EOF) statistics. First, an improved Horn-Schunck optical-flow method is used to extract micro-expression motion features between adjacent frames of the video stream. Next, threshold analysis filters out the optical-flow vectors with a large projected-velocity modulus. Then, image information entropy is used to accumulate statistics on the flow-direction angles, yielding a directional-entropy vector for the video sequence; analysis of this entropy vector enables key-frame capture. Finally, the method is evaluated on the SMIC micro-expression database (University of Oulu, Finland) and the CASME micro-expression database (Fu Xiaolan's group at the Institute of Psychology, Chinese Academy of Sciences). Compared with the traditional frame-difference method, experiments show that the proposed algorithm performs better: it captures the trend of micro-expression changes well and provides a basis for micro-expression recognition.
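The pipeline in the abstract — threshold the flow vectors by magnitude, histogram the surviving flow directions, take the Shannon entropy, and scan the per-frame entropy vector for the key frame — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the flow field is assumed to come from some Horn-Schunck implementation, and the threshold value, bin count, and max-entropy peak rule are assumptions for demonstration.

```python
import numpy as np

def flow_direction_entropy(u, v, mag_thresh=0.5, n_bins=16):
    """Entropy of oriented optical flow (EOF) for one frame pair.

    u, v: horizontal/vertical flow components (arrays of equal shape),
    e.g. from a Horn-Schunck solver. Vectors whose magnitude falls
    below `mag_thresh` are discarded, mirroring the paper's threshold
    filtering; the threshold and bin count here are illustrative.
    """
    u = np.asarray(u, dtype=float).ravel()
    v = np.asarray(v, dtype=float).ravel()
    mag = np.hypot(u, v)
    keep = mag > mag_thresh                  # threshold filtering step
    if not keep.any():
        return 0.0                           # no significant motion
    angles = np.arctan2(v[keep], u[keep])    # direction in (-pi, pi]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())    # Shannon entropy in bits

def key_frame_index(entropy_series):
    """Pick the frame pair with maximal directional entropy as the
    key-frame candidate (a simple peak rule; the paper analyzes the
    whole entropy vector of the sequence)."""
    return int(np.argmax(entropy_series))
```

Uniform motion (all vectors pointing one way) gives zero entropy, while chaotic facial motion spreads the angle histogram and drives the entropy toward log2(n_bins); the key frame is then read off the per-pair entropy series.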
Authors: LI Dan (李丹); XIE Lun (解仑); LU Ting (卢婷); HAN Jing (韩晶); HU Bo (胡波); WANG Zhi-liang (王志良); REN Fu-ji (任福继). School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China; School of Computer and Information, Hefei University of Technology, Hefei 230009, China.
Source: Chinese Journal of Engineering (工程科学学报), 2017, No. 11, pp. 1727-1734 (8 pages). Indexed in EI, CSCD, and the Peking University Core Journal list.
Funding: National Natural Science Foundation of China (61672093, 61432004); National Key R&D Program of China (2016YFB1001404).
Keywords: micro-expression; Horn-Schunck optical flow; entropy of oriented optical flow; frame difference; key-frame capture
