Abstract
In face recognition, the small sample size problem commonly arises, which renders feature extraction methods based on Fisher linear discriminant analysis ill-posed and singular. In recent years, various solutions to this problem have been proposed; among them, the Discriminative Common Vector (DCV) method successfully overcomes the drawbacks of existing approaches and offers good numerical stability and low computational complexity. This paper extends the DCV method to the nonlinear case: the two Gram-Schmidt orthogonalization procedures are replaced by computing two kernel matrices and performing a single Cholesky decomposition, and the resulting nonlinear Fisher discriminant vectors are orthonormal. Experiments verify that the recognition performance of the proposed KDCV method is superior to that of the DCV method.
Face recognition tasks always encounter the Small Sample Size (SSS) problem, which leads to an ill-posed problem in Fisher Linear Discriminant Analysis (FLDA). The Discriminative Common Vector (DCV) method successfully overcomes this problem for FLDA. In this paper, DCV is extended to the nonlinear case by performing the Gram-Schmidt orthogonalization twice in feature space, which involves computing two kernel matrices and performing a Cholesky decomposition of a kernel matrix. The experimental results demonstrate that the proposed KDCV achieves better performance than the DCV method.
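The following is a minimal NumPy sketch, for illustration only and not the paper's exact algorithm, of the computational idea mentioned above: Gram-Schmidt orthonormalization of implicit feature-space vectors can be carried out without ever forming those vectors, by Cholesky-decomposing the kernel matrix K = Phi^T Phi; if K = L L^T, then the columns of Phi L^{-T} are orthonormal in feature space. The RBF kernel, its width gamma, and the toy data are assumptions made here purely for demonstration.

# Minimal sketch (assumed RBF kernel and toy data): Gram-Schmidt in feature
# space replaced by a Cholesky decomposition of the kernel matrix.
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))        # toy training samples (one per row)

K = rbf_kernel(X, X)                # kernel matrix K = Phi^T Phi
K += 1e-10 * np.eye(len(K))         # small jitter for numerical stability
L = np.linalg.cholesky(K)           # K = L L^T

# Coefficient matrix A = L^{-T}: the feature-space vectors Phi @ A are
# orthonormal, so this plays the role of Gram-Schmidt without forming Phi.
A = np.linalg.inv(L).T

# Check: the Gram matrix of the orthonormalized system is the identity,
# i.e., A^T K A = I.
print(np.allclose(A.T @ K @ A, np.eye(len(K)), atol=1e-6))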
Source
Journal of Electronics & Information Technology (《电子与信息学报》)
Indexed in: EI, CSCD, Peking University Core Journals (北大核心)
2006, No. 12, pp. 2296-2300 (5 pages)
Funding
Supported by the Scientific Research Foundation of Nanjing University of Information Science and Technology
Keywords
Face recognition
Discriminative Common Vectors (DCV)
Kernel method
Small Sample Size (SSS) problem
Fisher Linear Discriminant Analysis (FLDA)