
Robust Discriminant Analysis Based on Kernel-Induced Distance Measure
Abstract: This paper proposes robust discriminant analysis based on a kernel-induced distance measure (KI-RDA). KI-RDA naturally generalizes not only linear discriminant analysis (LDA) but also the recently proposed, powerful robust discriminant analysis based on nonparametric maximum entropy (MaxEnt-RDA). By adopting robust radial basis function (RBF) kernels, KI-RDA handles data contaminated by noise effectively and is also well suited to nonlinear data with non-Gaussian distributions. Its intrinsic robustness stems from replacing the Euclidean distance used in LDA with a kernel-induced non-Euclidean distance when characterizing the between-class and within-class scatter. With these scatter measures, an LDA-like discriminant criterion is defined for feature extraction, which leads to a nonlinear optimization problem. With the further help of an approximation strategy, the problem is converted into a directly solvable generalized eigenvalue problem, yielding a closed-form solution for the dimensionality-reduction transformation (matrix). Finally, experiments on multiple datasets verify the effectiveness of KI-RDA. Owing to the diversity of available kernels, KI-RDA in fact constitutes a general discriminant analysis framework.
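The pipeline the abstract describes — kernel-induced distances defining robust scatter measures, then a generalized eigenvalue problem yielding a closed-form projection — can be sketched as follows. This is an illustrative simplification under stated assumptions, not the paper's exact derivation: the function names (`rbf_weights`, `kernel_weighted_scatter`, `fit_projection`), the choice of weighting each pairwise scatter term by the RBF kernel value, and the regularization constant are all assumptions introduced here. For an RBF kernel the induced squared distance is d²(x, y) = k(x, x) + k(y, y) − 2k(x, y) = 2 − 2k(x, y), so down-weighting distant (likely noisy) pairs by k(x, y) captures the robustness mechanism in spirit.

```python
import numpy as np
from scipy.linalg import eigh


def rbf_weights(X, sigma):
    """Pairwise RBF kernel values k(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * sigma ** 2))


def kernel_weighted_scatter(X, y, sigma=1.0):
    """Within- and between-class scatter with pairwise terms weighted by k(x_i, x_j).

    Outlying pairs lie far apart in input space, receive near-zero kernel
    weights, and so contribute little to either scatter matrix -- a toy
    stand-in for the robustness the abstract attributes to the
    kernel-induced (non-Euclidean) distance.
    """
    n, d = X.shape
    K = rbf_weights(X, sigma)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for i in range(n):
        for j in range(n):
            diff = (X[i] - X[j])[:, None]
            contrib = K[i, j] * (diff @ diff.T)
            if y[i] == y[j]:
                Sw += contrib
            else:
                Sb += contrib
    return Sw / n ** 2, Sb / n ** 2


def fit_projection(X, y, n_components=1, sigma=1.0, reg=1e-6):
    """Closed-form projection from the generalized eigenproblem Sb w = lambda Sw w."""
    Sw, Sb = kernel_weighted_scatter(X, y, sigma)
    d = X.shape[1]
    # eigh solves the symmetric generalized eigenproblem directly;
    # a small ridge keeps Sw positive definite.
    vals, vecs = eigh(Sb, Sw + reg * np.eye(d))
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:n_components]]
```

For two well-separated classes along one axis, the leading generalized eigenvector recovers that discriminative direction, mirroring the LDA-like criterion the paper optimizes.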
Source: Journal of Frontiers of Computer Science and Technology (CSCD), 2012, No. 9, pp. 788-796 (9 pages).
Funding: National Natural Science Foundation of China under Grant No. 61170151.
Keywords: dimensionality reduction; discriminant analysis; kernel-induced distance; robustness