
FAST FEATURE RANKING AND ITS APPLICATION TO FACE RECOGNITION (Cited by: 1)

Abstract: A fast feature ranking algorithm for classification in the presence of high dimensionality and small sample size is proposed. The basic idea is that the important features force the data points of the same class to maintain their intrinsic neighbor relations, whereas neighboring points of different classes no longer stick to one another. Applying this assumption, an optimization problem weighting each feature is derived. The algorithm does not involve dense matrix eigen-decomposition, which can be computationally expensive. Extensive experiments are conducted to validate the significance of the selected features using the Yale, Extended YaleB and PIE datasets. The thorough evaluation shows that, using a one-nearest-neighbor classifier, the recognition rates with the 100-500 leading features selected by the algorithm distinctly outperform those with features selected by the baseline feature selection algorithms, while with a support vector machine the features selected by the algorithm show less prominent improvement. Moreover, the experiments demonstrate that the proposed algorithm is particularly efficient for multi-class face recognition problems.
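The abstract's key idea can be illustrated with a minimal sketch: weight each feature by how small it keeps the gaps to same-class neighbors and how large it keeps the gaps to different-class neighbors, then rank features by that weight. This is an illustrative Laplacian-style heuristic under assumed names (`rank_features`, the synthetic data, the scoring rule), not the paper's exact optimization problem.

```python
import numpy as np

def rank_features(X, y, k=3):
    """Rank features so that those preserving same-class neighborhoods
    while separating different-class neighbors come first.
    A simple illustrative heuristic, not the paper's formulation."""
    n, d = X.shape
    # Pairwise Euclidean distances in the full feature space.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)  # exclude each point from its own neighbors
    scores = np.zeros(d)
    for i in range(n):
        nbrs = np.argsort(D[i])[:k]          # k nearest neighbors of point i
        same = nbrs[y[nbrs] == y[i]]
        diff = nbrs[y[nbrs] != y[i]]
        # Penalize per-feature gaps to same-class neighbors,
        # reward per-feature gaps to different-class neighbors.
        if len(same):
            scores -= ((X[same] - X[i]) ** 2).sum(axis=0) / len(same)
        if len(diff):
            scores += ((X[diff] - X[i]) ** 2).sum(axis=0) / len(diff)
    return np.argsort(-scores)  # best-scoring features first

# Demo on synthetic data: feature 0 carries the class signal, the rest are noise.
rng = np.random.default_rng(0)
n = 40
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, 10))
X[:, 0] = np.where(y == 0, 0.0, 5.0) + 0.1 * rng.normal(size=n)
order = rank_features(X, y)
```

On this toy data the informative feature should be ranked first, since its gaps to same-class neighbors are tiny while the noise features contribute large same-class gaps. No eigen-decomposition is involved, which mirrors the efficiency claim in the abstract.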
Source: Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2013, No. 4, pp. 389-396 (8 pages).
Funding: Supported by the National Natural Science Foundation of China (71001072) and the Natural Science Foundation of Guangdong Province (9451806001002294).
Keywords: feature selection; feature ranking; manifold learning; Laplacian matrix
