
An improved sparse principal component analysis algorithm for unsupervised feature selection (Cited by: 2)
Abstract: To reduce the computational complexity of the sparse principal component analysis (SPCA) algorithm on high-dimensional data sets, an improved SPCA (ISPCA) algorithm is proposed. ISPCA divides feature selection into two stages. In the first stage, SPCA without the low-rank penalty term performs an initial feature selection to obtain a dimension-reduced data set, and the generalized inverse lemma of matrices is applied to lower the computational cost. In the second stage, SPCA with the low-rank penalty term is run on the dimension-reduced data to perform a second round of feature selection. Comparative experiments show that ISPCA is less sensitive to its parameters than SPCA, achieves better feature selection performance, and runs faster.
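The "generalized inverse lemma of matrices" mentioned above presumably refers to the matrix inversion (Woodbury) lemma; the abstract does not spell out the formula, so the following statement is an assumption about terminology. For a data matrix X with n samples and p features, n << p, the lemma turns a p x p inversion into an n x n one, reducing the cubic cost from O(p^3) to O(n^3) plus matrix products, which is the kind of saving the first stage relies on:

    (A + UCV)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}

    (\lambda I_p + X^{\top} X)^{-1} = \lambda^{-1} \big( I_p - X^{\top} (\lambda I_n + X X^{\top})^{-1} X \big)

The sketch below illustrates the two-stage selection pipeline described in the abstract. It uses scikit-learn's SparsePCA as a stand-in for the paper's row-sparse SPCA with and without the low-rank penalty term, and scores features by the norms of their loading columns; the function names, component counts, penalty weights, and feature budgets are illustrative assumptions, not the authors' implementation.

    # Hedged sketch, not the authors' code: two-stage SPCA-style
    # unsupervised feature selection in the spirit of ISPCA, with
    # scikit-learn's SparsePCA standing in for the paper's row-sparse
    # SPCA with/without the low-rank penalty term.
    import numpy as np
    from sklearn.decomposition import SparsePCA

    def select_by_loading_norm(X, n_components, n_keep, alpha):
        # Fit sparse PCA and keep the n_keep features whose loading
        # columns have the largest L2 norm (a row-sparsity-style score).
        spca = SparsePCA(n_components=n_components, alpha=alpha, random_state=0)
        spca.fit(X)
        W = spca.components_                 # shape (n_components, n_features)
        scores = np.linalg.norm(W, axis=0)   # one score per feature
        return np.sort(np.argsort(scores)[::-1][:n_keep])

    def ispca_two_stage(X, n_components=5, keep1=200, keep2=50,
                        alpha1=0.5, alpha2=1.0):
        # Stage 1: coarse selection on the full high-dimensional data
        # (the stage the paper accelerates with the inverse lemma).
        idx1 = select_by_loading_norm(X, n_components, keep1, alpha1)
        # Stage 2: refined selection on the dimension-reduced data.
        idx2 = select_by_loading_norm(X[:, idx1], n_components, keep2, alpha2)
        return idx1[idx2]                    # indices in the original feature space

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.standard_normal((100, 1000))   # n = 100 samples, p = 1000 features
        print(ispca_two_stage(X)[:10])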
Authors: FAN Jiulun (范九伦), LI Weihao (李维昊), LUO Xurui (罗绪瑞), ZHI Xiaobin (支晓斌) (School of Communications and Information Engineering, Xi'an University of Posts and Telecommunications, Xi'an 710121, China)
Source: Journal of Xi'an University of Posts and Telecommunications (《西安邮电大学学报》), 2022, No. 5, pp. 43-48 (6 pages)
Funding: National Natural Science Foundation of China (62071378, 62071379, 62071380, 61901365); Natural Science Foundation of Shaanxi Province (2020JM-580, 2021JM-461); New Star Team Project of Xi'an University of Posts and Telecommunications (xyt2016-01).
Keywords: principal component analysis; unsupervised feature selection; row sparsification; two-stage feature selection; generalized inverse lemma of matrices