
Iterative Tighter Nonparallel Hyperplane Support Vector Clustering with Simultaneous Feature Selection

Cited by: 7
Abstract: This paper proposes a new clustering algorithm with simultaneous feature selection, called Iterative Tighter Nonparallel Hyperplane Support Vector Clustering with Simultaneous Feature Selection (IT-NHSVC-SFS). Clustering is performed by an iterative (alternating) optimization algorithm in a learning model with two nonparallel hyperplanes, and two types of regularizers are introduced: the Euclidean norm and the infinity norm. The Euclidean norm improves the generalization ability of the clustering model, while the infinity norm in effect performs implicit, synchronized feature extraction for the two nonparallel hyperplanes, reducing clustering noise from irrelevant features and preserving clustering accuracy. A set of bounding variables is introduced to avoid the maximization operation of the infinity norm, converting the non-convex optimization problem into a convex quadratic optimization problem. Because the new model embodies the "maximum margin" principle, it also generalizes well. To make synchronized feature selection over the two nonparallel hyperplanes convenient, the Nonparallel Hyperplane SVM (NHSVM) is adopted as the base model of IT-NHSVC-SFS; unlike TWSVM and its variants, only one quadratic programming (QP) problem needs to be solved to obtain both optimal hyperplanes simultaneously. The new algorithm also adds two sets of equality constraints to the constraint set of the original NHSVM model, which removes the need to invert two large matrices in the original model and reduces computational complexity. In addition, the IT-NHSVC-SFS model replaces the original hinge loss function of NHSVM with the Laplacian loss function, avoiding premature convergence. Numerical experiments on a set of benchmark datasets show that the IT-NHSVC-SFS algorithm achieves better clustering accuracy than other existing clustering algorithms.
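As background for the bounding-variable step mentioned in the abstract, the sketch below shows the standard way an infinity-norm regularizer is turned into linear constraints; the objective f(w,b), the regularization weight λ, and the bound t are generic placeholders, not the paper's exact formulation.

```latex
% Standard bounding-variable linearization of an l_infinity regularizer.
% Generic sketch: f(w,b) stands for a (convex quadratic) data-fitting term;
% the paper's actual objective and constraint set differ in detail.
\begin{align}
  &\min_{w,b}\; f(w,b) + \lambda\,\lVert w\rVert_{\infty},
    \qquad \lVert w\rVert_{\infty} = \max_{1\le j\le n}\lvert w_j\rvert \\
  \Longleftrightarrow\quad
  &\min_{w,b,\,t}\; f(w,b) + \lambda\, t
    \qquad \text{s.t.}\ -t \le w_j \le t,\ \ j = 1,\dots,n .
\end{align}
```

When f is a convex quadratic, the reformulated problem is an ordinary QP with linear constraints, which is why the max operation of the infinity norm no longer makes the model non-convex.

The abstract also describes clustering via iterative (alternating) optimization over two nonparallel hyperplanes. The Python sketch below illustrates that general idea with a classical k-plane-style update (least-squares plane fits and nearest-plane reassignment); it is only an illustration under these simplifying assumptions and is not the paper's IT-NHSVC-SFS algorithm, which instead solves a single QP with infinity-norm regularization and a Laplacian loss.

```python
# Illustrative two-plane alternating clustering (k-plane style), NOT the
# paper's IT-NHSVC-SFS optimization.  Each iteration fits one hyperplane per
# cluster and reassigns every point to the nearer hyperplane.
import numpy as np

def fit_plane(X):
    """Fit w^T x + b ~ 0 by minimizing ||Xw + b||^2 subject to ||(w, b)|| = 1
    (smallest right singular vector of the augmented data matrix)."""
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    _, _, vt = np.linalg.svd(A, full_matrices=False)
    u = vt[-1]
    return u[:-1], u[-1]          # (w, b)

def plane_distance(X, w, b):
    """Point-to-plane distances |w^T x + b| / ||w||."""
    return np.abs(X @ w + b) / (np.linalg.norm(w) + 1e-12)

def two_plane_clustering(X, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=X.shape[0])   # random initial assignment
    planes = []
    for _ in range(n_iter):
        planes = []
        for c in (0, 1):
            Xc = X[labels == c]
            if Xc.shape[0] < 2:                    # guard against an empty cluster
                Xc = X
            planes.append(fit_plane(Xc))
        dists = np.column_stack([plane_distance(X, w, b) for w, b in planes])
        new_labels = dists.argmin(axis=1)          # reassign to the nearer plane
        if np.array_equal(new_labels, labels):     # converged
            break
        labels = new_labels
    return labels, planes

if __name__ == "__main__":
    # Toy data: two noisy line-shaped clusters.
    t = np.linspace(-1.0, 1.0, 100)[:, None]
    X = np.vstack([np.hstack([t, 2.0 * t]), np.hstack([t, -t + 1.0])])
    X += 0.05 * np.random.default_rng(1).normal(size=X.shape)
    labels, _ = two_plane_clustering(X)
    print("cluster sizes:", np.bincount(labels))
```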
Authors: FANG Jia-yan (方佳艳), LIU Qiao (刘峤) (School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan 611731, China; CETC Big Data Research Institute Co., Ltd., Guiyang, Guizhou 550022, China)
Source: Acta Electronica Sinica (《电子学报》), 2020, No. 1, pp. 44-58 (15 pages). Indexed in EI, CAS, CSCD, and the Peking University Core Journals list.
Funding: National Natural Science Foundation of China (No. 61772117); "13th Five-Year Plan" Equipment Pre-Research Field Fund (No. 6140312010203); Frontier Exploration Project of the Science and Technology Commission of the Central Military Commission (No. 1816321TS00105301); Sichuan Science and Technology Service Industry Demonstration Project (No. 2018GFW0150); Open Project of the 54th Research Institute of China Electronics Technology Group Corporation (No. 185686).
Keywords: clustering; feature selection; nonparallel hyperplane support vector machine; L-infinity norm
