Journal Articles
2 articles found
1. Sparse Proximal Support Vector Machine with a Specialized Interior-Point Method (Cited by: 2)
Authors: Yan-Qin Bai, Zhao-Ying Zhu, Wen-Li Yan. Journal of the Operations Research Society of China (EI, CSCD), 2015, No. 1, pp. 1-15 (15 pages).
Abstract: The support vector machine (SVM) is a widely used method for classification. The proximal support vector machine (PSVM) is an extension of SVM and a promising method that leads to a fast and simple algorithm for generating a classifier. Motivated by the low computational cost of PSVM and the sparsity of solutions induced by the ℓ1-norm, in this paper we first propose a PSVM with a cardinality constraint, which is then relaxed by the ℓ1-norm and leads to a trade-off ℓ1-ℓ2 regularized sparse PSVM. Next we convert this ℓ1-ℓ2 regularized sparse PSVM into an equivalent ℓ1-regularized least-squares (LS) problem and solve it by the specialized interior-point method proposed by Kim et al. (J Sel Top Signal Process, 2007). Finally, the ℓ1-ℓ2 regularized sparse PSVM is illustrated on a real-world dataset taken from the University of California, Irvine Machine Learning Repository (UCI Repository). Moreover, we compare the numerical results with existing models such as the generalized eigenvalue proximal SVM (GEPSVM), PSVM, and SVM-Light. The numerical results show that the ℓ1-ℓ2 regularized sparse PSVM not only achieves better classification accuracy than GEPSVM, PSVM, and SVM-Light, but also yields a sparser classifier than the ℓ1-PSVM.
Keywords: proximal support vector machine; classification accuracy; interior-point methods; preconditioned conjugate gradient algorithm
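The listing gives only the abstract, so the paper's exact model is not reproduced here. The following is a hedged notational sketch of what the abstract describes, assuming the standard PSVM of Fung and Mangasarian with data matrix A, diagonal label matrix D, all-ones vector e, weight vector w, and intercept gamma; the trade-off parameters nu and lambda are illustrative and not taken from the paper.

% Assumed form of the ℓ1-ℓ2 regularized sparse PSVM: ℓ2 terms for classification
% performance, ℓ1 term relaxing the cardinality constraint on w to promote sparsity
\min_{w,\gamma}\ \frac{\nu}{2}\,\bigl\| e - D(Aw - e\gamma) \bigr\|_2^2
  + \frac{1}{2}\bigl(\|w\|_2^2 + \gamma^2\bigr)
  + \lambda\,\|w\|_1

% The smooth part is quadratic in x = (w;\gamma), so the problem can be recast as an
% ℓ1-regularized least-squares problem of the type handled by the specialized
% interior-point method of Kim et al. (2007), with M and b collecting the data and
% the ℓ2 terms:
\min_{x}\ \|Mx - b\|_2^2 + \lambda\,\|x\|_1

The Kim et al. interior-point method solves such ℓ1-regularized LS problems using truncated Newton steps with preconditioned conjugate gradients, which is consistent with the keywords listed above.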
2. Consensus Proximal Support Vector Machine for Classification Problems with Sparse Solutions (Cited by: 1)
Authors: Yan-Qin Bai, Yan-Jun Shen, Kai-Ji Shen. Journal of the Operations Research Society of China (EI), 2014, No. 1, pp. 57-74 (18 pages).
Abstract: The classification problem is a central problem in machine learning. Support vector machines (SVMs) are supervised learning models with associated learning algorithms that are used for classification. In this paper, we establish two consensus proximal support vector machine (PSVM) models based on methods for binary classification. The first one separates the objective function into individual convex functions, one for each sample point of the training set. The constraints contain two types of equations, with global variables and local variables corresponding to the consensus points and the sample points, respectively. To obtain sparser solutions, the second model is an ℓ1-ℓ2 consensus PSVM whose objective function contains an ℓ1-norm term and an ℓ2-norm term; the ℓ2-norm term is responsible for good classification performance, while the ℓ1-norm term plays an important role in finding sparse solutions. Both consensus PSVMs are solved by the alternating direction method of multipliers (ADMM). Furthermore, they are evaluated on real-world data taken from the University of California, Irvine Machine Learning Repository (UCI Repository) and compared with existing models such as ℓ1-PSVM, ℓp-PSVM, GEPSVM, PSVM, and SVM-Light. Numerical results show that our models outperform the others in both classification accuracy and sparsity of solutions.
Keywords: classification problems; support vector machine; proximal support vector machine; consensus; alternating direction method of multipliers
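Similarly hedged for this second entry: the exact per-sample objective and ADMM updates are not given in the listing, so the short Python sketch below only illustrates the global-consensus ADMM pattern the abstract describes, with each sample keeping a local copy x_i of the classifier parameters, a global variable z enforcing consensus, and an ℓ1 term on z turning the z-update into soft-thresholding. The squared per-sample loss, the function names, and the parameters lam and rho are illustrative assumptions, not the authors' formulation.

import numpy as np

def soft_threshold(v, kappa):
    # Elementwise soft-thresholding: proximal operator of kappa * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def consensus_admm_l1(A, d, lam=1.0, rho=1.0, n_iter=200):
    # Toy global-consensus ADMM (illustrative, not the paper's exact algorithm).
    # Local loss for sample i: 0.5 * (1 - d_i * (a_i @ w - gamma))^2, an l2-type
    # proximal surrogate; an l1 penalty on the global variable z promotes sparsity.
    m, n = A.shape
    B = np.hstack([A, -np.ones((m, 1))]) * d[:, None]   # row i: d_i * (a_i, -1)
    X = np.zeros((m, n + 1))   # local copies x_i = (w_i, gamma_i)
    U = np.zeros((m, n + 1))   # scaled dual variables
    z = np.zeros(n + 1)        # global consensus variable

    for _ in range(n_iter):
        # x_i-update: argmin 0.5*(1 - b_i @ x)^2 + (rho/2)*||x - z + u_i||^2,
        # solved in closed form via Sherman-Morrison (Hessian = rho*I + b_i b_i^T).
        for i in range(m):
            v = B[i] + rho * (z - U[i])
            X[i] = v / rho - B[i] * (B[i] @ v) / (rho * (rho + B[i] @ B[i]))
        # z-update: soft-threshold the average of (x_i + u_i), the l1 prox step.
        z = soft_threshold((X + U).mean(axis=0), lam / (rho * m))
        # Dual ascent on the consensus constraints x_i = z.
        U += X - z
    return z[:-1], z[-1]   # (w, gamma)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 10))
    w_true = np.array([2.0, -1.5] + [0.0] * 8)
    d = np.sign(A @ w_true + 0.1 * rng.normal(size=200))
    w, gamma = consensus_admm_l1(A, d, lam=5.0)
    print("nonzeros in w:", int(np.count_nonzero(np.abs(w) > 1e-3)))
    print("training accuracy:", float(np.mean(np.sign(A @ w - gamma) == d)))

The averaging-then-soft-thresholding z-update is the standard consensus-ADMM step for an ℓ1-regularized global variable; the authors' actual local subproblems for the PSVM constraints may differ from the surrogate used here.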