Funding: This work is supported by the National Natural Science Foundation of China (Grant No. 11371242) and the "085 Project" of Shanghai University.
Abstract: The classification problem is central to machine learning. Support vector machines (SVMs) are supervised learning models, with associated learning algorithms, that are used for classification. In this paper, we establish two consensus proximal support vector machine (PSVM) models for binary classification. The first separates the objective function into individual convex functions, one for each sample point of the training set; the constraints contain two types of equations, with global variables and local variables corresponding to the consensus points and sample points, respectively. To obtain sparser solutions, the second model is an l1-l2 consensus PSVM, in which the objective function contains an l2-norm term, responsible for good classification performance, and an l1-norm term, which plays an important role in finding sparse solutions. Both consensus PSVMs are solved by the alternating direction method of multipliers (ADMM). Furthermore, they are tested on real-world data taken from the University of California, Irvine Machine Learning Repository (UCI Repository) and compared with existing models such as l1-PSVM, lp-PSVM, GEPSVM, PSVM, and SVM-Light. Numerical results show that our models outperform the others in classification accuracy and sparsity of solutions.
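The consensus splitting described above, with per-sample local variables tied to a global consensus point, follows the standard consensus ADMM pattern. The sketch below is an illustrative assumption of that pattern for a proximal SVM with a squared-error loss, not the paper's exact formulation; the function names, the single-regularizer global update, and the choice of penalty parameters are all hypothetical.

```python
import numpy as np

def consensus_psvm_admm(A, y, C=1.0, rho=1.0, n_iter=100):
    """Consensus ADMM sketch for a proximal SVM (squared loss).

    Each sample i keeps a local copy w_i of the hyperplane; a global
    consensus variable z ties them together, as in the per-sample
    splitting the abstract describes.  Illustrative only.
    """
    N, d = A.shape
    X = np.hstack([A, np.ones((N, 1))])   # append a bias coordinate
    W = np.zeros((N, d + 1))              # local variables w_i
    U = np.zeros((N, d + 1))              # scaled dual variables u_i
    z = np.zeros(d + 1)                   # global consensus point
    for _ in range(n_iter):
        # Local updates: each objective is
        #   C/2 (1 - y_i x_i^T w)^2 + rho/2 ||w - (z - u_i)||^2,
        # a rank-one least-squares problem with a closed form
        # (Sherman-Morrison).
        for i in range(N):
            v = z - U[i]
            xi = y[i] * X[i]
            W[i] = v + (C * (1.0 - xi @ v) / (rho + C * (xi @ xi))) * xi
        # Global update: average of (w_i + u_i), shrunk by the
        # 1/2 ||z||^2 regularizer on the consensus variable.
        m = (W + U).mean(axis=0)
        z = (rho * N / (1.0 + rho * N)) * m
        # Dual updates.
        U += W - z
    return z

def predict(z, A):
    X = np.hstack([A, np.ones((len(A), 1))])
    return np.sign(X @ z)
```

On a linearly separable toy set, the consensus point converges to a hyperplane that classifies all training points correctly; the l1-l2 variant in the paper would additionally soft-threshold in the global update to promote sparsity.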
Funding: This work is supported by the National Natural Science Foundation of China (No. 11371242).
Abstract: Logistic regression has proved to be a promising method for machine learning, focusing on the problem of classification. In this paper, we present an l1-l2-regularized logistic regression model, where the l1-norm is responsible for yielding a sparse logistic regression classifier and the l2-norm for keeping better classification accuracy. To solve the l1-l2-regularized logistic regression model, we develop an alternating direction method of multipliers with an embedded limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method. Furthermore, we implement our model on binary classification problems using real data examples selected from the University of California, Irvine Machine Learning Repository (UCI Repository). We compare our numerical results with those obtained by the well-known LIBSVM and SVM-Light software. The numerical results show that our l1-l2-regularized logistic regression model achieves better classification and less CPU time.
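The combination described above, ADMM with an embedded L-BFGS solver, can be sketched as follows: split the variable as w = z, minimize the smooth part (logistic loss plus the l2 term and the quadratic penalty) with L-BFGS, and handle the l1 term by closed-form soft thresholding. This is a minimal sketch under assumed hyperparameter names (`lam1`, `lam2`, `rho`), not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_l2_logreg_admm(X, y, lam1=0.1, lam2=0.1, rho=1.0, n_iter=50):
    """ADMM sketch for l1-l2-regularized logistic regression.

    The smooth subproblem is solved with L-BFGS (scipy's L-BFGS-B);
    the l1 subproblem is soft thresholding.  Illustrative only.
    """
    n, d = X.shape
    w, z, u = np.zeros(d), np.zeros(d), np.zeros(d)
    Xy = X * y[:, None]                   # rows y_i * x_i, y_i in {-1, +1}
    for _ in range(n_iter):
        def smooth(w):
            # Logistic loss sum_i log(1 + exp(-y_i x_i^T w)), computed
            # stably, plus the l2 term and the scaled ADMM penalty.
            m = Xy @ w
            loss = np.logaddexp(0.0, -m).sum()
            grad = -Xy.T @ (1.0 / (1.0 + np.exp(m)))
            loss += 0.5 * lam2 * w @ w + 0.5 * rho * np.sum((w - z + u) ** 2)
            grad += lam2 * w + rho * (w - z + u)
            return loss, grad
        w = minimize(smooth, w, jac=True, method="L-BFGS-B").x
        z = soft_threshold(w + u, lam1 / rho)   # l1 proximal step
        u += w - z                              # dual update
    return z
```

Returning z rather than w gives the sparse iterate: the soft-thresholding step zeros out coordinates whose magnitude falls below lam1/rho, which is how the l1 term yields a sparse classifier.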