Journal Articles (5 results)
1. Analysis of loss functions in support vector machines
Authors: Huajun WANG, Naihua XIU. Frontiers of Mathematics in China (CSCD), 2023, No. 6, pp. 381-414 (34 pages)
Support vector machines (SVMs) are an important class of machine learning methods arising from the interaction of statistical theory and optimization, and have been extensively applied to text categorization, disease diagnosis, face detection, and so on. The loss function is the core research topic of the SVM, and its variational properties play an important role in the analysis of optimality conditions, the design of optimization algorithms, the representation of support vectors, and the study of dual problems. This paper summarizes and analyzes the 0-1 loss function and eighteen of its popular surrogate loss functions in SVMs, and gives three variational properties of these loss functions: the subdifferential, the proximal operator, and the Fenchel conjugate, among which nine of the proximal operators and fifteen of the Fenchel conjugates are derived in this paper.
Keywords: support vector machines, loss function, subdifferential, proximal operator, Fenchel conjugate
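To make the variational objects above concrete, here is a minimal sketch (my own illustration, not material from the paper) of the proximal operator of one standard surrogate, the hinge loss max(0, 1 - t); the closed form is well known, and the script cross-checks it against a direct one-dimensional minimization. The function names and test values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def prox_hinge(z, lam):
    """Proximal operator of lam * max(0, 1 - t): standard closed form."""
    if z >= 1.0:
        return z
    if z <= 1.0 - lam:
        return z + lam
    return 1.0

# Numerical cross-check against a direct 1-D minimization of the prox objective.
lam = 0.5
for z in (-2.0, 0.3, 0.95, 1.7):
    obj = lambda t: 0.5 * (t - z) ** 2 + lam * max(0.0, 1.0 - t)
    t_num = minimize_scalar(obj, bounds=(-10, 10), method="bounded").x
    print(z, prox_hinge(z, lam), round(t_num, 4))
```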
2. New Bounds for RIC in Compressed Sensing (Cited by 3)
Authors: Shenglong Zhou, Lingchen Kong, Naihua Xiu. Journal of the Operations Research Society of China (EI), 2013, No. 2, pp. 227-237 (11 pages)
This paper gives new bounds on the restricted isometry constant (RIC) in compressed sensing. Let Φ be an m×n real matrix and k be a positive integer with k ≤ n. The main results of this paper show that if the restricted isometry constants of Φ satisfy δ_{8ak} < 1 and δ_{k+ak} < 3/2 − (1 + √((4a+3)^2 − 8))/(8a) for a > 3/8, then every k-sparse solution can be recovered exactly via ℓ_1 minimization in the noiseless case. In particular, when a = 1, 1.5, 2 and 3, we have δ_{2k} < 0.5746 and δ_{8k} < 1, or δ_{2.5k} < 0.7046 and δ_{12k} < 1, or δ_{3k} < 0.7731 and δ_{16k} < 1, or δ_{4k} < 0.8445 and δ_{24k} < 1.
Keywords: compressed sensing, restricted isometry constant, bound, ℓ_1 minimization, exact recovery
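As a quick numerical sanity check of the bound quoted in the abstract (my own script, not part of the paper), the following evaluates 3/2 − (1 + √((4a+3)^2 − 8))/(8a) at a = 1, 1.5, 2 and 3 and reproduces the constants 0.5746, 0.7046, 0.7731 and 0.8445.

```python
import math

def ric_bound(a):
    """Evaluate 3/2 - (1 + sqrt((4a+3)^2 - 8)) / (8a), valid for a > 3/8."""
    return 1.5 - (1.0 + math.sqrt((4 * a + 3) ** 2 - 8)) / (8 * a)

for a in (1, 1.5, 2, 3):
    # Bound on delta_{k+ak}, paired with the accompanying condition delta_{8ak} < 1.
    print(f"a = {a}: delta_{{{a + 1:g}k}} < {ric_bound(a):.4f}, with delta_{{{8 * a:g}k}} < 1")
```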
3. A Note on the Gradient Projection Method with Exact Stepsize Rule (Cited by 2)
Authors: Naihua Xiu, Changyu Wang, Lingchen Kong. Journal of Computational Mathematics (SCIE, EI, CSCD), 2007, No. 2, pp. 221-230 (10 pages)
In this paper, we give some convergence results on the gradient projection method with an exact stepsize rule for solving minimization problems with convex constraints. In particular, we show that if the objective function is convex and its gradient is Lipschitz continuous, then the whole sequence of iterates produced by this method with bounded exact stepsizes converges to a solution of the problem.
Keywords: gradient projection method, exact stepsize rule, full convergence
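The sketch below (my own toy example, not code from the paper) runs gradient projection with one common reading of an exact stepsize rule, minimizing the objective along the projection arc over a bounded interval, on a box-constrained convex quadratic; the problem data, the stepsize interval and the stopping test are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Convex quadratic f(x) = 0.5 x^T A x - b^T x on the box [lo, hi]^n (toy data).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)              # positive definite: convex with Lipschitz gradient
b = rng.standard_normal(5)
lo, hi = -1.0, 1.0

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
proj = lambda x: np.clip(x, lo, hi)  # projection onto the box

x = np.zeros(5)
for _ in range(200):
    g = grad(x)
    # Exact stepsize: minimize f along the projection arc, restricted to a bounded interval.
    phi = lambda a: f(proj(x - a * g))
    alpha = minimize_scalar(phi, bounds=(0.0, 10.0), method="bounded").x
    x_new = proj(x - alpha * g)
    if np.linalg.norm(x_new - x) < 1e-10:
        break
    x = x_new

print("approximate minimizer:", np.round(x, 4))
```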
4. Lagrangian duality and saddle points for sparse linear programming (Cited by 1)
Authors: Chen Zhao, Ziyan Luo, Weiyue Li, Houduo Qi, Naihua Xiu. Science China Mathematics (SCIE, CSCD), 2019, No. 10, pp. 2015-2032 (18 pages)
The sparse linear programming (SLP) problem is a linear programming problem equipped with a sparsity constraint; it is nonconvex, discontinuous and generally NP-hard due to the combinatorial structure involved. In this paper, by rewriting the sparsity constraint in a disjunctive form, we present an explicit formula for the Lagrangian dual problem of the SLP, in terms of an unconstrained piecewise-linear convex programming problem which admits strong duality under bi-dual sparsity consistency. Furthermore, we establish a saddle point theorem based on the strong duality and analyze two classes of stationary points of the saddle point problem. Finally, we extend these results to the SLP with the lower bound zero replaced by a certain negative constant.
Keywords: sparse linear programming, Lagrangian dual problem, strong duality, saddle point theorem, optimality condition
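The disjunctive reformulation mentioned in the abstract views the sparsity constraint ||x||_0 ≤ s as a union over candidate supports, so a tiny SLP can be solved by enumerating supports and solving one ordinary LP per support. The brute-force sketch below (my own illustration using scipy's linprog, not the paper's dual approach, with hypothetical data) shows exactly that combinatorial structure.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Toy sparse LP: min c^T x  s.t.  A x = b,  0 <= x <= 1,  ||x||_0 <= s  (hypothetical data).
c = np.array([1.0, 2.0, -1.0, 0.5])
A = np.array([[1.0, 1.0, 1.0, 1.0]])
b = np.array([2.0])
s = 2
n = len(c)

best_val, best_x = np.inf, None
# Disjunctive view: the feasible set is a union of LP feasible sets, one per support.
for S in itertools.combinations(range(n), s):
    bounds = [(0.0, 1.0) if i in S else (0.0, 0.0) for i in range(n)]
    res = linprog(c, A_eq=A, b_eq=b, bounds=bounds, method="highs")
    if res.success and res.fun < best_val:
        best_val, best_x = res.fun, res.x

print("best support-restricted value:", best_val, "at x =", np.round(best_x, 4))
```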
5. Global optimality condition and fixed point continuation algorithm for non-Lipschitz ℓ_p regularized matrix minimization (Cited by 1)
Authors: Dingtao Peng, Naihua Xiu, Jian Yu. Science China Mathematics (SCIE, CSCD), 2018, No. 6, pp. 1139-1152 (14 pages)
Regularized minimization problems with nonconvex, nonsmooth, even non-Lipschitz penalty functions have attracted much attention in recent years, owing to their wide applications in statistics, control, system identification and machine learning. In this paper, the non-Lipschitz ℓ_p (0 < p < 1) regularized matrix minimization problem is studied. A global necessary optimality condition for this non-Lipschitz optimization problem is first obtained; specifically, the global optimal solutions of the problem are fixed points of the so-called p-thresholding operator, which is matrix-valued and set-valued. A fixed point iterative scheme for the non-Lipschitz model is then proposed, and its convergence analysis is addressed in detail. Moreover, some acceleration techniques are adopted to improve the performance of the algorithm. The effectiveness of the proposed p-thresholding fixed point continuation (p-FPC) algorithm is demonstrated by numerical experiments on randomly generated and real matrix completion problems.
Keywords: ℓ_p regularized matrix minimization, matrix completion problem, p-thresholding operator, global optimality condition, fixed point continuation algorithm
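As a rough illustration of a p-thresholding-style fixed point iteration for matrix completion, here is a sketch that is not the paper's p-FPC algorithm: the scalar thresholding is done by a crude grid search instead of a closed-form operator, no continuation or acceleration is used, and all problem data, parameters and function names are my own assumptions.

```python
import numpy as np

def prox_p_scalar(sigma, mu, p, grid=400):
    """Grid-search approximation of argmin_t 0.5*(t - sigma)^2 + mu*t^p over t >= 0."""
    t = np.linspace(0.0, sigma, grid)
    vals = 0.5 * (t - sigma) ** 2 + mu * t ** p
    return t[np.argmin(vals)]

def p_threshold(X, mu, p):
    """Apply the scalar prox to the singular values of X (a p-thresholding-style operator)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_new = np.array([prox_p_scalar(si, mu, p) for si in s])
    return (U * s_new) @ Vt

# Synthetic rank-2 matrix completion instance (hypothetical data).
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = rng.random((20, 20)) < 0.5              # observed entries
X, tau, lam, p = np.zeros_like(M), 1.0, 0.1, 0.5

for _ in range(300):
    g = mask * (X - M)                         # gradient of 0.5 * ||P_Omega(X - M)||_F^2
    X = p_threshold(X - tau * g, tau * lam, p)

err = np.linalg.norm(mask * (X - M)) / np.linalg.norm(mask * M)
print("relative error on observed entries:", round(float(err), 4))
```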