Funding: Xin Liu's research was supported in part by the National Natural Science Foundation of China (No. 11971466), the Key Research Program of Frontier Sciences, Chinese Academy of Sciences (No. ZDBS-LY-7022), the National Center for Mathematics and Interdisciplinary Sciences, Chinese Academy of Sciences, and the Youth Innovation Promotion Association, CAS. Zai-Wen Wen's research was supported in part by the National Natural Science Foundation of China (Nos. 11421101 and 11831002) and the Beijing Academy of Artificial Intelligence. Ya-Xiang Yuan's research was supported in part by the National Natural Science Foundation of China (Nos. 11331012 and 11461161005).
Abstract: Manifold optimization is ubiquitous in computational and applied mathematics, statistics, engineering, machine learning, physics, chemistry, and other fields. One of the main challenges is usually the non-convexity of the manifold constraints. By exploiting the geometry of the manifold, a large class of constrained optimization problems can be viewed as unconstrained optimization problems on a manifold. From this perspective, intrinsic structures, optimality conditions, and numerical algorithms for manifold optimization are investigated. Some recent progress on theoretical results for manifold optimization is also presented.
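To illustrate this viewpoint, the following minimal sketch (not taken from the paper; the matrix A, step size, and iteration count are illustrative assumptions) applies Riemannian gradient descent on the unit sphere, where the constrained problem min x^T A x subject to ||x|| = 1 becomes an unconstrained problem on the sphere manifold:

```python
import numpy as np

def sphere_rgd(A, x0, step=0.1, iters=500):
    """Riemannian gradient descent for min x^T A x on the unit sphere."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = 2 * A @ x                 # Euclidean gradient of x^T A x
        rg = g - (x @ g) * x          # project onto the tangent space at x
        x = x - step * rg             # step along the tangent direction
        x = x / np.linalg.norm(x)     # retraction back onto the sphere
    return x

A = np.diag([3.0, 2.0, 1.0])
x = sphere_rgd(A, np.array([1.0, 1.0, 1.0]))
# x approaches the eigenvector of the smallest eigenvalue of A
```

The tangent-space projection and the normalization retraction are exactly the two geometric ingredients that let the sphere constraint be handled intrinsically rather than by penalties or Lagrange multipliers.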
Funding: Qian Dong was supported in part by the National Natural Science Foundation of China (Nos. 11331012, 11321061 and 11461161005). Xin Liu was supported in part by the National Natural Science Foundation of China (Nos. 11101409, 11331012, 11471325 and 11461161005), the China 863 Program (No. 2013AA122902), and the National Center for Mathematics and Interdisciplinary Sciences, Chinese Academy of Sciences. Zai-Wen Wen was supported in part by the National Natural Science Foundation of China (Nos. 11322109 and 91330202). Ya-Xiang Yuan was supported in part by the National Natural Science Foundation of China (Nos. 11331012, 11321061 and 11461161005).
Abstract: In this paper, we investigate a parallel subspace correction framework for composite convex optimization. The variables are first divided into a few blocks based on certain rules. At each iteration, the algorithms solve a suitable subproblem on each block simultaneously, construct a search direction by combining the solutions from all blocks, and then identify a new point along this direction using a step size satisfying the Armijo line search condition. The two variants are called PSCLN and PSCLO, respectively, depending on whether there are overlapping regions between two immediately adjacent blocks of variables. Their convergence is established under mild assumptions. We compare PSCLN and PSCLO with the parallel version of the fast iterative thresholding algorithm, the fixed-point continuation method using the Barzilai-Borwein step size, and the greedy coordinate block descent method for solving ℓ1-regularized minimization problems. Our numerical results show that PSCLN and PSCLO run fast and return solutions no worse than those from the state-of-the-art algorithms on most test problems. It is also observed that the overlapping domain decomposition scheme is helpful when the data of the problem have certain special structures.
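For intuition, here is a simplified, hypothetical sketch of the non-overlapping structure described above (not the authors' PSCLN implementation; the problem data, block partition, and step parameters are illustrative assumptions): each block's proximal subproblem is solved independently, the block solutions are combined into one search direction, and an Armijo-type backtracking picks the step size.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding, the prox operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pscl_sketch(A, b, lam, blocks, x0, iters=200, tau=1.0):
    """Simplified parallel subspace correction for
    min 0.5*||Ax-b||^2 + lam*||x||_1 over non-overlapping blocks."""
    x = x0.copy()
    F = lambda z: 0.5 * np.sum((A @ z - b) ** 2) + lam * np.sum(np.abs(z))
    for _ in range(iters):
        g = A.T @ (A @ x - b)
        d = np.zeros_like(x)
        for idx in blocks:                      # independent subproblems (parallelizable)
            d[idx] = soft(x[idx] - tau * g[idx], tau * lam) - x[idx]
        alpha, f0 = 1.0, F(x)                   # Armijo-type backtracking on the combined direction
        while F(x + alpha * d) > f0 - 1e-4 * alpha * np.dot(d, d) and alpha > 1e-10:
            alpha *= 0.5
        x = x + alpha * d
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20); x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]
b = A @ x_true
L = np.linalg.eigvalsh(A.T @ A).max()           # Lipschitz constant of the smooth part
blocks = [np.arange(0, 10), np.arange(10, 20)]
x = pscl_sketch(A, b, 0.1, blocks, np.zeros(20), tau=1.0 / L)
```

With non-overlapping blocks the combined direction coincides with a full proximal-gradient step, which is why the Armijo condition accepts the unit step for tau no larger than 1/L; the overlapping variant PSCLO modifies how the block solutions are merged.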
Abstract: Mathematical optimization is one of the foundations of fields such as operations research, computational mathematics, scientific and engineering computing, machine learning, and data science. Typical tasks include formulating appropriate mathematical models to describe practical problems, designing suitable numerical methods to find optimal solutions, and exploring the theoretical properties of the models and algorithms. The evolution of modern computer architecture and the prevalence of big/complex/smart data have had a significant impact on mathematical optimization. The success of large-scale optimization in machine learning and signal processing certainly provides a very exciting paradigm of "modeling + algorithms + computing power".
Abstract: In this paper, we present a unified framework for decision making under uncertainty. Our framework is based on the composition of two risk measures, where the inner risk measure accounts for the risk of a decision if the exact distribution of the uncertain model parameters were given, and the outer risk measure quantifies the risk incurred when estimating the parameters of that distribution. We show that the model is tractable under mild conditions. The framework generalizes several existing models, including stochastic programming, robust optimization, and distributionally robust optimization. Using this framework, we study a few new models that imply probabilistic guarantees for solutions and yield less conservative results than traditional models. Numerical experiments on portfolio selection problems demonstrate the strength of our models.
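As a hedged numerical illustration of the two-layer idea (not a model from the paper; the Gaussian loss model, bootstrap scheme, and CVaR level are illustrative assumptions), one can take CVaR under an estimated distribution as the inner risk measure and a worst case over bootstrap parameter estimates as the outer risk measure:

```python
import numpy as np

def cvar(losses, alpha=0.9):
    """Empirical CVaR: average of the losses at or beyond the alpha-quantile."""
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

def composite_risk(data, n_boot=200, alpha=0.9, seed=0):
    """Outer risk (worst case over bootstrap parameter estimates) of the
    inner risk (CVaR under a Gaussian fitted to each bootstrap sample)."""
    rng = np.random.default_rng(seed)
    worst = -np.inf
    for _ in range(n_boot):
        sample = rng.choice(data, size=len(data), replace=True)
        mu, sigma = sample.mean(), sample.std()   # estimated distribution parameters
        sim = rng.normal(mu, sigma, 2000)         # losses under the estimated distribution
        worst = max(worst, cvar(sim, alpha))      # conservative outer measure: worst case
    return worst

rng = np.random.default_rng(1)
losses = rng.normal(0.0, 1.0, 500)
nominal = cvar(rng.normal(losses.mean(), losses.std(), 2000))  # single point estimate
robust = composite_risk(losses)
```

The gap between `robust` and `nominal` is exactly the estimation risk that the outer measure is meant to capture; replacing the worst case with, say, an expectation or a quantile over the bootstrap estimates interpolates toward less conservative models, in the spirit of the framework.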
Abstract: Optimization is one of the fundamental and essential components of operations research, a highly interdisciplinary subject. As one of the first researchers of interior-point methods, Professor Yin-Yu Ye is responsible not only for developing many fundamental results, which have tremendously advanced optimization theory, but also for enriching the field through applications emerging from statistics, machine learning, signal and image processing, communications, and computational economics and finance. Computational methods and theory based on semidefinite programming have been demonstrated to be helpful for the localization of network sensors. In computational economics, new complexity results have been established for problems related to the computation of an economic equilibrium. We thank Professor Ye for his insatiable curiosity, openness to new ideas, and keen interest in the success of young people in our field of operations research.
Abstract: This special issue focuses on sparse and low-rank optimization, a distinct new area of research in optimization. A solution is sparse if it has very few nonzero entries (compared to its dimension) or possesses other kinds of simple structure, for example, low rank in the matrix case. Owing much to studies of signal representation, compressive sensing, and regularized regression, sparse and low-rank optimization has been recognized as a computational tool that plays a central role in many data processing problems, especially those involving extremely large data. The development of sparse and low-rank optimization has been motivated by, and has in turn nurtured, developments in many other areas of data science.
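As a small, hedged illustration of the low-rank side of this area (not drawn from any paper in the issue; the matrix sizes, sampling rate, and parameters are illustrative assumptions), the classic singular-value-thresholding iteration completes a low-rank matrix from a subset of its entries:

```python
import numpy as np

def svt_complete(M, mask, tau=5.0, step=1.0, iters=300):
    """Singular-value-thresholding sketch for low-rank matrix completion:
    min tau*||X||_* + 0.5*||P(X - M)||_F^2, where P keeps observed entries."""
    X = np.zeros_like(M)
    for _ in range(iters):
        G = mask * (X - M)                      # gradient of the data-fit term
        U, s, Vt = np.linalg.svd(X - step * G, full_matrices=False)
        s = np.maximum(s - step * tau, 0.0)     # shrink singular values (prox of tau*||.||_*)
        X = U @ np.diag(s) @ Vt
    return X

rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))  # rank-2 ground truth
mask = rng.random(M.shape) < 0.6                                 # observe ~60% of entries
X = svt_complete(M, mask, tau=0.1)
```

Soft-thresholding singular values is the matrix analogue of the entrywise soft-thresholding used in sparse recovery, which is precisely the parallel between sparsity and low rank that the issue highlights.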