Funding: Partially funded by AFOSR MURI FA9550-18-502 and ONR grants N00014-18-1-2527, N00014-18-20-1-2093, and N00014-20-1-2787; supported by the NSF Graduate Research Fellowship under Grant No. DGE-1650604.
Abstract: Computing tasks may often be posed as optimization problems. The objective functions arising in real-world scenarios are often nonconvex and/or nondifferentiable. State-of-the-art methods for solving these problems typically guarantee convergence only to local minima. This work presents Hamilton-Jacobi-based Moreau adaptive descent (HJ-MAD), a zero-order algorithm with guaranteed convergence to global minima, assuming only continuity of the objective function. The core idea is to compute gradients of the Moreau envelope of the objective (which is "piece-wise convex") with adaptive smoothing parameters. Gradients of the Moreau envelope (i.e., proximal operators) are approximated via the Hopf-Lax formula for the viscous Hamilton-Jacobi equation. Our numerical examples illustrate global convergence.
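To make the core idea concrete, the following is a minimal Python/NumPy sketch of the scheme the abstract describes, not the authors' implementation. It assumes the standard Cole-Hopf sampling view of the viscous Hopf-Lax formula: the proximal operator prox_{tf}(x) is estimated as a softmax-weighted average of Gaussian samples around x, and the Moreau-envelope gradient is (x - prox_{tf}(x)) / t. All names (prox_estimate, hj_mad) are illustrative, and the fixed smoothing parameter t stands in for the paper's adaptive schedule.

import numpy as np

def prox_estimate(f, x, t, delta, n_samples=1000, rng=None):
    # Estimate prox_{tf}(x): sample y ~ N(x, delta*t*I) and take the
    # softmax-weighted average with weights exp(-f(y)/delta), following
    # the Cole-Hopf representation of the viscous HJ solution.
    rng = rng or np.random.default_rng()
    y = x + np.sqrt(delta * t) * rng.standard_normal((n_samples, x.size))
    fvals = np.array([f(yi) for yi in y])
    w = np.exp(-(fvals - fvals.min()) / delta)  # shift for numerical stability
    return (w[:, None] * y).sum(axis=0) / w.sum()

def hj_mad(f, x0, t=10.0, delta=0.1, alpha=1.0, n_iters=100):
    # Zero-order descent on the Moreau envelope u(x, t), using
    # grad u(x, t) = (x - prox_{tf}(x)) / t. Here t is held fixed,
    # whereas the paper adapts it during the iteration.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - alpha * (x - prox_estimate(f, x, t, delta)) / t
    return x

# Hypothetical usage on a 1-D nonconvex objective with many local minima:
# f = lambda z: 0.05 * z[0]**2 - np.cos(2.0 * z[0])
# x_min = hj_mad(f, x0=np.array([6.0]))

Note that with alpha = 1.0 the update reduces to a sampled proximal-point step, x <- prox_{tf}(x); only evaluations of f are used, which is what makes the method zero-order.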