Sparse optimization has witnessed significant advancements in recent decades, and the step function finds extensive applications across various machine learning and signal processing domains. This paper integrates the zero norm and the step function to formulate a double-sparsity constrained optimization problem, in which a linear equality constraint is also taken into consideration. By defining a τ-Lagrangian stationary point and a KKT point, we establish the first-order and second-order necessary and sufficient optimality conditions for the problem. Furthermore, we thoroughly elucidate their relationships to local and global optimal solutions. Finally, special cases and examples are presented to illustrate the obtained theorems.
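For orientation, a problem of this general shape combines the two sparsity measures with a linear equality constraint. The sketch below is an assumed generic form, not the paper's exact formulation: the objective f, the matrices A and B, the sparsity levels s and t, and the sign penalized by the step function are all illustrative placeholders.

% Hedged sketch of a double-sparsity constrained problem with a linear
% equality constraint (assumed generic form, not taken from the paper):
%   \|x\|_0      -- zero norm: the number of nonzero entries of x;
%   \|(Bx)_+\|_0 -- step-function term: the number of positive entries of Bx.
\begin{equation*}
  \min_{x \in \mathbb{R}^n} \; f(x)
  \quad \text{s.t.} \quad
  \|x\|_0 \le s, \qquad
  \|(Bx)_+\|_0 \le t, \qquad
  Ax = c,
\end{equation*}
% where f is assumed twice continuously differentiable, A \in \mathbb{R}^{m \times n},
% c \in \mathbb{R}^m, B \in \mathbb{R}^{p \times n}, and s, t are prescribed sparsity levels.

In a formulation of this kind, the τ-Lagrangian stationary points and KKT points mentioned in the abstract play the role of first- and second-order certificates whose relationships to local and global minimizers the paper analyzes.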
Funding: Supported by the National Key R&D Program of China (No. 2023YFA1011100) and NSFC (No. 12131004).