Abstract
This paper studies the distributed optimization problem in which the objective functions may be nondifferentiable and subject to heterogeneous set constraints. Unlike existing subgradient methods, the authors focus on the case where the exact subgradients of the local objective functions cannot be accessed by the agents. To solve this problem, the authors propose a projected primal-dual dynamics that uses only approximate subgradients of the objective functions. The authors first prove that the formulated optimization problem can generally be solved with an error depending upon the accuracy of the available subgradients. Then, the authors show the exact solvability of this distributed optimization problem when the accumulated approximation error of the inexact subgradients is not too large. After that, the authors also give a novel componentwise normalized variant to improve the transient behavior of the convergent sequence. The effectiveness of the proposed algorithms is verified by a numerical example.
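To make the algorithmic idea concrete, the following is a minimal discrete-time sketch of a distributed projected primal-dual iteration driven by inexact subgradients. The ring communication graph, the local objectives and box constraints, the bounded-error noise model, and the constant step size are all illustrative assumptions; the paper's algorithm is formulated as a continuous-time dynamics, and its componentwise normalized variant is not reproduced here.

```python
import numpy as np

np.random.seed(0)

n = 4                                   # number of agents
A = np.zeros((n, n))                    # ring communication graph (assumption)
for i in range(n):
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian

# Local objectives f_i(x) = |x - c_i| with heterogeneous box constraints X_i = [lo_i, hi_i]
c = np.array([1.0, 2.0, 3.0, 4.0])
lo = np.array([-5.0, -4.0, -3.0, -2.0])
hi = np.array([6.0, 5.0, 8.0, 7.0])

def inexact_subgrad(i, xi, eps=0.05):
    """Subgradient of |x - c_i| at xi, corrupted by a bounded error of size eps (assumption)."""
    return np.sign(xi - c[i]) + eps * np.random.uniform(-1.0, 1.0)

x = np.zeros(n)        # local primal estimates (one scalar per agent)
lam = np.zeros(n)      # dual variables associated with the consensus constraint
alpha = 0.05           # constant step size (assumption)

for k in range(2000):
    g = np.array([inexact_subgrad(i, x[i]) for i in range(n)])
    Lx = L @ x
    # Primal step: move against the inexact subgradient plus the consensus and
    # dual coupling terms, then project onto the local constraint set X_i.
    x = np.clip(x - alpha * (g + Lx + L @ lam), lo, hi)
    # Dual step: ascend on the consensus violation.
    lam = lam + alpha * Lx

print("final local estimates:", np.round(x, 3))
```

In this sketch the local estimates approach consensus near a minimizer of the aggregate objective, with a residual offset governed by the subgradient error eps and the step size, mirroring the error-versus-accuracy relationship described above.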
Funding
Supported by the National Natural Science Foundation of China under Grant No. 61973043.