Abstract: Proximal gradient descent and its accelerated version are effective methods for solving problems given as the sum of a smooth and a non-smooth term. When the smooth function can be represented as a sum of multiple component functions, the stochastic proximal gradient method performs well; however, research on its accelerated version remains limited. This paper proposes a proximal stochastic accelerated gradient (PSAG) method for problems involving a combination of smooth and non-smooth components, where the smooth part is the average of multiple block sums. Moreover, most existing convergence analyses hold only in expectation. To address this, under some mild conditions, we establish almost sure convergence of the unbiased gradient estimator in the non-smooth setting, and we show that the minimum of the squared gradient mapping norm converges to zero with probability one.
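The setting above can be illustrated with a short sketch. The code below is not the paper's PSAG algorithm, whose update rules are not given in the abstract; it is a minimal FISTA-style proximal stochastic gradient loop for min_x (1/n) Σ_i f_i(x) + g(x), where one block gradient is sampled per iteration as an unbiased estimate of ∇f, and `soft_threshold` is the proximal operator of the ℓ1 norm. The momentum schedule and step size here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def psag_sketch(grads, prox, x0, n_blocks, step=0.5, n_iters=500, seed=0):
    """Hypothetical proximal stochastic accelerated gradient loop.

    grads    : list of per-block gradient functions; their average is grad f
    prox     : prox(v, s) = proximal operator of s * g evaluated at v
    n_blocks : number of blocks to sample from uniformly
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    y = x0.copy()   # extrapolated (accelerated) point
    t = 1.0         # FISTA-style momentum parameter
    for _ in range(n_iters):
        i = rng.integers(n_blocks)           # uniform sampling -> unbiased gradient estimate
        x_new = prox(y - step * grads[i](y), step)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

With a single block the loop reduces to the deterministic accelerated proximal gradient method (FISTA); with several blocks, each iteration touches only one component gradient, which is the regime the abstract describes.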
Abstract: In this paper, a new method is proposed to address the scheduling problem of a renewable energy community while accounting for network constraints and users' privacy preservation. The method decouples the optimization into two interacting procedures: conic projection (CP) and linear programming (LP). A new optimal CP method is proposed based on local computations and on the calculation of the roots of a fourth-order polynomial, for which a closed-form solution is known. Computational tests conducted on both 14-bus and 84-bus distribution networks demonstrate that the proposed method obtains solutions of the same quality as a centralized solver. The proposed method is scalable and can be implemented on microcontrollers, since both the LP and CP procedures require only simple matrix-vector multiplications.
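The abstract does not spell out the paper's quartic-root-based projection, so it is not reproduced here. As a generic illustration of what a conic projection step computes, the sketch below gives the standard closed-form Euclidean projection onto the second-order cone {(x, t) : ||x|| ≤ t}; like the paper's CP, it needs only norms and scalar arithmetic, with no iterative solver.

```python
import numpy as np

def project_soc(v, s):
    """Euclidean projection of the point (v, s) onto the second-order cone
    {(x, t) : ||x||_2 <= t}. Returns the projected pair (x, t)."""
    norm_v = np.linalg.norm(v)
    if norm_v <= s:
        return v.copy(), s              # already inside the cone
    if norm_v <= -s:
        return np.zeros_like(v), 0.0    # closest point is the apex
    alpha = (norm_v + s) / 2.0          # closed-form boundary projection
    return alpha * v / norm_v, alpha
```

In a decomposition scheme like the one described, such a projection would be evaluated locally by each agent, while the LP coordination step handles the coupling network constraints.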