Abstract: Proximal gradient descent and its accelerated variant are effective methods for minimizing the sum of a smooth and a non-smooth function. When the smooth function can be represented as a sum of multiple component functions, the stochastic proximal gradient method performs well; however, its accelerated version remains much less understood. This paper proposes a proximal stochastic accelerated gradient (PSAG) method for problems that combine smooth and non-smooth components, where the smooth part is the average of multiple block sums. At the same time, most existing convergence analyses hold only in expectation. To this end, under some mild conditions, we establish almost sure convergence of the unbiased gradient estimator in the non-smooth setting. Moreover, we show that the minimum of the squared gradient mapping norm converges to zero with probability one.
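For orientation, the following is a minimal sketch of the composite setting this abstract refers to, together with a generic (un-accelerated) proximal stochastic gradient step; the symbols $f_i$, $g$, $\gamma$, and $i_k$ are illustrative assumptions, and the actual PSAG iteration with its acceleration terms is defined in the paper:
$$\min_{x\in\mathbb{R}^p} F(x):=f(x)+g(x),\qquad f(x):=\frac{1}{n}\sum_{i=1}^{n}f_i(x),\qquad x_{k+1}=\operatorname{prox}_{\gamma g}\bigl(x_k-\gamma\nabla f_{i_k}(x_k)\bigr),$$
where the $f_i$ are the smooth blocks, $g$ is the non-smooth term, $i_k$ is sampled uniformly from $\{1,\dots,n\}$ so that $\nabla f_{i_k}(x_k)$ is an unbiased estimate of $\nabla f(x_k)$, and $\operatorname{prox}_{\gamma g}(z):=\arg\min_y\{g(y)+\tfrac{1}{2\gamma}\|y-z\|^2\}$. The gradient mapping whose squared norm appears in the last claim is commonly defined as $G_\gamma(x):=\tfrac{1}{\gamma}\bigl(x-\operatorname{prox}_{\gamma g}(x-\gamma\nabla f(x))\bigr)$.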
Funding: Supported by the National Natural Science Foundation of China (12001236) and the Natural Science Foundation of Guangdong Province (2020A1515110494).
Abstract: We consider the fourth-order nonlinear Schrödinger equation (4NLS) $(i\partial_t+\varepsilon\Delta+\Delta^2)u=c_1u^m+c_2(\partial u)u^{m-1}+c_3(\partial u)^2u^{m-2}$, and establish conditional almost sure global well-posedness for random initial data in $H^s(\mathbb{R}^d)$ for $s\in(s_c-1/2,s_c]$, when $d\geq 3$ and $m\geq 5$, where $s_c:=d/2-2/(m-1)$ is the scaling-critical regularity of the 4NLS with second-order derivative nonlinearities. Our proof relies on nonlinear estimates in a new $M$-norm and on stability theory in the probabilistic setting. Similar supercritical global well-posedness results also hold for $d=2$, $m\geq 4$ and for $d\geq 3$, $3\leq m<5$.
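As a quick consistency check of the stated index (a sketch, under the assumptions that scaling is taken with respect to the leading biharmonic operator, i.e. ignoring the lower-order $\varepsilon\Delta$ term, and that criticality is determined by the most derivative-heavy term $c_3(\partial u)^2u^{m-2}$): the rescaling $u_\lambda(t,x):=\lambda^{2/(m-1)}u(\lambda^4 t,\lambda x)$ leaves $(i\partial_t+\Delta^2)u=c_3(\partial u)^2u^{m-2}$ invariant, since both sides pick up the factor $\lambda^{2/(m-1)+4}$, and $\|u_\lambda(0)\|_{\dot H^s}=\lambda^{\,s+2/(m-1)-d/2}\|u(0)\|_{\dot H^s}$ is scale-invariant exactly when $s=d/2-2/(m-1)=s_c$.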