Abstract: The exact analytic method was proposed in [1]. It applies to arbitrary variable-coefficient differential equations, and the solution it produces achieves second-order convergence. In this paper, a new high-precision algorithm based on [1] is developed through the bending problem of variable cross-section beams. It achieves fourth-order convergence without increasing the computational work. The present method is both simple and fast. Numerical examples given at the end of the paper show that high convergence precision can be obtained with only a few elements, confirming the correctness of the theory presented here.
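The exact analytic method of [1] is not reproduced here, but the idea of reaching fourth-order accuracy from a second-order discretization at little extra cost can be illustrated with standard Richardson extrapolation. The sketch below is only an illustration of that idea, not the paper's algorithm: it solves a generic variable-coefficient two-point boundary value problem -(p(x)u')' = f(x) with a second-order conservative scheme and combines the solutions on meshes h and h/2. The test problem, function names, and parameters are all assumptions.

```python
import numpy as np

def solve_bvp(p, f, n):
    """Second-order central-difference solver for -(p(x) u')' = f(x)
    on (0, 1) with u(0) = u(1) = 0, using n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Evaluating p at the half-points gives a conservative (flux-form) scheme.
    p_half = p(np.linspace(h / 2, 1.0 - h / 2, n + 1))
    main = (p_half[:-1] + p_half[1:]) / h**2
    off = -p_half[1:-1] / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return x, np.linalg.solve(A, f(x))

# Manufactured test problem (illustrative): p(x) = 1 + x^2, u(x) = sin(pi x).
p = lambda x: 1.0 + x**2
u_exact = lambda x: np.sin(np.pi * x)
f = lambda x: (np.pi**2 * (1.0 + x**2) * np.sin(np.pi * x)
               - 2.0 * np.pi * x * np.cos(np.pi * x))

for n in (16, 32, 64):
    x, u_h = solve_bvp(p, f, n)                 # mesh size h
    _, u_fine = solve_bvp(p, f, 2 * n + 1)      # mesh size h/2
    u_rich = (4.0 * u_fine[1::2] - u_h) / 3.0   # Richardson: cancels the O(h^2) term
    print(n,
          np.abs(u_h - u_exact(x)).max(),       # decays like h^2
          np.abs(u_rich - u_exact(x)).max())    # ~h^4 for smooth data
```

For smooth coefficients the extrapolated nodal values typically exhibit roughly fourth-order error decay, which is the kind of gain the abstract describes.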
Funding: Supported by the NSF of China (No. 12001539), the NSF of Hunan Province (No. 2020JJ5647), and the China Postdoctoral Science Foundation (No. 2019TQ0073).
Abstract: Using the Feynman-Kac formula together with the Itô-Taylor expansion and finite difference approximation, we first develop an explicit third-order one-step method for solving decoupled forward-backward stochastic differential equations. Based on the third-order method, an explicit fourth-order method is then proposed. Several numerical tests illustrate the stability and high-order accuracy of the proposed methods.
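The third- and fourth-order schemes themselves are not reconstructed here; as a baseline, the sketch below implements the simplest member of this family, an explicit first-order Euler-type one-step scheme for a decoupled FBSDE, with the conditional expectations approximated by Gauss-Hermite quadrature and linear interpolation on a spatial grid. All names and discretization choices are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def solve_fbsde_euler(b, sigma, f, g, x_grid, T, n_steps, n_quad=8):
    """Explicit first-order (Euler-type) one-step scheme for the decoupled FBSDE
        dX_t = b(X_t) dt + sigma(X_t) dW_t
       -dY_t = f(t, X_t, Y_t, Z_t) dt - Z_t dW_t,   Y_T = g(X_T).
    Conditional expectations are approximated by Gauss-Hermite quadrature;
    values between grid points come from linear interpolation."""
    dt = T / n_steps
    # Probabilists' Gauss-Hermite rule: sum(w * phi(xi)) ~ E[phi(N(0, 1))].
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_quad)
    weights = weights / np.sqrt(2.0 * np.pi)
    y = g(x_grid)                            # terminal condition at t = T
    z = np.zeros_like(x_grid)
    for n in reversed(range(n_steps)):
        t = n * dt
        ey = np.zeros_like(x_grid)           # E_n[Y_{n+1}]
        eyw = np.zeros_like(x_grid)          # E_n[Y_{n+1} * dW_n]
        for xi, w in zip(nodes, weights):
            dw = np.sqrt(dt) * xi
            x_next = x_grid + b(x_grid) * dt + sigma(x_grid) * dw
            # np.interp clamps outside the grid (constant extrapolation).
            y_next = np.interp(x_next, x_grid, y)
            ey += w * y_next
            eyw += w * y_next * dw
        z = eyw / dt                          # Z_n from the martingale part
        y = ey + dt * f(t, x_grid, ey, z)     # explicit step: f at E_n[Y_{n+1}]
    return y, z                               # Y_0(x), Z_0(x) on x_grid
```

Higher-order variants of this template replace the one-term quadrature-in-time step with Itô-Taylor-based multi-term corrections, which is the direction the paper pursues.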
Funding: Supported by the National Natural Science Foundation of China (No. 61662036).
Abstract: The alternating direction method of multipliers (ADMM) has received much attention in recent years owing to demands from machine learning and big-data optimization. In 2013, Ouyang et al. extended ADMM to the stochastic setting for solving certain stochastic optimization problems, inspired by the structural risk minimization principle. In this paper, we consider a stochastic variant of symmetric ADMM, named symmetric stochastic linearized ADMM (SSL-ADMM). In particular, using the framework of variational inequalities, we analyze the convergence properties of SSL-ADMM. Moreover, we show that, with high probability, SSL-ADMM has an O((ln N)·N^(-1/2)) constraint-violation bound and objective-error bound for convex problems, and an O((ln N)^2·N^(-1)) constraint-violation bound and objective-error bound for strongly convex problems, where N is the iteration number. Symmetric ADMM can improve algorithmic performance over classical ADMM, and numerical experiments on statistical machine learning problems show that this improvement carries over to the stochastic setting.
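The abstract does not state the SSL-ADMM update rules, so the sketch below is a plausible reconstruction rather than the authors' algorithm: for a lasso-type problem min (1/2m)||Ax - b||^2 + lam·||z||_1 s.t. x - z = 0, it combines the double dual update of symmetric ADMM with a stochastically linearized x-step and an O(1/sqrt(k)) proximal step size, consistent with the O((ln N)·N^(-1/2)) convex-case rate. The parameters r, s, beta and all names are assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the z-subproblem in closed form)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ssl_admm_lasso(A, b, lam, beta=1.0, r=0.5, s=0.5, n_iter=2000, seed=0):
    """Sketch of a symmetric stochastic linearized ADMM for
       min_x (1/2m)||Ax - b||^2 + lam*||z||_1   s.t.  x - z = 0.
    r and s are the two dual step factors of the symmetric scheme
    (assumed to lie in the usual convergence region, e.g. (0, 1))."""
    rng = np.random.default_rng(seed)
    m, d = A.shape
    x = np.zeros(d); z = np.zeros(d); lmbda = np.zeros(d)
    x_avg = np.zeros(d)
    for k in range(1, n_iter + 1):
        i = rng.integers(m)                  # sample one data point
        g = A[i] * (A[i] @ x - b[i])         # stochastic gradient of the loss at x_k
        eta = 1.0 / np.sqrt(k)               # O(1/sqrt(k)) proximal step size
        # Linearized x-update: closed-form minimizer of
        # <g, x> + <lmbda, x> + (beta/2)||x - z||^2 + (1/(2*eta))||x - x_k||^2
        x = (beta * z - lmbda - g + x / eta) / (beta + 1.0 / eta)
        lmbda = lmbda + r * beta * (x - z)   # first (intermediate) dual update
        z = soft_threshold(x + lmbda / beta, lam / beta)  # exact z-update
        lmbda = lmbda + s * beta * (x - z)   # second dual update
        x_avg += (x - x_avg) / k             # ergodic average used in the analysis
    return x_avg, z
```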