The stochastic gradient (SG) algorithm has a lower computational burden than least squares algorithms, but it cannot track time-varying parameters and has a poor convergence rate. In order to improve the tracking properties of the SG algorithm, the forgetting gradient (FG) algorithm is presented, and its convergence is analyzed by using the martingale hyperconvergence theorem. The results show that: (1) for time-invariant deterministic systems, the parameter estimates given by the FG algorithm converge to their true values; (2) for stochastic time-varying systems, the parameter tracking error is bounded; that is, the parameter tracking error is small when both the parameter change rate and the observation noise are small.
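The abstract does not give the FG update equations, but a common forgetting-gradient-style recursion (a hedged sketch, not necessarily the exact algorithm of this paper) augments the SG update with a forgetting factor λ in the step-size normalizer, so that old data are discounted and time-varying parameters can be tracked. Assuming a linear regression model y_k = φ_kᵀθ + v_k, such a recursion might look like:

```python
import numpy as np

def forgetting_gradient(phi, y, lam=0.98):
    """Hypothetical forgetting-gradient-style estimator for
    y_k = phi_k^T theta + noise.

    lam is the forgetting factor (0 < lam <= 1); lam = 1 recovers
    a plain normalized stochastic gradient update.
    """
    n = phi.shape[1]
    theta = np.zeros(n)   # initial parameter estimate
    r = 1.0               # step-size normalizer
    for phi_k, y_k in zip(phi, y):
        # Forgetting factor discounts the contribution of old data,
        # keeping the step size from shrinking to zero.
        r = lam * r + phi_k @ phi_k
        # Normalized gradient correction toward the current observation.
        theta = theta + (phi_k / r) * (y_k - phi_k @ theta)
    return theta

# Example: time-invariant deterministic (noise-free) system, where the
# estimates should approach the true parameter values.
rng = np.random.default_rng(0)
true_theta = np.array([1.5, -0.7])
phi = rng.normal(size=(5000, 2))
y = phi @ true_theta
est = forgetting_gradient(phi, y, lam=0.99)
```

With λ < 1 the normalizer r stays bounded, so the algorithm retains a nonvanishing step size and can follow slowly drifting parameters, at the cost of a bounded (rather than vanishing) tracking error under observation noise, consistent with result (2) above.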
Funding: the National Natural Science Foundation of China (No. 69934010)