Journal Articles
3 articles found
1. A Study on the Convergence of Gradient Method with Momentum for Sigma-Pi-Sigma Neural Networks (Cited by: 1)
Authors: Xun Zhang, Naimin Zhang. Journal of Applied Mathematics and Physics, 2018, Issue 4, pp. 880-887 (8 pages)
In this paper, a gradient method with momentum for sigma-pi-sigma neural networks (SPSNN) is considered in order to accelerate the convergence of the learning procedure for the network weights. The momentum coefficient is chosen in an adaptive manner, and the corresponding weak convergence and strong convergence results are proved.
Keywords: sigma-pi-sigma neural network, momentum term, gradient method, convergence
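The abstract above describes gradient descent with an adaptively chosen momentum coefficient. As a minimal sketch of that idea (not the precise adaptive rule analyzed in the paper), one can tie the momentum coefficient to the current gradient norm, so the momentum term vanishes as the iterates approach a minimizer; the learning rate and cap of 0.9 below are illustrative choices:

```python
import numpy as np

def gd_momentum(grad, w0, eta=0.1, max_iter=500, tol=1e-8):
    """Gradient descent with an adaptive momentum term.

    Iteration: Delta w_k = -eta * grad(w_k) + mu_k * Delta w_{k-1},
    where mu_k shrinks with the gradient norm (an illustrative heuristic,
    not the paper's exact rule).
    """
    w = np.asarray(w0, dtype=float)
    delta = np.zeros_like(w)  # previous weight increment
    for _ in range(max_iter):
        g = grad(w)
        if np.linalg.norm(g) < tol:
            break
        # adaptive momentum coefficient: proportional to the gradient norm
        mu = min(0.9, np.linalg.norm(g))
        delta = -eta * g + mu * delta
        w = w + delta
    return w

# usage: minimize the quadratic E(w) = 0.5 * ||w||^2, whose gradient is w
w_star = gd_momentum(lambda w: w, w0=[2.0, -1.0])
```

On this quadratic, the iterates contract toward the origin, and the momentum contribution fades as the gradient shrinks.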
2. A Note on Parameterized Preconditioned Method for Singular Saddle Point Problems
Authors: Yueyan Lv, Naimin Zhang. Journal of Applied Mathematics and Physics, 2016, Issue 4, pp. 608-613 (6 pages)
Recently, some authors (Li, Yang and Wu, 2014) studied the parameterized preconditioned HSS (PPHSS) method for solving saddle point problems. In this short note, we further discuss the PPHSS method for solving singular saddle point problems. We prove the semi-convergence of the PPHSS method under some conditions. Numerical experiments are given to illustrate the efficiency of the method with appropriate parameters.
Keywords: singular saddle point problems, Hermitian and skew-Hermitian splitting, preconditioning, iteration methods, semi-convergence
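The PPHSS splitting analyzed in this note is involved; as a much simpler illustration of the saddle point structure it targets, here is the classical Uzawa iteration on a small nonsingular example. This is not the paper's method (which handles the singular case via semi-convergence), and the relaxation parameter omega and the matrices are illustrative:

```python
import numpy as np

def uzawa(A, B, f, g, omega=0.5, tol=1e-10, max_iter=5000):
    """Classical Uzawa iteration for the saddle point system

        [ A  B^T ] [x]   [f]
        [ B   0  ] [y] = [g]

    with A symmetric positive definite and B of full row rank.
    """
    x = np.zeros(A.shape[0])
    y = np.zeros(B.shape[0])
    for _ in range(max_iter):
        x = np.linalg.solve(A, f - B.T @ y)  # solve A x = f - B^T y
        r = B @ x - g                        # constraint residual
        y = y + omega * r                    # dual (Lagrange multiplier) update
        if np.linalg.norm(r) < tol:
            break
    return x, y

# small illustrative system (values chosen for convergence, not from the paper)
A = np.array([[4.0, 1.0], [1.0, 3.0]])  # SPD (1,1)-block
B = np.array([[1.0, 2.0]])              # full-row-rank constraint block
f = np.array([1.0, 0.0])
g = np.array([1.0])
x, y = uzawa(A, B, f, g, omega=0.5)
```

Uzawa converges here because omega lies below 2 divided by the largest eigenvalue of the Schur complement B A^{-1} B^T; HSS-type methods such as PPHSS instead split the whole block matrix into its Hermitian and skew-Hermitian parts.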
3. Convergence of Gradient Method with Momentum for Back-Propagation Neural Networks (Cited by: 5)
Authors: Wei Wu, Naimin Zhang, Zhengxue Li, Long Li, Yan Liu. Journal of Computational Mathematics (SCIE, EI, CSCD), 2008, Issue 4, pp. 613-623 (11 pages)
In this work, a gradient method with momentum for BP neural networks is considered. The momentum coefficient is chosen in an adaptive manner to accelerate and stabilize the learning procedure of the network weights. Corresponding convergence results are proved.
Keywords: back-propagation (BP) neural networks, gradient method, momentum, convergence