Abstract: In this paper, a gradient method with momentum for sigma-pi-sigma neural networks (SPSNN) is considered in order to accelerate the convergence of the learning procedure for the network weights. The momentum coefficient is chosen in an adaptive manner, and the corresponding weak and strong convergence results are proved.
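The abstract describes a gradient method with momentum whose coefficient adapts during training. The paper's exact adaptive rule and SPSNN error function are not given here, so the following is only a minimal sketch: it minimizes a simple quadratic error as a stand-in for the network loss, and uses one common adaptive choice in which the momentum coefficient scales with the gradient norm so that the momentum term vanishes near a stationary point.

```python
import numpy as np

# Stand-in error E(w) = 0.5 * ||A w - b||^2 (the paper's SPSNN error differs).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def grad_E(w):
    """Gradient of the quadratic stand-in error."""
    return A.T @ (A @ w - b)

def train(w0, eta=0.05, mu=0.1, steps=200):
    """Gradient descent with an adaptive momentum coefficient.

    eta: learning rate; mu: scale of the (illustrative) adaptive rule
    tau_k = mu * ||grad E(w_k)||, which shrinks as the iterates converge.
    """
    w, w_prev = w0.copy(), w0.copy()
    for _ in range(steps):
        g = grad_E(w)
        tau = mu * np.linalg.norm(g)          # adaptive momentum coefficient
        w_next = w - eta * g + tau * (w - w_prev)
        w_prev, w = w, w_next
    return w

w_star = train(np.zeros(2))
```

Making the momentum coefficient decay with the gradient norm is one standard way to obtain convergence guarantees for momentum methods, since the momentum perturbation becomes negligible near a minimizer.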
Abstract: Using the Faddeev-Jackiw (FJ) quantization method, this paper treats the CP^1 nonlinear sigma model with a Chern-Simons term. The generalized FJ brackets are obtained within this quantization framework, and they agree with the results obtained using Dirac's method.
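For context, the generalized FJ brackets referred to above come from the standard Faddeev-Jackiw construction for first-order Lagrangians; the sketch below states that general formula (not the paper's specific CP^1 computation).

```latex
% First-order (symplectic) form of the Lagrangian in variables \xi^i:
L = a_i(\xi)\,\dot{\xi}^i - V(\xi).

% The symplectic two-form matrix built from the canonical one-form a_i:
f_{ij} = \frac{\partial a_j}{\partial \xi^i} - \frac{\partial a_i}{\partial \xi^j}.

% When f_{ij} is invertible (after eliminating constraints), the
% generalized FJ brackets are its inverse:
\{\xi^i, \xi^j\}_{\mathrm{FJ}} = (f^{-1})^{ij},
```

When constraints render \(f_{ij}\) singular, they are solved and substituted back into the Lagrangian, and the procedure is iterated until an invertible symplectic matrix is reached; the resulting brackets coincide with the Dirac brackets of the constrained-Hamiltonian approach, which is the agreement the abstract reports.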