Journal Articles
1 article found
CONVERGENCE ANALYSIS OF A LOCALLY ACCELERATED PRECONDITIONED STEEPEST DESCENT METHOD FOR HERMITIAN-DEFINITE GENERALIZED EIGENVALUE PROBLEMS
Authors: Yunfeng Cai, Zhaojun Bai, John E. Pask, N. Sukumar. 《Journal of Computational Mathematics》 (SCIE, CSCD), 2018, Issue 5, pp. 739-760 (22 pages)
By extending the classical analysis techniques due to Samokish, Faddeev and Faddeeva, and Longsine and McCormick, among others, we prove the convergence of the preconditioned steepest descent with implicit deflation (PSD-id) method for solving Hermitian-definite generalized eigenvalue problems. Furthermore, we derive a nonasymptotic estimate of the rate of convergence of the PSD-id method. We show that with a proper choice of the shift, the indefinite shift-and-invert preconditioner is a locally accelerated preconditioner and is asymptotically optimal, which leads to superlinear convergence. Numerical examples are presented to verify the theoretical results on the convergence behavior of the PSD-id method for solving ill-conditioned Hermitian-definite generalized eigenvalue problems arising from electronic structure calculations. While rigorous and full-scale convergence proofs of preconditioned block steepest descent methods in practical use still largely elude us, we believe the theoretical results presented in this paper shed light on an improved understanding of the convergence behavior of these block methods.
Keywords: Eigenvalue problem; Steepest descent method; Preconditioning; Superlinear convergence
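To make the setting of the abstract concrete, the following is a minimal sketch of a single-vector preconditioned steepest descent iteration with a shift-and-invert preconditioner for the Hermitian-definite pencil A x = λ B x (A Hermitian, B Hermitian positive definite). The function names, the fixed shift sigma, and the two-dimensional Rayleigh-Ritz step are illustrative assumptions; this is not the PSD-id algorithm (with implicit deflation) analyzed in the paper.

```python
# Sketch: preconditioned steepest descent for the smallest eigenpair of A x = lambda B x,
# using K = (A - sigma*B)^{-1} as a shift-and-invert preconditioner (sigma is assumed given).
import numpy as np
from scipy.linalg import lu_factor, lu_solve, eigh

def rayleigh_quotient(A, B, x):
    # Generalized Rayleigh quotient (x*Ax)/(x*Bx); real for Hermitian A, HPD B.
    return (x.conj() @ (A @ x)).real / (x.conj() @ (B @ x)).real

def psd_smallest(A, B, x0, sigma, tol=1e-10, maxit=200):
    """Steepest descent with a shift-and-invert preconditioner (illustrative sketch)."""
    # Factor the (possibly indefinite) shifted matrix once; it serves as the preconditioner.
    lu = lu_factor(A - sigma * B)
    x = x0 / np.sqrt((x0.conj() @ (B @ x0)).real)        # B-normalize the starting vector
    for _ in range(maxit):
        rho = rayleigh_quotient(A, B, x)
        r = A @ x - rho * (B @ x)                         # eigenresidual
        if np.linalg.norm(r) <= tol * np.linalg.norm(A @ x):
            break
        p = lu_solve(lu, r)                               # preconditioned search direction
        # Rayleigh-Ritz on span{x, p}: solve the projected 2x2 generalized eigenproblem.
        S = np.column_stack([x, p])
        Ah, Bh = S.conj().T @ A @ S, S.conj().T @ B @ S
        _, V = eigh(Ah, Bh)                               # eigenvalues in ascending order
        x = S @ V[:, 0]                                   # Ritz vector for the smallest Ritz value
        x /= np.sqrt((x.conj() @ (B @ x)).real)           # re-normalize in the B-inner product
    return rayleigh_quotient(A, B, x), x
```

In this sketch the shift sigma is held fixed; the locally accelerated behavior discussed in the abstract concerns shifts chosen close to the target eigenvalue, and the paper's PSD-id method additionally deflates previously computed eigenvectors implicitly.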