Abstract
In this paper, we propose a globally convergent Polak-Ribière-Polyak (PRP) conjugate gradient method for nonconvex minimization of differentiable functions by employing an Armijo-type line search that is simpler and less demanding than those defined in [4,10]. A favorable property of this method is that the initial stepsize can be chosen as the one-dimensional minimizer of the quadratic model Φ(t) := f(x_k) + t g_k^T d_k + (1/2) t^2 d_k^T Q_k d_k, where Q_k is a positive definite matrix that carries some second-order information of the objective function f. Hence, this line search may make the stepsize t_k more easily accepted. Preliminary numerical results show that this method is efficient.
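To illustrate how the quadratic-model initial stepsize feeds a backtracking loop, here is a minimal Python sketch. The standard Armijo sufficient-decrease test, the parameters sigma and rho, and the diagonal choice of Q_k shown below are assumptions for the example, not necessarily the paper's exact line-search conditions from [4,10].

```python
import numpy as np

def initial_stepsize(g, d, Q):
    # Exact minimizer of the quadratic model
    #   Phi(t) = f(x_k) + t g^T d + (1/2) t^2 d^T Q d,
    # which is positive whenever d is a descent direction
    # and Q is positive definite.
    return -g.dot(d) / d.dot(Q.dot(d))

def armijo_line_search(f, x, g, d, Q, sigma=1e-4, rho=0.5, max_backtracks=50):
    # Armijo-type backtracking: start from the quadratic-model
    # minimizer and shrink geometrically until the (illustrative)
    # sufficient-decrease condition holds.
    t = initial_stepsize(g, d, Q)
    fx = f(x)
    gtd = g.dot(d)
    for _ in range(max_backtracks):
        if f(x + t * d) <= fx + sigma * t * gtd:
            return t
        t *= rho
    return t

# Toy usage: a convex quadratic along the steepest-descent direction.
f = lambda x: 0.5 * x.dot(x)
x = np.array([1.0, 2.0])
g = x            # gradient of f at x
d = -g           # descent direction
Q = np.eye(2)    # simple positive definite stand-in for Q_k
print(armijo_line_search(f, x, g, d, Q))  # 1.0: the model minimizer is accepted at once
```

Because the initial trial stepsize already minimizes a second-order model of f along d_k, it is often accepted without backtracking, which is the practical advantage the abstract points to.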
Funding
This work is supported by the Chinese NSF grant 60475042, the Guangxi NSF grant 0542043, and the Foundation of the Advanced Research Center of Zhongshan University and Hong Kong.
Keywords
Unconstrained optimization
conjugate gradient method
nonconvex minimization
global convergence
differentiable functions