Abstract
Difference quotients are used in place of the exact derivatives, which are difficult to compute, and a new algorithm is built on the idea of the reduced gradient method. The resulting reduced difference quotient algorithm is shown to be globally convergent when the objective function is uniformly convex.
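To illustrate the idea only (this is a minimal sketch under stated assumptions, not the paper's exact scheme), the following Python code combines a forward difference quotient approximation of the gradient with one reduced-gradient step for a linearly constrained problem min f(x) subject to Ax = b. The function names, fixed step size alpha, and difference step h are assumptions introduced for the example; the paper's method may use a different partitioning rule and line search.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x by forward difference quotients,
    replacing the exact derivative as described in the abstract."""
    fx = f(x)
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

def reduced_fd_step(f, x, A, b, basic, nonbasic, alpha=0.1, h=1e-6):
    """One reduced-gradient step for min f(x) subject to A x = b,
    using difference quotients instead of exact derivatives.
    `basic` / `nonbasic` partition the variables so that A[:, basic] is nonsingular."""
    g = fd_gradient(f, x, h)
    B, N = A[:, basic], A[:, nonbasic]
    # Reduced gradient: r = g_N - N^T B^{-T} g_B
    r = g[nonbasic] - N.T @ np.linalg.solve(B.T, g[basic])
    x_new = x.copy()
    x_new[nonbasic] = x[nonbasic] - alpha * r        # move the independent variables
    # Restore feasibility of the basic variables: x_B = B^{-1}(b - N x_N)
    x_new[basic] = np.linalg.solve(B, b - N @ x_new[nonbasic])
    return x_new

if __name__ == "__main__":
    # Hypothetical example: uniformly convex objective with one linear equality constraint.
    f = lambda x: float(x @ x)                 # f(x) = ||x||^2
    A = np.array([[1.0, 1.0, 1.0]])
    b = np.array([1.0])
    x = np.array([1.0, 0.0, 0.0])              # feasible starting point
    for _ in range(200):
        x = reduced_fd_step(f, x, A, b, basic=[0], nonbasic=[1, 2])
    print(x)                                   # approaches (1/3, 1/3, 1/3)
```

With this uniformly convex test problem, repeated steps drive the iterates toward the constrained minimizer (1/3, 1/3, 1/3), mirroring the global convergence behavior the paper establishes for its reduced difference quotient method.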
Source
Guangxi Sciences (《广西科学》)
CAS
1995, No. 2, pp. 6-9 (4 pages)
Funding
Guangxi University Youth Science Foundation (广西大学青年科学基金)
Keywords
constrained optimization
reduced difference quotient
difference quotient
global convergence
uniformly convex