Abstract
Parametric optimization problems must be solved repeatedly to track changes in the model parameters, so a large amount of empirical data accumulates over time. This paper proposes estimating the optimal solution corresponding to new parameter values directly from these empirical data. Because the optimization problems of interest typically involve disturbances or changes in several parameters, estimating the optimal solution can be treated as a multivariate scattered data fitting problem, for which a general solution is obtained with least squares support vector regression (LS-SVR). An infinite-node linear spline kernel, which requires no additional kernel parameters, is adopted, and an improved LS-SVR method with an integrated two-layer optimization framework is proposed, so that errors introduced by manual tuning of the algorithm's adjustable parameters are completely avoided. In numerical experiments on a parametric optimization model of a distillation column, the estimates produced by the improved LS-SVR are compared with those of nearest neighbor interpolation and an RBF network; the results show that the improved LS-SVR gives estimates closest to the true values. By replacing manual parameter tuning with an optimization algorithm, the improved LS-SVR method estimates the optimal solutions of parametric optimization problems efficiently and accurately, and thereby clearly improves the efficiency of repeated optimization.
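The record itself contains no code; purely as an illustration of the kind of estimator the abstract describes, the sketch below fits an LS-SVR model with a product-form linear spline kernel (infinite number of nodes) to scattered (parameter vector, optimal solution) samples and then predicts the optimum for new parameter values. The kernel expression, the regularization constant gamma, and the function names (spline_kernel, lssvr_fit, lssvr_predict) are assumptions made here for illustration, not the paper's implementation; in the paper's improved method, gamma would be chosen by the two-layer optimization framework rather than fixed by hand.

```python
import numpy as np

def spline_kernel(X, Z):
    """Linear spline kernel with an infinite number of nodes,
    taken as a product over input dimensions (one common form;
    the paper's exact definition may differ)."""
    K = np.ones((X.shape[0], Z.shape[0]))
    for d in range(X.shape[1]):
        x = X[:, d][:, None]                 # column of training inputs
        z = Z[:, d][None, :]                 # row of evaluation inputs
        mn = np.minimum(x, z)
        K *= 1 + x * z + x * z * mn - (x + z) / 2 * mn**2 + mn**3 / 3
    return K

def lssvr_fit(X, y, gamma=10.0):
    """Solve the standard LS-SVR dual linear system for (b, alpha)."""
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = spline_kernel(X, X) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                   # bias b, dual coefficients alpha

def lssvr_predict(X_train, alpha, b, X_new):
    """Estimate the optimal solution for new parameter vectors."""
    return spline_kernel(X_new, X_train) @ alpha + b

# Synthetic data standing in for accumulated (parameter, optimum) pairs.
X = np.random.rand(50, 3)                    # 50 historical parameter vectors
y = np.sin(X).sum(axis=1)                    # stand-in for stored optimal solutions
b, alpha = lssvr_fit(X, y, gamma=10.0)
estimate = lssvr_predict(X, alpha, b, np.random.rand(5, 3))
```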
Source
Computers and Applied Chemistry (《计算机与应用化学》), 2013, No. 11, pp. 1329-1332 (4 pages)
Indexed in: CAS, CSCD, Peking University Core Journals
Funding
Science and Technology Support Program of Hebei Province (13210302D)
High-level Talent Funding Project of the Hebei Academy of Sciences (2013045333-9)
Keywords
Least squares support vector regression
Scattered data fitting
Parametric optimization