Journal article

An optimal solution estimating method based on modified least squares support vector machines
Cited by: 2
Abstract  Parametric optimization problems must be solved repeatedly to adapt to changes in the model parameters, which accumulates a large amount of empirical data. This paper proposes estimating the optimal solution corresponding to new parameter values from this empirical data. Since the optimization problems under consideration often involve disturbances or changes in several parameters, estimating the optimal solution can be treated as a multivariate scattered-data fitting problem, and this paper gives a general solution to that problem using least squares support vector regression (LS-SVR). An infinite-node linear spline kernel, which requires no additional kernel parameters, is chosen, and an improved LS-SVR method integrating a two-layer optimization framework is proposed, completely avoiding the error introduced by manual tuning of the algorithm's adjustable parameters. Numerical experiments on a parametric optimization model of a distillation column compare the estimates of the improved LS-SVR with those of nearest-neighbor interpolation and an RBF network; the results show that the estimates from the improved LS-SVR method are closest to the true values. By replacing manual parameter tuning with an optimization algorithm, the improved LS-SVR method estimates the optimal solutions of parametric optimization problems accurately and efficiently, markedly improving the overall efficiency of the optimization.
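The core of the method described in the abstract can be sketched as follows: fit an LS-SVR model mapping parameter vectors to previously computed optimal solutions, then evaluate it at a new parameter vector. This is a minimal illustrative sketch, not the paper's implementation: the kernel is the standard infinite-node linear spline kernel for inputs scaled to [0, 1] (taken as a product over dimensions), and the regularization parameter `gamma` is a hypothetical fixed value here, whereas the paper selects it automatically via its two-layer optimization framework.

```python
import numpy as np

def linear_spline_kernel(X, Z):
    """Infinite-node linear spline kernel for inputs scaled to [0, 1];
    multivariate inputs use the product of the 1-D kernels over dimensions."""
    K = np.ones((X.shape[0], Z.shape[0]))
    for d in range(X.shape[1]):
        x = X[:, d][:, None]          # column vector of X's d-th coordinate
        z = Z[:, d][None, :]          # row vector of Z's d-th coordinate
        m = np.minimum(x, z)
        K *= 1 + x * z + x * z * m - (x + z) / 2 * m**2 + m**3 / 3
    return K

def lssvr_fit(X, y, gamma=100.0):
    """Solve the LS-SVR dual linear system
       [0  1^T     ] [b    ]   [0]
       [1  K + I/γ ] [alpha] = [y]
    for the bias b and dual weights alpha (gamma is an assumed value)."""
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = linear_spline_kernel(X, X) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]            # b, alpha

def lssvr_predict(X_train, alpha, b, X_new):
    """Estimate the optimal solution at new parameter vectors X_new."""
    return linear_spline_kernel(X_new, X_train) @ alpha + b

# Toy usage: the "empirical data" are 40 sampled parameter vectors with a
# smooth synthetic stand-in for the stored optimal-solution values.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 2))       # accumulated parameter vectors
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2    # surrogate optimal-solution values
b, alpha = lssvr_fit(X, y)
Xq = np.array([[0.3, 0.6]])               # a new parameter vector
print(lssvr_predict(X, alpha, b, Xq))     # estimated optimal solution at Xq
```

In practice the fitted estimate would serve as a warm start (or a direct answer) for the optimizer at the new parameter values, avoiding a full re-solve of the model.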
Source  Computers and Applied Chemistry (CAS, CSCD, Peking University Core), 2013, No. 11, pp. 1329-1332 (4 pages)
Funding  Hebei Province Science and Technology Support Program (13210302D); Hebei Academy of Sciences High-Level Talent Funding Project (2013045333-9)
Keywords  least squares support vector machines; scattered data fitting; parametric optimization
