
Least square support vector machine based on parameters optimization of clone programming-cross validation and inertial component forecasting (cited 10 times)
Abstract: To improve the generalization ability of the least squares support vector machine (LSSVM), an optimization algorithm combining clone programming with cross validation is applied to select the LSSVM parameters. Clone programming has both local and global search capability and efficiently avoids local minima; cross validation is an unbiased estimator and therefore suppresses overfitting and underfitting during training. In the optimization algorithm, the antibody-antigen affinity function is constructed from the cross-validation error, and the clone programming algorithm then searches for the optimal LSSVM parameters. A time-series forecasting model for inertial components is built with the optimized LSSVM regression model. Experimental results confirm the effectiveness of the optimization algorithm and the generalization ability of the forecasting model, which provides a basis for dynamic compensation and fault forecasting of inertial components.
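The scheme the abstract describes can be sketched in a few pieces: an LSSVM regression solved as a linear system, a k-fold cross-validation error used as the affinity measure, and a clonal-selection loop over the kernel width and regularization parameter. The following is a minimal NumPy sketch, not the authors' implementation; the population size, mutation scale, and search ranges are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # Gaussian kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # LSSVM dual problem (Suykens & Vandewalle, 1999) reduces to the
    # linear system [[0, 1^T], [1, K + I/gamma]] @ [b; alpha] = [0; y].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, alpha, b, sigma, X_test):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

def cv_error(X, y, gamma, sigma, k=5):
    # k-fold cross-validation MSE; the paper builds the antibody-antigen
    # affinity from this error (lower error = higher affinity).
    idx = np.arange(len(y))
    folds = [idx[i::k] for i in range(k)]   # interleaved folds
    err = 0.0
    for f in folds:
        tr = np.setdiff1d(idx, f)
        b, alpha = lssvm_fit(X[tr], y[tr], gamma, sigma)
        pred = lssvm_predict(X[tr], alpha, b, sigma, X[f])
        err += ((pred - y[f]) ** 2).mean()
    return err / k

def clonal_search(X, y, pop=8, gens=15, seed=0):
    # Simplified clonal selection over (log10 gamma, log10 sigma):
    # clone the fitter antibodies, hypermutate the clones, keep the best.
    rng = np.random.default_rng(seed)
    P = rng.uniform([-1, -1], [3, 1], size=(pop, 2))   # log-scale search box
    best, best_err = None, np.inf
    for _ in range(gens):
        errs = np.array([cv_error(X, y, 10 ** g, 10 ** s) for g, s in P])
        order = np.argsort(errs)
        if errs[order[0]] < best_err:
            best_err, best = errs[order[0]], P[order[0]].copy()
        clones = np.repeat(P[order[: pop // 2]], 2, axis=0)
        clones += rng.normal(0.0, 0.2, clones.shape)   # hypermutation
        P = np.clip(clones, [-1, -1], [3, 1])
    return 10 ** best[0], 10 ** best[1], best_err
```

For a time-series model in the paper's spirit, the rows of `X` would be lagged windows of the inertial-component signal and `y` the next sample; the full algorithm also carries an antibody population structure and clone sizing rules that this sketch collapses into a fixed elite/clone scheme.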
Source: Journal of Xidian University (EI, CAS, CSCD, Peking University Core), 2007, No. 3, pp. 428-432, 437 (6 pages)
Funding: National ministry pre-research project (203020301)
Keywords: clone programming; cross validation; parameter optimization; least squares support vector machine; inertial component forecasting

References (11)

  • 1 Chen Chen, Pei Changxin, Zhu Changhua, et al. A Time Series Model for Accurately Predicting the WLAN Traffic[J]. Journal of Xidian University, 2006, 33(3): 337-340.
  • 2 Chapelle O, Vapnik V N, Bousquet O, et al. Choosing Multiple Parameters for Support Vector Machines[J]. Machine Learning, 2002, 46(1): 131-159.
  • 3 Vapnik V N, Chapelle O. Bounds on Error Expectation for Support Vector Machine[J]. Neural Computation, 2000, 12(9): 2013-2036.
  • 4 Zhu Yongsheng, Wang Chengdong, Zhang Youyun. Study on the Performance of Support Vector Machines with a Quadratic Loss Function[J]. Chinese Journal of Computers, 2003, 26(8): 982-989.
  • 5 Liu Wei, Li Xiaoping, Mao Hui'ou, Chai Tianyou. Neural Network Cost Forecasting Model Based on a Real-Coded Genetic Algorithm and Its Application[J]. Control Theory & Applications, 2004, 21(3): 423-426.
  • 6 Moore A W, Lee M S. Efficient Algorithms for Minimizing Cross Validation Error[C]//International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers, 1994: 190-198.
  • 7 Lee M M S, Keerthi S S, Ong C J, et al. An Efficient Method for Computing Leave-one-out Error in Support Vector Machines with Gaussian Kernels[J]. IEEE Trans on Neural Networks, 2004, 15(3): 750-757.
  • 8 Du Haifeng, Gong Maoguo, Jiao Licheng, Liu Ruochen. A Novel Algorithm of Artificial Immune System for High-dimensional Function Numerical Optimization[J]. Progress in Natural Science, 2005, 15(5): 463-471.
  • 9 Kim J, Bentley P. Towards an Artificial Immune System for Network Intrusion Detection: An Investigation of Dynamic Clonal Selection[C]//Proceedings of the Congress on Evolutionary Computation. Washington: IEEE Press, 2002: 1015-1020.
  • 10 Suykens J A K, Vandewalle J. Least Squares Support Vector Machine Classifiers[J]. Neural Processing Letters, 1999, 9(3): 293-300.


Citation statistics: co-cited references: 47; references shared with citing articles: 90; citing articles: 10; second-level citing articles: 81.
