
基于PSO的最小二乘支持向量机稀疏化算法  (Cited by: 3)

Optimal sparseness approach for least square support vector machine based on PSO
Abstract  The least squares support vector machine (LSSVM) loses the sparseness of the standard SVM, and the classic iterative pruning algorithms for restoring it tend to get trapped in local convergence of the performance index. To address this, a particle swarm optimization (PSO) based sparseness approach for LSSVM models is proposed. The sparsification of the LSSVM is first formulated as a general optimization problem in which the root-mean-square error (RMSE) between the predicted outputs and the validation samples is the objective to be minimized and the pruning rate ε(%) of the training set is the optimization variable; PSO is then applied to solve this nonlinear optimization problem. An LSSVM model of the carbon content in fly ash is taken as a case study, using operating data collected from a large-scale coal-fired power plant. The results show that the proposed approach overcomes the local-convergence problem of the classic methods and obtains an optimal pruning rate of the training data set, yielding better prediction and generalization performance than the classic methods.
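To make the procedure described in the abstract concrete, the following sketch shows the general idea in Python: the validation RMSE of a pruned-and-retrained LSSVM is the objective, the pruning rate ε(%) is the single optimization variable, and a basic PSO searches for the rate that minimizes that RMSE. This is a minimal illustration, not the authors' implementation: the RBF kernel width, regularization constant, PSO settings and the synthetic data set are all assumed for demonstration, and the one-shot removal of the smallest-|α| samples stands in for the classic iterative pruning step.

```python
# Minimal sketch (not the paper's code): PSO over the LSSVM pruning rate eps,
# with validation RMSE as the objective. Kernel width, gamma, PSO settings and
# the toy data are illustrative assumptions.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    """Solve the LSSVM dual linear system for (alpha, b)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]                       # alpha, b

def rmse_after_pruning(eps, X_tr, y_tr, X_val, y_val, sigma=1.0):
    """Prune eps% of the training samples (smallest |alpha|), retrain, return validation RMSE."""
    alpha, _ = train_lssvm(X_tr, y_tr, sigma=sigma)
    keep = max(2, int(round(len(y_tr) * (1.0 - eps / 100.0))))
    idx = np.argsort(-np.abs(alpha))[:keep]      # keep the most informative samples
    Xs, ys = X_tr[idx], y_tr[idx]
    alpha_s, b_s = train_lssvm(Xs, ys, sigma=sigma)
    y_pred = rbf_kernel(X_val, Xs, sigma) @ alpha_s + b_s
    return np.sqrt(np.mean((y_pred - y_val) ** 2))

def pso_optimal_pruning_rate(objective, eps_max=90.0, n_particles=15, n_iter=40,
                             w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic PSO over the scalar pruning rate eps in [0, eps_max]."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, eps_max, n_particles)
    vel = np.zeros(n_particles)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    g = pbest[np.argmin(pbest_val)]              # global best pruning rate
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, 0.0, eps_max)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)]
    return g, pbest_val.min()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, (200, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)   # toy stand-in for plant data
    X_tr, y_tr, X_val, y_val = X[:150], y[:150], X[150:], y[150:]
    obj = lambda eps: rmse_after_pruning(eps, X_tr, y_tr, X_val, y_val)
    best_eps, best_rmse = pso_optimal_pruning_rate(obj)
    print(f"optimal pruning rate ~= {best_eps:.1f}%, validation RMSE = {best_rmse:.4f}")
```

In the paper the same search is carried out on the fly-ash carbon-content model built from real plant operating data; only the objective function (the model and its data) would change.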
Source  Engineering Journal of Wuhan University (武汉大学学报(工学版)), indexed in CAS, CSCD and the Peking University Core Journals list, 2016, No. 6, pp. 955-960 (6 pages)
Funding  National Natural Science Foundation of China (No. 51475337); Natural Science Foundation of Hubei Province (No. 2011CDB277)
Keywords  least squares support vector machine (LSSVM); optimal sparseness; particle swarm optimization (PSO); local convergence
Related Literature

References: 7

Secondary references: 59

1. 王海峰, 胡德金. 最小二乘支持向量机的一种稀疏化算法 [A sparseness algorithm for least squares support vector machine] [J]. 计算机工程与应用 (Computer Engineering and Applications), 2005, 41(33): 68-70. Cited by: 11
2. 张浩然, 汪晓东. 回归最小二乘支持向量机的增量和在线式学习算法 [Incremental and online learning algorithms for regression least squares support vector machine] [J]. 计算机学报 (Chinese Journal of Computers), 2006, 29(3): 400-406. Cited by: 111
3. 陈爱军, 宋执环, 李平. 基于矢量基学习的最小二乘支持向量机建模 [LSSVM modeling based on vector base learning] [J]. 控制理论与应用 (Control Theory & Applications), 2007, 24(1): 1-5. Cited by: 21
4. 甘良志, 孙宗海, 孙优贤. 稀疏最小二乘支持向量机 [Sparse least squares support vector machine] [J]. 浙江大学学报(工学版) (Journal of Zhejiang University: Engineering Science), 2007, 41(2): 245-248. Cited by: 27
5. Kuh A, Wilde P D. Comments on "Pruning error minimization in least squares support vector machines" [J]. IEEE Transactions on Neural Networks, 2007, 18(2): 606-609.
6. Zeng X Y, Chen X W. SMO-based pruning methods for sparse least squares support vector machines [J]. IEEE Transactions on Neural Networks, 2005, 16(6): 1541-1546.
7. Jiao L C, Bo L F, Wang L. Fast sparse approximation for least squares support vector machine [J]. IEEE Transactions on Neural Networks, 2007, 18(3): 685-697.
8. Zhao Y, Keong K C. Fast leave-one-out evaluation and improvement on inference for LS-SVMs [C] // Proceedings of the 17th International Conference on Pattern Recognition. Los Alamitos, CA, USA, 2004: 494-497.
9. An S J, Liu W Q, Venkatesh S. Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression [J]. Pattern Recognition, 2007, 40(8): 2154-2162.
10. Vapnik V N. The Nature of Statistical Learning Theory [M]. New York: Springer-Verlag, 1995.

Co-citing literature: 67

Co-cited literature: 31

Citing literature: 3

Secondary citing literature: 8
