
Improved Scheme for Fast Approximation to Least Squares Support Vector Regression

Abstract: The solution of the standard least squares support vector regression (LSSVR) lacks sparseness, which limits real-time performance and, to a certain degree, hampers wide application. To overcome this obstacle, a scheme named I2FSA-LSSVR is proposed. Compared with previous approximate algorithms, it not only adopts the partial reduction strategy but also considers the influence between the previously selected support vectors and the support vector to be selected when computing the support weights. As a result, I2FSA-LSSVR reduces the number of support vectors and enhances real-time performance. To confirm the feasibility and effectiveness of the proposed algorithm, experiments on benchmark data sets are conducted, whose results support the presented I2FSA-LSSVR.
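The lack of sparseness mentioned in the abstract comes from the structure of standard LSSVR: the dual problem reduces to a single linear system whose solution assigns a (generally nonzero) weight to every training sample, so every sample acts as a support vector. A minimal sketch, assuming a Gaussian kernel and illustrative parameter values (the function names and settings below are hypothetical, not from the paper):

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the standard LSSVR dual linear system for (alpha, b).

    LSSVR training reduces to one linear system:
        [ 0   1^T         ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma ] [ alpha ] = [ y ]
    Because K + I/gamma is positive definite, the solution is unique,
    and the alpha weights are generally all nonzero -- the lack of
    sparseness that fast-approximation schemes try to overcome.
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # support weights alpha, bias b

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy 1-D regression: essentially every alpha is nonzero,
# so every training point is a support vector.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
alpha, b = lssvr_fit(X, y)
print("fraction of nonzero weights:", np.mean(np.abs(alpha) > 1e-6))
```

Approximation schemes such as the one proposed here select only a subset of samples as support vectors, shrinking the kernel expansion and thus the prediction cost.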
Source: Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2014, No. 4, pp. 413-419
Funding: Supported by the National Natural Science Foundation of China (51006052)
Keywords: support vector regression; kernel method; least squares; sparseness
