

Tunable Kernel Model Based on Orthogonal Forward Selection with Tree Structure Search
Abstract: Orthogonal Forward Selection based on the Leave-One-Out criterion (OFS-LOO) is a recently proposed data-modeling method that produces robust kernel regression models with tunable parameters. OFS-LOO follows a greedy scheme: a global optimization algorithm tunes the parameters of each regressor in turn, adding model terms one at a time so as to reduce the LOO criterion. However, the greedy strategy keeps only the current best solution as the parameters of the new regressor and ignores the effect of that choice on later stages, even though the selection of a particular regressor strongly influences the tuning of subsequent ones; this harms the sparsity of the resulting model. This paper proposes a novel tree-structure search within the OFS-LOO framework. When each term of the kernel model is selected, the Repeated Weighted Boosting Search (RWBS) algorithm is applied, and the multiple local optima found by RWBS are all retained as candidates for the kernel parameters. The new method seeks a compromise between an exhaustive search over all basis-function parameters and the conventional one-best-at-a-time OFS-LOO. Numerical results show that, compared with the conventional method, the new approach produces kernel models that are sparser and generalize better.
Source: Journal of Signal Processing (《信号处理》, CSCD, Peking University core journal), 2011, No. 10, pp. 1576-1580 (5 pages)
Funding: National Natural Science Foundation of China (11026145, 61102103, 61071188, 90920005); Natural Science Foundation of Hubei Province (2010CDB04205, 2009CDB077); Fundamental Research Funds for the Central Universities (CUG090112, CUG110407, CCNU10A01013); Natural Science Youth Foundation of the Hebei Education Department (2010258)
Keywords: orthogonal forward selection; kernel model; tree structure search
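The selection procedure described in the abstract, keeping several candidate parameter sets per stage instead of a single greedy optimum, can be sketched as a beam search over kernel terms. The sketch below is illustrative only, not the paper's method: it replaces the RWBS optimizer with random candidate draws, uses Gaussian basis functions, and scores each partial model with the standard hat-matrix shortcut for the LOO error of a linear-in-parameters model. All names (`tree_ofs_loo`, `beam`, `n_cand`, etc.) are hypothetical.

```python
import numpy as np

def loo_mse(Phi, y):
    # LOO mean squared error for a linear-in-parameters model,
    # via the hat-matrix shortcut e_loo_i = e_i / (1 - h_ii).
    H = Phi @ np.linalg.pinv(Phi)              # hat matrix of the LS fit
    resid = y - H @ y                          # ordinary residuals
    d = np.clip(1.0 - np.diag(H), 1e-12, None) # guard against division by zero
    return float(np.mean((resid / d) ** 2))

def gauss(x, c, w):
    # Gaussian basis function with center c and width w.
    return np.exp(-((x - c) ** 2) / (2.0 * w ** 2))

def tree_ofs_loo(x, y, n_terms=3, beam=3, n_cand=8, rng=None):
    # Beam-search variant of orthogonal forward selection:
    # at each stage keep the `beam` best partial models instead of one.
    rng = np.random.default_rng(rng)
    models = [([], np.inf)]                    # (list of (center, width), LOO score)
    for _ in range(n_terms):
        pool = []
        for params, _ in models:
            # Stand-in for the multiple RWBS local optima: random candidates.
            for _ in range(n_cand):
                c = rng.uniform(x.min(), x.max())
                w = rng.uniform(0.1, 1.0) * (x.max() - x.min())
                trial = params + [(c, w)]
                Phi = np.column_stack([gauss(x, ci, wi) for ci, wi in trial])
                pool.append((trial, loo_mse(Phi, y)))
        pool.sort(key=lambda t: t[1])          # best LOO score first
        models = pool[:beam]                   # prune the tree to the beam width
    return models[0]

# Toy usage: fit a sinc curve with four kernel terms.
x = np.linspace(-3.0, 3.0, 80)
y = np.sinc(x)
params, score = tree_ofs_loo(x, y, n_terms=4, beam=3, rng=0)
```

Setting `beam=1` recovers the purely greedy behavior the paper criticizes, while a larger beam approximates the tree search at proportionally higher cost.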

