
Neural network ensembles based on tabu search (cited by 3)
Abstract: A method of implementing the AdaBoosting algorithm for neural network ensembles with tabu search is presented. The weight vector of the AdaBoosting algorithm is taken as the optimization object: several of its elements are perturbed in each step, and the perturbed elements are declared tabu, so that tabu search controls the direction of the optimization. The approximation error serves as the fitness function, and the search terminates once the approximation error is satisfied or the maximum number of iterations is reached. Because tabu search avoids revisiting recent solutions, it improves search efficiency and helps the algorithm converge. An example verifies the feasibility of the method.
Source: Journal of Shaanxi University of Technology (Natural Science Edition), 2007, No. 1, pp. 24-26, 32 (4 pages).
Keywords: neural network ensembles; AdaBoosting algorithm; tabu search
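The procedure described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's implementation: the function names (`ensemble_error`, `tabu_search`), the toy ±1-vote ensemble, and the parameters (`step`, `tabu_tenure`, `max_iters`) are all assumptions introduced here. It shows the core loop the abstract describes: perturb elements of the ensemble weight vector, mark the perturbed element tabu for a few iterations to prevent the search from cycling back, use the classification (approximation) error as the fitness function, and stop when the error target is met or the iteration budget is exhausted.

```python
import random

def ensemble_error(weights, predictions, labels):
    """Fraction of samples misclassified by the weighted vote of the
    ensemble. predictions[k][j] is weak learner k's +/-1 output on sample j."""
    wrong = 0
    for j, y in enumerate(labels):
        vote = sum(w * p[j] for w, p in zip(weights, predictions))
        if (1 if vote >= 0 else -1) != y:
            wrong += 1
    return wrong / len(labels)

def tabu_search(weights, predictions, labels,
                max_iters=100, tabu_tenure=2, step=0.1, seed=0):
    """Tabu search over the ensemble weight vector: perturb one element per
    iteration, mark it tabu for tabu_tenure iterations, and keep the best
    weight vector seen (no aspiration criterion in this sketch)."""
    rng = random.Random(seed)
    current = list(weights)
    best = list(current)
    best_err = ensemble_error(current, predictions, labels)
    tabu_until = {}  # element index -> last iteration at which it is still tabu
    for it in range(max_iters):
        # Build the neighbourhood from non-tabu elements only.
        moves = []
        for idx in range(len(current)):
            if tabu_until.get(idx, -1) >= it:
                continue
            neighbour = list(current)
            neighbour[idx] += rng.uniform(-step, step)
            moves.append((ensemble_error(neighbour, predictions, labels),
                          idx, neighbour))
        if not moves:
            break  # every element is currently tabu; nothing admissible to try
        err, idx, neighbour = min(moves)  # best admissible move, even if worse
        current = neighbour
        tabu_until[idx] = it + tabu_tenure  # forbid re-perturbing this element
        if err < best_err:
            best, best_err = list(neighbour), err
        if best_err == 0.0:
            break  # approximation-error target met
    return best, best_err
```

Accepting the best admissible move even when it is worse than the current solution is what lets the method escape local minima, while the tabu list is what prevents it from immediately undoing that move and oscillating, which is the "avoids revisiting" property the abstract credits for the improved convergence.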
