
RESEARCH ON MODELLING SELECTIVE ENSEMBLE EXTREME LEARNING MACHINE CLASSIFIER (Cited by: 3)
Abstract  The extreme learning machine (ELM) has the advantage of an extremely fast training process, but in practical classification applications the accuracy and stability of an ELM classifier do not always meet requirements. To address this problem, when ELM is used for classification we introduce an evaluation index on the information content of the training results to improve the solution of the output weight matrix, and we add a competitive mechanism over hidden-layer output matrices to improve the stability of ELM. To further raise the classification accuracy of ELM, we draw on the theory of neural network ensembles and propose a selective ensemble ELM classifier. The ensemble method adopts an improved Bagging procedure and introduces a similarity evaluation method based on each subnetwork's parameter vector together with a selective ensemble strategy. Finally, tests on UCI datasets show that, compared with Bagging and the traditional all-member ensemble, the proposed method achieves better classification performance.
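The basic ELM training step underlying the abstract — hidden-layer weights drawn at random and output weights solved in one shot via the Moore-Penrose pseudo-inverse — can be sketched as follows. This is a minimal NumPy illustration, not the paper's improved solution method; the function names and the tanh activation are assumptions for the sketch.

```python
import numpy as np

def train_elm(X, Y, n_hidden, rng):
    """Train a basic ELM: X is (n, d) inputs, Y is (n, c) one-hot targets."""
    # Input weights and biases are drawn randomly and never updated
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                 # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y           # output weights via pseudo-inverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    # Class = argmax of the network's output scores
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```

Because only `beta` is fitted, and by a single least-squares solve, training is very fast; the price is that the random hidden layer makes individual ELMs unstable from run to run, which is exactly what motivates the competitive mechanism and the ensemble in this paper.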
Source: Computer Applications and Software (CSCD), 2016, No. 9, pp. 279-283 (5 pages).
Funding: Public Welfare Research Project of the State Administration of Grain (201313012).
Keywords: extreme learning machine; neural network; selective ensemble; Bagging
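The Bagging-based selective ensemble described in the abstract can be illustrated with a simplified, self-contained sketch: train several ELMs on bootstrap resamples, select a subset, and combine the survivors by majority vote. Note the hedges: this uses plain Bagging rather than the paper's improved variant, and selection by out-of-bag accuracy is a stand-in for the parameter-vector similarity strategy, which is not reproduced here.

```python
import numpy as np

def train_elm(X, Y, n_hidden, rng):
    # Single ELM: random hidden layer, least-squares output weights
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    beta = np.linalg.pinv(np.tanh(X @ W + b)) @ Y
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

def selective_elm_ensemble(X, y, n_models, n_hidden, rng):
    """Train ELMs on bootstrap resamples and keep the stronger members.
    (Out-of-bag accuracy is a placeholder selection rule, not the
    paper's similarity-based strategy.)"""
    Y = np.eye(y.max() + 1)[y]
    members = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), len(X))        # bootstrap sample
        oob = np.setdiff1d(np.arange(len(X)), idx)   # out-of-bag rows
        W, b, beta = train_elm(X[idx], Y[idx], n_hidden, rng)
        acc = (predict_elm(X[oob], W, b, beta) == y[oob]).mean() if len(oob) else 0.0
        members.append((acc, (W, b, beta)))
    thr = np.median([a for a, _ in members])
    return [m for a, m in members if a >= thr]       # selected subset only

def ensemble_predict(models, X, n_classes):
    votes = np.zeros((len(X), n_classes))            # majority vote over members
    for W, b, beta in models:
        votes[np.arange(len(X)), predict_elm(X, W, b, beta)] += 1
    return votes.argmax(axis=1)
```

The selective step reflects the "many could be better than all" idea from the neural network ensemble literature the paper builds on: discarding weak or redundant members can beat combining every trained network.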
