
A Heterogeneous Ensemble of Extreme Learning Machines with Correntropy and Negative Correlation (Cited by: 1)

Abstract: The Extreme Learning Machine (ELM) is an effective learning algorithm for a Single-Layer Feedforward Network (SLFN). It performs well on some problems owing to its fast learning speed. In practical applications, however, its performance can be degraded by noise in the training data. To tackle the noise issue, we propose a novel heterogeneous ensemble of ELMs in this article. Specifically, correntropy is used to make the model insensitive to outliers, while Negative Correlation Learning (NCL) is implemented to enhance diversity among the ensemble members. The proposed Heterogeneous Ensemble of ELMs (HE2LM) for classification combines different ELM algorithms, including the Regularized ELM (RELM), the Kernel ELM (KELM), and the L2-norm-optimized ELM (ELML2). The ensemble is constructed by training a randomly selected ELM classifier on a subset of the training data obtained through random resampling. The class label of unseen data is then predicted with a maximum weighted sum approach. After splitting the training data into subsets, the proposed HE2LM is tested on classification and regression tasks over real-world benchmark datasets and synthetic datasets. The simulation results show that, compared with other algorithms, the proposed method achieves higher prediction accuracy, better generalization, and lower sensitivity to outliers.
Source: Tsinghua Science and Technology (SCIE, EI, CAS, CSCD), 2017, Issue 6, pp. 691-701 (11 pages).
Funding: Supported by the National Natural Science Foundation of China (Nos. 61174103 and 61603032), the National Key Technologies R&D Program of China (No. 2015BAK38B01), the National Key Research and Development Program of China (No. 2017YFB0702300), the China Postdoctoral Science Foundation (No. 2016M590048), and the University of Science and Technology Beijing–Taipei University of Technology Joint Research Program (TW201705).
Keywords: Extreme Learning Machine (ELM); ensemble classification; correntropy; negative correlation
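The abstract describes the ensemble procedure only at a high level. The following Python sketch illustrates the general idea under stated assumptions, not the paper's implementation: base learners are drawn at random from a pool of ELM variants, each is trained on a bootstrap resample of the data, and predictions are combined by a weighted sum. The class names (RegularizedELM, KernelELM, HeterogeneousELMEnsemble), the training-accuracy weights, and all hyperparameters are illustrative assumptions; the correntropy objective and the negative-correlation penalty used in the paper are omitted here.

import numpy as np

class RegularizedELM:
    # Single-hidden-layer feedforward network with random hidden weights;
    # output weights solved by ridge-regularized least squares (RELM-style).
    def __init__(self, n_hidden=50, C=1.0, rng=None):
        self.n_hidden, self.C = n_hidden, C
        self.rng = rng if rng is not None else np.random.default_rng()

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, Y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # beta = (H^T H + I/C)^(-1) H^T Y
        self.beta = np.linalg.solve(H.T @ H + np.eye(self.n_hidden) / self.C, H.T @ Y)
        return self

    def decision(self, X):
        return self._hidden(X) @ self.beta

class KernelELM:
    # Kernel ELM with an RBF kernel; no explicit random hidden layer.
    def __init__(self, gamma=0.1, C=1.0):
        self.gamma, self.C = gamma, C

    def _kernel(self, A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-self.gamma * d2)

    def fit(self, X, Y):
        self.X_train = X
        K = self._kernel(X, X)
        self.alpha = np.linalg.solve(K + np.eye(len(X)) / self.C, Y)
        return self

    def decision(self, X):
        return self._kernel(X, self.X_train) @ self.alpha

class HeterogeneousELMEnsemble:
    # Bootstrap-trained pool of mixed ELM variants; class label chosen by the
    # maximum of a weighted sum of the members' decision scores.
    def __init__(self, n_estimators=10, seed=0):
        self.n_estimators = n_estimators
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.classes_ = np.unique(y)
        Y = (y[:, None] == self.classes_[None, :]).astype(float)   # one-hot targets
        self.members, self.weights = [], []
        for _ in range(self.n_estimators):
            idx = self.rng.integers(0, len(X), size=len(X))         # bootstrap resample
            if self.rng.integers(2) == 0:                           # pick a variant at random
                model = RegularizedELM(rng=self.rng)
            else:
                model = KernelELM()
            model.fit(X[idx], Y[idx])
            acc = (model.decision(X).argmax(axis=1) == Y.argmax(axis=1)).mean()
            self.members.append(model)
            self.weights.append(acc)    # assumed weighting; the paper's scheme may differ
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        score = sum(w * m.decision(X) for w, m in zip(self.weights, self.members))
        return self.classes_[score.argmax(axis=1)]

# Toy usage: a linearly separable two-class problem.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = HeterogeneousELMEnsemble(n_estimators=5).fit(X, y)
print("training accuracy:", (clf.predict(X) == y).mean())

Weighting members by training accuracy is only one plausible reading of the "maximum weighted sum" rule mentioned in the abstract; the published method also uses an ELML2 variant and NCL-driven diversity, which this sketch does not reproduce.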