Funding: Supported by the National Natural Science Foundation of China (60133010)
Abstract: This paper describes negative correlation learning for designing neural network ensembles. Negative correlation learning is first analysed in terms of minimising mutual information on a regression task. By minimising the mutual information between variables extracted by two neural networks, the networks are forced to convey different information about features of their input. Based on decision boundaries and correct response sets, negative correlation learning is then studied on two pattern classification problems. The purpose of examining the decision boundaries and the correct response sets is not only to illustrate the learning behaviour of negative correlation learning, but also to cast light on how to design more effective neural network ensembles. The experimental results show that the decision boundary of the neural network ensemble trained by negative correlation learning is almost as good as the optimum decision boundary.
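The training rule described above can be illustrated with a minimal sketch (not the paper's implementation): each member network is penalised for agreeing with the ensemble mean, via the standard NCL gradient (f_i - y) - lambda * (f_i - f_bar). The toy sine-regression task, fixed random hidden layers, and all hyperparameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumption): y = sin(x) + noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

M = 5       # ensemble size
lam = 0.5   # NCL penalty strength (lam = 0 recovers independent training)
H = 10      # hidden units per network
lr = 0.05

# One-hidden-layer nets with fixed random hidden layers; only the
# output weights are trained, which keeps the sketch short.
W = [rng.normal(size=(1, H)) for _ in range(M)]
b = [rng.normal(size=H) for _ in range(M)]
beta = [np.zeros(H) for _ in range(M)]

def hidden(i, X):
    return np.tanh(X @ W[i] + b[i])

for epoch in range(1000):
    Phi = [hidden(i, X) for i in range(M)]
    F = np.stack([Phi[i] @ beta[i] for i in range(M)])  # (M, N) member outputs
    f_bar = F.mean(axis=0)                              # ensemble output
    for i in range(M):
        # NCL gradient: error term minus the negative-correlation penalty,
        # which pushes f_i away from the ensemble mean.
        delta = (F[i] - y) - lam * (F[i] - f_bar)
        beta[i] -= lr * Phi[i].T @ delta / len(y)

pred = np.mean([hidden(i, X) @ beta[i] for i in range(M)], axis=0)
mse = np.mean((pred - y) ** 2)
```

Note the design choice: with lam > 0 each member's error term is traded off against decorrelation from the rest of the ensemble, so the members specialise on different parts of the function rather than all converging to the same fit.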
Funding: Supported by the National Natural Science Foundation of China (Nos. 61174103 and 61603032), the National Key Technologies R&D Program of China (No. 2015BAK38B01), the National Key Research and Development Program of China (No. 2017YFB0702300), the China Postdoctoral Science Foundation (No. 2016M590048), and the University of Science and Technology Beijing–Taipei University of Technology Joint Research Program (TW201705)
Abstract: The Extreme Learning Machine (ELM) is an effective learning algorithm for a Single-Layer Feedforward Network (SLFN). It performs well on many problems due to its fast learning speed. In practical applications, however, its performance can be degraded by noise in the training data. To tackle the noise issue, we propose a novel heterogeneous ensemble of ELMs. Specifically, correntropy is used to make performance insensitive to outliers, while Negative Correlation Learning (NCL) is employed to enhance diversity among the ensemble members. The proposed Heterogeneous Ensemble of ELMs (HE2LM) for classification combines different ELM algorithms, including the Regularized ELM (RELM), the Kernel ELM (KELM), and the L2-norm-optimized ELM (ELML2). The ensemble is constructed by training each randomly selected ELM classifier on a subset of the training data drawn by random resampling. The class label of unseen data is then predicted using a maximum weighted sum approach. The proposed HE2LM is tested through classification and regression tasks on real-world benchmark datasets and synthetic datasets. The simulation results show that, compared with other algorithms, our proposed method achieves higher prediction accuracy, better generalization, and less sensitivity to outliers.
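The construction described in the abstract (random resampling, a heterogeneous pool of ELM variants, and maximum-weighted-sum prediction) can be sketched as follows. This is an illustrative assumption, not the paper's HE2LM: all members here are regularized ELMs with varying hidden size and regularization standing in for the RELM/KELM/ELML2 pool, member weights are taken as training accuracy, and the two-blob dataset is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2-class problem (assumption): two Gaussian blobs.
N = 300
X = np.vstack([rng.normal(-1, 1, (N // 2, 2)),
               rng.normal(1, 1, (N // 2, 2))])
y = np.array([0] * (N // 2) + [1] * (N // 2))
T = np.eye(2)[y]  # one-hot targets

def train_relm(X, T, n_hidden, C, rng):
    """Regularized ELM: random hidden layer, ridge solution for output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    Hm = np.tanh(X @ W + b)
    beta = np.linalg.solve(Hm.T @ Hm + np.eye(n_hidden) / C, Hm.T @ T)
    return lambda Xq: np.tanh(Xq @ W + b) @ beta

# Heterogeneous pool: varying hidden size and regularization stand in
# for the RELM / KELM / ELML2 variants named in the abstract.
configs = [(20, 1.0), (40, 10.0), (60, 0.1), (30, 5.0), (50, 2.0)]
members, weights = [], []
for n_hidden, C in configs:
    idx = rng.integers(0, N, size=N)           # random resampling (bootstrap)
    model = train_relm(X[idx], T[idx], n_hidden, C, rng)
    members.append(model)
    weights.append((model(X).argmax(1) == y).mean())  # accuracy as weight

def predict(Xq):
    # Maximum weighted sum: accumulate weighted member scores, take argmax.
    scores = sum(w * m(Xq) for w, m in zip(weights, members))
    return scores.argmax(axis=1)

acc = (predict(X) == y).mean()
```

Resampling gives each member a different view of the data, while the heterogeneous configurations add model-level diversity; the weighted sum then lets more reliable members dominate the vote.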