Abstract
A new method is presented for incrementally constructing a heterogeneous neural network ensemble (NNE) based on negatively correlated heterogeneous networks. When training a member network, the method adjusts not only the connection weights but also the network architecture dynamically, improving the accuracy of individual networks while increasing the diversity among members and reducing the generalization error of the ensemble. The method consists of two parts: constructing the best heterogeneous neural network (BHNN) and constructing the heterogeneous neural network ensemble (HNNE). The former dynamically constructs multiple best networks based on negative correlation learning; the latter incrementally constructs the NNE from the trained best networks. Using predefined thresholds on the network generalization error and the ensemble generalization error, the whole ensemble process completes automatically, without prior specification of the member network structures. Experimental results on regression and classification problems show that, relative to a single network, the method reduces the error rate on test sets by 17% to 85%, and it also improves to varying degrees on existing ensemble methods such as Boosting and Bagging.
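The negative correlation learning that BHNN builds on trains each member network against its own error plus a correlation penalty, so that members are pushed to err in different directions. A minimal sketch of the per-member loss is given below; the function name `ncl_loss` and the toy outputs are illustrative, not from the paper:

```python
import numpy as np

def ncl_loss(outputs, target, lam=0.5):
    """Per-member negative correlation learning loss.

    outputs: shape (M,), predictions of the M ensemble members
    target:  scalar ground truth
    lam:     penalty strength lambda (lam = 0 recovers plain squared error)

    Member i's loss is 0.5*(F_i - y)^2 + lam * p_i, with the penalty
    p_i = (F_i - Fbar) * sum_{j != i} (F_j - Fbar).  Because deviations
    from the ensemble mean sum to zero, p_i simplifies to -(F_i - Fbar)^2,
    so the penalty rewards members that deviate from the ensemble mean.
    """
    fbar = outputs.mean()
    penalty = -(outputs - fbar) ** 2
    return 0.5 * (outputs - target) ** 2 + lam * penalty

# Three members predicting a target of 1.0: the member closest to the
# ensemble mean gets the smallest (least negative) penalty.
outs = np.array([0.9, 1.1, 1.5])
losses = ncl_loss(outs, target=1.0, lam=0.5)
```

Minimizing this loss by gradient descent trades individual accuracy against diversity via `lam`, which is the mechanism the abstract credits for lowering the ensemble's generalization error.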
Source
《西安交通大学学报》
EI
CAS
CSCD
PKU Core (Peking University Core Journals)
2004, No. 8, pp. 796-799 (4 pages)
Journal of Xi'an Jiaotong University
Funding
Supported by the National High Technology Research and Development Program of China (2003AA1Z2610)