Abstract
This paper studies neural network classifier ensembles for small data sets and proposes NovelNNE, a neural network ensemble method suited to small data sets. By generating diverse data, the diversity among the individual networks in the ensemble is increased, which improves the generalization performance of ensemble learning. Finally, different fusion techniques are evaluated experimentally on UCI benchmark data sets. The results show that, within the NovelNNE ensemble algorithm, the relative majority vote and Bayesian fusion methods outperform the behavior-knowledge-space fusion method.
Ensemble learning has become a hot topic in machine learning, as it can dramatically improve the generalization performance of a classifier. In this paper, neural network ensembles for small data sets are studied and an approach to neural network ensemble construction (Novel_NNE) is presented. To increase ensemble diversity, a diverse data set is generated as part of the training set in order to create diverse neural network classifiers. Moreover, different combination methods are studied for Novel_NNE. Experimental results show that Novel_NNE achieves higher predictive accuracy with both the relative majority vote method and the Bayes combination method.
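The two fusion rules the abstract reports as strongest can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: `relative_majority_vote` picks the class predicted by the most individual networks, and `bayes_combine` shows one common Bayesian fusion rule (multiplying each classifier's posterior estimates per class); the paper's exact Bayes formulation may differ.

```python
from collections import Counter

def relative_majority_vote(labels):
    # Class predicted by the most ensemble members wins
    # (a plurality, not necessarily an absolute majority).
    return Counter(labels).most_common(1)[0][0]

def bayes_combine(posteriors):
    # Assumed fusion rule: treat classifiers as independent and
    # multiply their per-class posterior estimates, then take argmax.
    combined = {}
    for c in posteriors[0]:
        p = 1.0
        for post in posteriors:
            p *= post[c]
        combined[c] = p
    return max(combined, key=combined.get)

# Hypothetical outputs of individual networks for one test sample:
print(relative_majority_vote(["A", "B", "A", "C", "A"]))  # prints A
print(bayes_combine([{"A": 0.6, "B": 0.4},
                     {"A": 0.7, "B": 0.3},
                     {"A": 0.2, "B": 0.8}]))              # prints B
```

Note that the two rules can disagree: in the second call, class A wins two of three hard votes, but the low confidence of the third classifier tips the Bayesian product toward B (0.6·0.7·0.2 = 0.084 vs. 0.4·0.3·0.8 = 0.096).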
Source
《计算机研究与发展》
EI
CSCD
北大核心
2006, No. 7, pp. 1161-1166 (6 pages)
Journal of Computer Research and Development
Funding
National Natural Science Foundation of China (60443003)
Doctoral Foundation of Hebei University (075)
Keywords
neural network ensemble
small data set
diversity
generalization