Abstract
Multi-classifier ensemble methods often achieve better generalization accuracy than a single classifier. To overcome the blindness and randomness of classifier selection in ensemble algorithms such as Bagging and Boosting, a new neural network ensemble method is proposed. First, based on an analysis of the generalization-error formula for neural network ensembles, particle swarm optimization (PSO) is used for feature selection, retaining both the optimal and sub-optimal feature subsets. Second, a difference-degree (diversity) measure is introduced to perform selective ensemble of the base classifiers, so as to minimize the generalization error of the ensemble individuals while increasing the diversity of the ensemble. Computer simulations show that, compared with the Bagging and Boosting algorithms, the new algorithm achieves better generalization performance in classification and recognition.
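The two-stage scheme the abstract describes (binary PSO searching feature subsets, keeping the best and near-best masks, then a disagreement-based selective ensemble) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the data is a synthetic toy set, a nearest-centroid rule stands in for the neural-network base learner, and the diversity threshold `theta` is a hypothetical value chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two informative features plus two noise features (a hypothetical
# stand-in for the paper's modulation-recognition feature sets).
n = 200
X = rng.normal(size=(n, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

def centroid_classifier(Xs, ys):
    """Nearest-centroid rule, standing in for the neural-network base learner."""
    c0, c1 = Xs[ys == 0].mean(0), Xs[ys == 1].mean(0)
    return lambda Z: (np.linalg.norm(Z - c1, axis=1)
                      < np.linalg.norm(Z - c0, axis=1)).astype(int)

def fitness(mask):
    """Validation accuracy of a base learner trained on the selected features."""
    if mask.sum() == 0:
        return 0.0
    f = centroid_classifier(X_tr[:, mask], y_tr)
    return float((f(X_va[:, mask]) == y_va).mean())

# --- Stage 1: binary PSO over feature masks (sigmoid velocity rule) ---
n_particles, dim, iters = 10, 4, 30
pos = rng.integers(0, 2, (n_particles, dim)).astype(bool)
vel = rng.normal(size=(n_particles, dim))
pbest = pos.copy()
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest.astype(float) - pos.astype(float))
           + 1.5 * r2 * (gbest.astype(float) - pos.astype(float)))
    pos = rng.random((n_particles, dim)) < 1.0 / (1.0 + np.exp(-vel))
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

# Keep the optimal and sub-optimal masks found by the swarm.
order = np.argsort(-pbest_fit)
candidate_masks = [pbest[i] for i in order[:4]]

# --- Stage 2: diversity-based selective ensemble ---
models = [(m, centroid_classifier(X_tr[:, m], y_tr))
          for m in candidate_masks if m.sum() > 0]
preds = [f(X_va[:, m]) for m, f in models]

theta = 0.02  # hypothetical difference-degree threshold
selected = [0]  # always keep the best member
for i in range(1, len(models)):
    # Admit a member only if it disagrees enough with every selected member.
    if all(np.mean(preds[i] != preds[j]) >= theta for j in selected):
        selected.append(i)

votes = np.stack([preds[i] for i in selected])
ensemble_pred = (votes.mean(0) >= 0.5).astype(int)
acc = float((ensemble_pred == y_va).mean())
```

The disagreement rate on held-out data serves here as the "difference degree"; members too similar to an already-selected classifier contribute little to the vote and are skipped, which is the intuition behind selective ensembling.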
Source
《计算机仿真》
CSCD
Peking University Core Journal (北大核心)
2013, No. 11, pp. 361-364, 387 (5 pages in total)
Computer Simulation
Funding
National Natural Science Foundation of China project (D201213)
Keywords
Neural network ensemble
Particle swarm optimization (PSO)
Generalization theory
Modulation recognition