Selective Ensemble Data Stream Classification Based on Ant Colony Optimization
Abstract: Data stream classification based on ensemble learning has become a current research hotspot, but ensemble learning suffers from large ensemble size, long training time, and high time and space complexity. To address these problems, a selective ensemble data stream classification method based on ant colony optimization is proposed, in which an ant colony optimization algorithm selects strong base classifiers to build the ensemble classification model. The method first estimates the accuracy of every base classifier by cross-validation and measures the diversity between base classifiers with the Gower similarity coefficient; accuracy and diversity then serve as the selection criteria for choosing a subset of the base classifiers to construct the ensemble, so that the selected classifiers combine good classification accuracy with sufficient mutual diversity. Simulation experiments on standard benchmark datasets show that, compared with traditional ensemble methods, the proposed method significantly improves both accuracy and stability.
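The selection procedure described in the abstract can be sketched as follows. This is a minimal, assumption-laden illustration rather than the paper's implementation: hold-out validation stands in for the paper's cross-validation, pairwise disagreement stands in for the Gower similarity coefficient, and the pheromone update is a deliberately simple variant; all names (`pool`, `fitness`, `tau`) are hypothetical.

```python
# Sketch of selective ensemble construction with an ant-colony-style search.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
X_fit, X_val, y_fit, y_val = train_test_split(X_tr, y_tr, test_size=0.4, random_state=0)

# Pool of base classifiers trained bagging-style on bootstrap samples.
n_pool = 10
pool = []
for i in range(n_pool):
    idx = rng.integers(0, len(X_fit), len(X_fit))
    pool.append(DecisionTreeClassifier(max_depth=3, random_state=i).fit(X_fit[idx], y_fit[idx]))

# Selection criteria: per-classifier accuracy and pairwise diversity on validation data.
preds = np.array([c.predict(X_val) for c in pool])
acc = (preds == y_val).mean(axis=1)
dis = np.array([[(preds[i] != preds[j]).mean() for j in range(n_pool)]
                for i in range(n_pool)])

def fitness(sel):
    """Reward subsets whose members are both accurate and mutually diverse."""
    pairs = [dis[i, j] for i in sel for j in sel if i < j]
    return acc[list(sel)].mean() + np.mean(pairs)

# Minimal ant-colony loop: pheromone biases which classifiers each ant picks;
# evaporation plus best-subset reinforcement concentrates the search.
tau = np.ones(n_pool)
best_sel, best_fit = None, -np.inf
for _ in range(30):
    for _ant in range(10):
        k = int(rng.integers(2, n_pool))
        sel = rng.choice(n_pool, size=k, replace=False, p=tau / tau.sum())
        f = fitness(sel)
        if f > best_fit:
            best_sel, best_fit = sel, f
    tau *= 0.9                  # pheromone evaporation
    tau[best_sel] += best_fit   # reinforce the best subset found so far

# Majority vote of the selected base classifiers on the test set (labels are 0/1).
votes = np.array([pool[i].predict(X_te) for i in best_sel])
ens_acc = ((votes.mean(axis=0) > 0.5).astype(int) == y_te).mean()
```

A fuller ACO implementation would typically keep pheromone on classifier pairs and combine it with a heuristic desirability term; the per-classifier pheromone above is the simplest choice that still lets the search converge toward accurate, diverse subsets.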
Source: Journal of Yangtze University (Natural Science Edition), CAS, 2017, Issue 5, pp. 37-43 (7 pages)
Funding: National Natural Science Foundation of China (61300170); Anhui Provincial Natural Science Foundation (1608085MF147); Anhui Provincial Key Project for Outstanding University Talents (2013SQRL034ZD)
Keywords: data stream classification; concept drift; selective ensemble; ant colony optimization; diversity