
Two-phase Dynamic Fusion Method for Data Stream Selective Integration (Cited by: 1)
Abstract: Although selective ensemble classification can improve the classification performance of an ensemble over the data set as a whole, the subset of individual classifiers it selects is not necessarily the optimal combination for classifying any specific instance. Starting from the idea of data adaptivity, this paper therefore proposes a two-phase dynamic fusion method for selective ensembles on data streams: the position in feature space of the instance to be classified is used to dynamically select a set of individual classifiers, which then classify that instance. Theoretical analysis and experimental results show that, compared with the GASEN algorithm, the method achieves higher classification accuracy.
Source: Computer Engineering (《计算机工程》), CAS / CSCD / Peking University Core Journal, 2011, No. 20, pp. 180-182 (3 pages)
Funding: National Natural Science Foundation of China (Grants 61073043, 60873037, 61073041)
Keywords: data stream; selective ensemble; classification; self-adaptation; feature space
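
The abstract describes the method only at a high level: a set of individual classifiers is chosen per instance, based on where that instance falls in feature space, and the chosen classifiers then classify it. The Python sketch below illustrates one plausible reading of this two-phase idea under an explicit assumption: classifiers are ranked by their accuracy on the query's k nearest neighbours drawn from previously seen stream chunks. The class name, the k-NN local-accuracy criterion, and all parameters are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of per-instance (dynamic) classifier selection for a
# data-stream ensemble. Assumes integer class labels >= 0.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier


class TwoPhaseDynamicEnsemble:
    def __init__(self, n_neighbors=7, top_k=3):
        self.n_neighbors = n_neighbors  # size of the local region around a query point
        self.top_k = top_k              # number of classifiers kept for each query
        self.members = []               # base classifiers, one per stream chunk

    def fit_chunks(self, chunks):
        """Phase 1 (offline): train one base classifier per data-stream chunk."""
        X_parts, y_parts = [], []
        for X, y in chunks:
            self.members.append(DecisionTreeClassifier(max_depth=5).fit(X, y))
            X_parts.append(X)
            y_parts.append(y)
        # Pool of previously seen examples, used to judge local accuracy later.
        self.X_pool = np.vstack(X_parts)
        self.y_pool = np.concatenate(y_parts)
        self.nn = NearestNeighbors(n_neighbors=self.n_neighbors).fit(self.X_pool)
        return self

    def predict(self, X):
        """Phase 2 (online): per instance, vote with the locally best classifiers."""
        X = np.asarray(X)
        _, neigh_idx = self.nn.kneighbors(X)  # (n_samples, n_neighbors)
        member_preds = np.array([m.predict(X) for m in self.members])
        pool_preds = np.array([m.predict(self.X_pool) for m in self.members])
        out = []
        for i, neigh in enumerate(neigh_idx):
            # Accuracy of every member restricted to the query's neighbourhood.
            local_acc = (pool_preds[:, neigh] == self.y_pool[neigh]).mean(axis=1)
            chosen = np.argsort(local_acc)[-self.top_k:]   # locally best members
            votes = member_preds[chosen, i].astype(int)
            out.append(np.bincount(votes).argmax())        # majority vote
        return np.array(out)
```

Unlike GASEN, which prunes the ensemble once and reuses the same subset for every query, the selected subset here changes with each instance's neighbourhood; that per-instance adaptivity is the behaviour the abstract emphasises.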

References (6)

  • 1 Domingos P, Hulten G. Mining High-speed Data Streams[C]//Proc. of the 6th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Boston, USA: ACM Press, 2000: 71-80.
  • 2 Zhao Chuanshen, He Shungang, Yang Jihong, Chen Lixia. Data Stream Classification Algorithm Based on Multiple Class-association Rules[J]. Computer Engineering, 2010, 36(9): 38-40.
  • 3 Wang Haixun, Fan Wei, Yu P S, et al. Mining Concept-drifting Data Streams Using Ensemble Classifiers[C]//Proc. of the 9th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Washington, D.C., USA: ACM Press, 2003: 226-235.
  • 4 Street W N, Kim Y S. A Streaming Ensemble Algorithm for Large-scale Classification[C]//Proc. of the 7th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. San Francisco, USA: ACM Press, 2001: 377-382.
  • 5 Zhang Jianpei, Yang Xianfei, Yang Jing. Biased Sampling Ensemble Classifier for High-speed Data Streams[J]. Journal of Beijing University of Posts and Telecommunications, 2010, 33(4): 44-48.
  • 6 Zhou Zhihua, Wu Jianxin, Tang Wei. Ensembling Neural Networks: Many Could Be Better than All[J]. Artificial Intelligence, 2002, 137(1/2): 239-263.
