
A Scalable Decision Tree Algorithm Based on Influence Degree
Abstract Objective: To address the high computational complexity of the ID3 algorithm, an improved decision tree generation algorithm, DTA (Decision Tree Algorithm), is proposed. Methods: Influence degree is used as the attribute-selection criterion, and a new data structure, a class-based attribute list, is introduced to give the algorithm good scalability. Results: Experiments show that the algorithm generates correct decision trees and that its computational complexity is clearly lower than that of the traditional algorithm. Conclusion: The algorithm can quickly generate correct decision trees, and the corresponding decision rules, on low-end hardware with modest resource consumption.
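The abstract names two ingredients: an influence-degree criterion for attribute selection and a class-based attribute list for scalability. Since the influence-degree formula is not given in this record, the sketch below is only a reading aid under stated assumptions: it uses ID3-style information gain as a stand-in selection criterion and a SLIQ-like attribute list grouped by class label; all function and variable names are illustrative, not the paper's.

```python
from collections import Counter, defaultdict
import math

def build_attribute_lists(records, class_index):
    """Build per-attribute lists of (value, record_id), grouped by class.

    Illustrative stand-in for the paper's 'attribute list based on
    classification'; the exact layout is not given in the abstract.
    """
    lists = defaultdict(lambda: defaultdict(list))
    for rid, rec in enumerate(records):
        label = rec[class_index]
        for attr, value in enumerate(rec):
            if attr != class_index:
                lists[attr][label].append((value, rid))
    return lists

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, attr, class_index):
    """Stand-in selection criterion (ID3's information gain),
    used here only because the influence-degree formula is not given."""
    labels = [r[class_index] for r in records]
    base = entropy(labels)
    by_value = defaultdict(list)
    for r in records:
        by_value[r[attr]].append(r[class_index])
    remainder = sum(len(sub) / len(records) * entropy(sub)
                    for sub in by_value.values())
    return base - remainder

records = [("sunny", "hot", "no"), ("rain", "mild", "yes"),
           ("sunny", "mild", "no"), ("rain", "hot", "yes")]
lists = build_attribute_lists(records, class_index=2)
best = max((0, 1), key=lambda a: information_gain(records, a, 2))
```

Grouping each attribute's values by class label means the class counts needed by a selection criterion can be read off per attribute without rescanning the whole table, which is the scalability motivation the abstract points at.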
Source: Journal of Hebei North University (Natural Science Edition), 2008, No. 4, pp. 55-57, 61 (4 pages)
Funding: Hebei Province Science and Technology Research and Development Guidance Project (07213543)
Keywords: decision tree; data mining; algorithm; attribute list; influence degree


