
An Improved ID3 Algorithm (一种改进的ID3算法)
Cited by: 1

Abstract: Decision trees are an important data mining method, commonly used to build classifiers and prediction models. As the core decision-tree algorithm, ID3 is widely used in classification problems because of its simplicity and efficiency; however, it tends to choose attributes with many values as branching attributes and may therefore miss attributes with strong classification ability. This paper improves the branching strategy of ID3 by additionally considering an attribute's class discrimination degree: when several attributes share the maximum information gain, the measure selects the one that yields the better classification effect. Experimental comparison shows that the new method improves the accuracy of the decision tree and simplifies its structure.
Author: 庄卿卿
Affiliation: 广东工业大学 (Guangdong University of Technology)
Source: Modern Computer (《现代计算机》), 2009, No. 5, pp. 43-46 (4 pages)
Keywords: decision tree; attribute; class discrimination degree of attribute (属性的类区分度)
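
The abstract describes ID3's information-gain criterion and a refinement that falls back on an attribute's class discrimination degree, but it does not spell out the measure itself. The sketch below is a minimal Python illustration under stated assumptions: entropy and information_gain follow the standard ID3 definitions, while class_discrimination is a hypothetical proxy (the fraction of an attribute's value groups that are class-pure) and the tolerance-based tie set is likewise an assumption, not the paper's own definition.

```python
import math
from collections import Counter, defaultdict

def entropy(labels):
    """Shannon entropy of a list of class labels (standard ID3 definition)."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Classical ID3 criterion: entropy reduction from splitting on one attribute."""
    groups = defaultdict(list)
    for row, label in zip(rows, labels):
        groups[row[attr_index]].append(label)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def class_discrimination(rows, labels, attr_index):
    """Hypothetical stand-in for the paper's 'class discrimination degree':
    the fraction of the attribute's value groups whose samples all share one
    class. (Assumption -- the paper's exact measure is not reproduced here.)"""
    groups = defaultdict(list)
    for row, label in zip(rows, labels):
        groups[row[attr_index]].append(label)
    pure = sum(1 for g in groups.values() if len(set(g)) == 1)
    return pure / len(groups)

def choose_attribute(rows, labels, attr_indices, tol=1e-9):
    """Pick the branching attribute: maximum information gain, and when several
    gains (nearly) tie, prefer the attribute with higher class discrimination."""
    gains = {a: information_gain(rows, labels, a) for a in attr_indices}
    best = max(gains.values())
    tied = [a for a in attr_indices if best - gains[a] <= tol]
    return max(tied, key=lambda a: class_discrimination(rows, labels, a))

# Toy data (hypothetical): both attributes have information gain 0.5,
# but attribute 0 has more class-pure value groups, so it is chosen.
rows = [("a", "y"), ("a", "w"), ("b", "z"), ("b", "w"),
        ("c", "y"), ("c", "x"), ("c", "z"), ("c", "x")]
labels = ["+", "+", "-", "-", "+", "+", "-", "-"]
print(choose_attribute(rows, labels, [0, 1]))  # -> 0
```

Note that the extra measure only enters when gains coincide; whenever one attribute's gain strictly dominates, the sketch reduces to classical ID3, which matches the abstract's description of the improvement as a refinement of the branching strategy rather than a replacement for information gain.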
