
An Improved Decision Tree Algorithm Based on Rough Sets and Attribute-Value Clustering (Cited by: 3)

Algorithm of decision trees based on rough set and clustering attribute’s values.
Abstract: This paper combines rough set theory with attribute-value clustering to optimize decision trees along three optimization principles. First, the reduction capability of rough set theory is used to compute the relative core, and information entropy serves as the heuristic for deriving a relative reduct; this shortens the paths of the generated decision tree and reduces its number of nodes. Second, when selecting the splitting attribute, under the constraint of maximal information gain, attribute values are clustered according to the dissimilarity distance between them, so that the distribution over the resulting branches approaches a single peak. Experiments on UCI datasets show that the algorithm greatly reduces both the number of nodes and the depth of the decision tree.
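The abstract names two computable steps: choosing the splitting attribute by maximal information gain (as in ID3), and clustering the chosen attribute's values by a dissimilarity distance over their class distributions. The following is a minimal Python sketch of both steps, under stated assumptions: the record layout, all function names, and the use of L1 distance as the dissimilarity measure are hypothetical, since the abstract does not give the paper's exact definitions, and the rough-set relative core/reduct computation is omitted.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attribute, target="class"):
    """ID3 information gain of splitting `rows` on `attribute`."""
    base = entropy([r[target] for r in rows])
    n = len(rows)
    rest = 0.0
    for value in {r[attribute] for r in rows}:
        subset = [r[target] for r in rows if r[attribute] == value]
        rest += len(subset) / n * entropy(subset)
    return base - rest

def class_distribution(rows, attribute, value, target="class"):
    """Empirical class distribution among rows where attribute == value."""
    labels = [r[target] for r in rows if r[attribute] == value]
    return {c: k / len(labels) for c, k in Counter(labels).items()}

def dissimilarity(p, q):
    """L1 distance between two class distributions (a stand-in for the
    paper's dissimilarity distance, which the abstract does not define)."""
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in set(p) | set(q))

def cluster_values(rows, attribute, threshold=0.3):
    """Greedily merge attribute values whose class distributions are close,
    so the branches of a split approach a single-peaked distribution."""
    clusters = []  # each: {"values": [...], "dist": representative distribution}
    for value in sorted({r[attribute] for r in rows}):
        dist = class_distribution(rows, attribute, value)
        for cluster in clusters:
            if dissimilarity(dist, cluster["dist"]) <= threshold:
                cluster["values"].append(value)  # close enough: merge
                break
        else:
            clusters.append({"values": [value], "dist": dist})
    return [c["values"] for c in clusters]

# Toy dataset: pick the split attribute by maximal gain, then cluster
# that attribute's values before branching.
rows = [
    {"outlook": "sunny",    "windy": "false", "class": "no"},
    {"outlook": "sunny",    "windy": "true",  "class": "no"},
    {"outlook": "overcast", "windy": "false", "class": "yes"},
    {"outlook": "rainy",    "windy": "false", "class": "yes"},
    {"outlook": "rainy",    "windy": "true",  "class": "no"},
]
best = max(["outlook", "windy"], key=lambda a: information_gain(rows, a))
print(best)                                        # -> outlook
print(cluster_values(rows, best, threshold=1.0))   # -> [['overcast', 'rainy'], ['sunny']]
```

The greedy merge keeps the first member's distribution as each cluster's representative; the paper's actual clustering procedure may differ in both the distance measure and the merge strategy.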
Source: Computer Engineering and Applications (《计算机工程与应用》; CSCD, Peking University Core), 2007, No. 31, pp. 178-181 (4 pages).
Keywords: rough set theory; decision tree; attribute reduction; ID3 algorithm; information entropy
