
Application of Taxation Credit Classification Based on an Improved C4.5 Algorithm
Abstract: Taxation credit classification management plays an important role in the tax system, and applying classification algorithms to replace the manual rating of taxation credit grades is one of the difficult problems facing today's taxation systems. Decision tree algorithms are an important family of classification algorithms, of which C4.5 is the most classic; however, C4.5 spends considerable time on the discretization of continuous attributes. This paper introduces a window-split technique based on empirical values into C4.5's continuous-attribute discretization, effectively improving the algorithm's running efficiency while preserving the accuracy of the generated decision tree. The improved algorithm is applied to construct a decision tree for taxation credit grade judgment, and the constructed tree is then used to automatically determine taxpayers' taxation credit grades.
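The speedup described in the abstract — restricting C4.5's exhaustive search for a continuous-attribute cut point to a window around an empirically chosen value — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the sample data, the function names, and the fixed-width window are all illustrative assumptions, and plain information gain is used where C4.5 proper uses the gain ratio.

```python
import math

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def best_threshold(values, labels, candidates=None):
    """Standard C4.5-style search: evaluate every midpoint between
    consecutive distinct sorted values and keep the cut point with the
    highest information gain. Returns (threshold, gain)."""
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    ys = [y for _, y in pairs]
    n = len(ys)
    base = entropy(ys)
    if candidates is None:
        candidates = [(xs[i] + xs[i + 1]) / 2
                      for i in range(n - 1) if xs[i] != xs[i + 1]]
    best_gain, best_t = -1.0, None
    for t in candidates:
        left = [y for v, y in pairs if v <= t]
        right = [y for v, y in pairs if v > t]
        if not left or not right:
            continue
        gain = base - (len(left) / n * entropy(left)
                       + len(right) / n * entropy(right))
        if gain > best_gain:
            best_gain, best_t = gain, t
    return best_t, best_gain

def windowed_threshold(values, labels, center, half_width):
    """Window-split idea: only evaluate candidate cut points inside
    [center - half_width, center + half_width], where `center` is an
    empirical value (e.g. a known rating cutoff). Fewer candidates
    means fewer entropy evaluations, hence the efficiency gain."""
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    cands = [(xs[i] + xs[i + 1]) / 2
             for i in range(len(xs) - 1)
             if xs[i] != xs[i + 1]
             and center - half_width <= (xs[i] + xs[i + 1]) / 2
                                     <= center + half_width]
    return best_threshold(values, labels, candidates=cands)
```

With a toy attribute such as a tax-payment ratio where values below 0.6 correspond to one grade and values above to another, both searches find the same 0.6 cut point, but the windowed variant inspects only the candidates near the empirical center instead of all of them.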
Author: Xu Shaobing (徐邵兵)
Source: 《微计算机信息》 (Microcomputer Information / Control & Automation), 2009, Issue 15, pp. 264-266 (3 pages)
Fund: Applicant: Prof. Hu Xiaojian (胡小建); Project: Research on Grid-Based Open Decision Support Methods and Decision Support Systems; Granting body: Anhui Provincial Natural Science Foundation (Anhui Provincial Natural Science Foundation Committee), Hefei University of Technology (Grant No. 070416241)
Keywords: decision tree; C4.5 algorithm; taxation credit classification; empirical-value window split

References

1. 张德政, 阿孜古丽, 冯洪海, 杨炳儒. Mining abnormal information implied by inconsistent examples based on support vector machines [J]. Journal of University of Science and Technology Beijing, 2004, 26(5): 564-568.
2. 刘东辉, 王树明, 张庆生. A computer dynamic forensics system based on data mining [J]. Microcomputer Information, 2005, 21(11X): 82-84.
3. Blumer A, Ehrenfeucht A, Haussler D, Warmuth MK. Occam's Razor. Information Processing Letters, 1987, 24: 377-380.
4. Murphy PM, Pazzani MJ. Exploring the decision forest. In: Proceedings of the Computational Learning and Natural Learning Workshop. Provincetown, MA, 1993. 10-12.
5. Quinlan JR. Induction of decision trees. Machine Learning, 1986, 1: 81-106.
6. Quinlan JR. C4.5: Programs for Machine Learning. San Mateo, CA: Morgan Kaufmann Publishers, 1993.
7. Freund Y, Schapire RE. Experiments with a new boosting algorithm. In: Proceedings of the 13th International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers, 1996. 148-156.
8. Breiman L. Bagging predictors. Machine Learning, 1996, 24: 123-140.
9. Quinlan JR. Bagging, boosting, and C4.5. In: Proceedings of the 13th National Conference on Artificial Intelligence. Portland, OR, 1996. 725-730.
10. Murthy S, Kasif S, Salzberg S. A system for induction of oblique decision trees. Journal of Artificial Intelligence Research, 1994, 2: 1-32.
