
基于Boosting的TAN组合分类器 (Cited by: 14)

Boosting-Based TAN Combination Classifier
Abstract: Boosting is an effective classifier combination method: it can improve the classification performance of an unstable learning algorithm, but it yields little improvement for a stable one. TAN (tree-augmented naive Bayes) is a tree-structured Bayesian network. The standard TAN learning algorithm produces a stable TAN classifier, so its accuracy is difficult to improve with Boosting. This paper presents a new TAN learning algorithm, GTAN, and combines the multiple TAN classifiers generated by GTAN with a combination method called Boosting-MultiTAN. Finally, the TAN combination classifier is compared experimentally with the standard TAN classifier. The results show that Boosting-MultiTAN achieves higher classification accuracy than the standard TAN classifier on most data sets.
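The abstract names three moving parts: the standard TAN learner (a maximum-weight spanning tree over conditional mutual information, following Friedman, Geiger, and Goldszmidt), a Boosting loop over TAN classifiers, and a weighted vote over the resulting ensemble. The paper's own GTAN and Boosting-MultiTAN procedures are not reproduced below; the following Python sketch only illustrates the baseline pieces under stated assumptions: discrete attributes, an AdaBoost.M1-style loop, and instance weights injected by weighted resampling (an assumption on our part, since the standard TAN learner takes no instance weights). All function and class names here (cmi, tan_structure, TAN, boost_tan, predict_ensemble) are illustrative.

```python
import numpy as np
from collections import defaultdict

def cmi(X, y, i, j):
    """Conditional mutual information I(X_i; X_j | C) estimated from counts."""
    score = 0.0
    for c in np.unique(y):
        pc = np.mean(y == c)
        xi, xj = X[y == c, i], X[y == c, j]
        for a in np.unique(xi):
            for b in np.unique(xj):
                pab = np.mean((xi == a) & (xj == b))   # P(a, b | c)
                if pab > 0:
                    score += pc * pab * np.log(
                        pab / (np.mean(xi == a) * np.mean(xj == b)))
    return score

def tan_structure(X, y):
    """Maximum-weight spanning tree over the attributes, rooted at attribute 0."""
    d = X.shape[1]
    w = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            w[i, j] = w[j, i] = cmi(X, y, i, j)
    parent = {0: None}                       # grow the tree with Prim's algorithm
    while len(parent) < d:
        u, v = max(((u, v) for u in parent for v in range(d) if v not in parent),
                   key=lambda e: w[e])
        parent[v] = u
    return parent                            # attribute index -> parent attribute

class TAN:
    """Tree-augmented naive Bayes with Laplace-smoothed count estimates."""
    def fit(self, X, y):
        self.parent, self.classes = tan_structure(X, y), np.unique(y)
        self.vals = [np.unique(X[:, i]) for i in range(X.shape[1])]
        self.prior = {c: np.mean(y == c) for c in self.classes}
        self.cpt = {}
        for c in self.classes:
            Xc = X[y == c]
            for i, p in self.parent.items():
                for b in ([None] if p is None else self.vals[p]):
                    rows = Xc if b is None else Xc[Xc[:, p] == b]
                    for a in self.vals[i]:   # Laplace-smoothed P(x_i=a | parent=b, c)
                        self.cpt[(i, a, b, c)] = (np.sum(rows[:, i] == a) + 1) \
                            / (len(rows) + len(self.vals[i]))
        return self

    def predict(self, X):
        def score(x, c):
            s = np.log(self.prior[c])
            for i, p in self.parent.items():
                b = None if p is None else x[p]
                s += np.log(self.cpt.get((i, x[i], b, c), 1e-9))  # unseen value
            return s
        return np.array([max(self.classes, key=lambda c: score(x, c)) for x in X])

def boost_tan(X, y, rounds=10, seed=0):
    """AdaBoost.M1-style loop; weights enter through resampling, since the
    standard TAN learner does not accept instance weights directly."""
    rng, n = np.random.default_rng(seed), len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        idx = rng.choice(n, size=n, p=w)     # weighted bootstrap sample
        m = TAN().fit(X[idx], y[idx])
        wrong = m.predict(X) != y
        err = float(np.dot(w, wrong))
        if err <= 0 or err >= 0.5:           # AdaBoost.M1 stopping rule
            break
        beta = err / (1 - err)
        w = np.where(wrong, w, w * beta)     # shrink correctly classified weights
        w /= w.sum()
        ensemble.append((np.log(1 / beta), m))
    return ensemble

def predict_ensemble(ensemble, X):
    """Weighted majority vote across the boosted TAN classifiers."""
    votes = defaultdict(lambda: np.zeros(len(X)))
    for alpha, m in ensemble:
        pred = m.predict(X)
        for c in set(pred):
            votes[c] += alpha * (pred == c)
    labels = list(votes)
    return np.array(labels)[np.stack([votes[c] for c in labels]).argmax(0)]
```

With an n-by-d integer array X of discretized attributes and a label vector y, ensemble = boost_tan(X, y) followed by predict_ensemble(ensemble, X_test) yields the weighted-vote prediction. Note that the resampling step is what perturbs the learned tree between rounds; without some such perturbation the standard TAN learner returns the same structure every time, which is exactly the stability the abstract says limits Boosting.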
Source: Journal of Computer Research and Development (《计算机研究与发展》; indexed in EI, CSCD, Peking University Core), 2004, No. 2, pp. 340-345 (6 pages).
Funding: Key project of the National Tenth Five-Year Science and Technology Research Program of China (2002BA407B).
Keywords: Boosting; combination method; TAN; dependence relation.

