Double-layer Bayesian Classifier Ensembles Based on Frequent Itemsets (Cited by: 3)

Abstract: Numerous models have been proposed to reduce the classification error of Naive Bayes by weakening its attribute independence assumption, and some have demonstrated remarkable error performance. Considering that ensemble learning is an effective way to reduce a classifier's classification error, this paper proposes a double-layer Bayesian classifier ensembles (DLBCE) algorithm based on frequent itemsets. DLBCE constructs a double-layer Bayesian classifier (DLBC) for each frequent itemset contained in the new instance and finally combines all the classifiers by assigning a different weight to each classifier according to conditional mutual information. The experimental results show that the proposed algorithm outperforms other state-of-the-art algorithms.
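The abstract only outlines the method, so a minimal illustrative Python sketch of the general idea follows: mine frequent itemsets from discrete training data, build a simple Bayesian classifier on the training instances matching each frequent itemset that the new instance contains, and combine the per-itemset predictions with information-based weights. Everything below is an assumption made for illustration: the brute-force itemset miner, the Laplace-smoothed naive Bayes, and the use of plain (unconditional) mutual information as the classifier weight are stand-ins, not the authors' actual DLBC construction or their conditional-mutual-information weighting.

```python
# Minimal, illustrative sketch of a frequent-itemset-based Bayesian ensemble
# (not the authors' exact DLBC/DLBCE implementation).
import numpy as np
from itertools import combinations
from collections import Counter, defaultdict


def frequent_itemsets(X, min_support, max_size=2):
    """Mine frequent (attribute_index, value) itemsets by brute-force counting."""
    n = len(X)
    counts = Counter()
    for row in X:
        items = [(j, v) for j, v in enumerate(row)]
        for size in range(1, max_size + 1):
            for combo in combinations(items, size):
                counts[frozenset(combo)] += 1
    return [s for s, c in counts.items() if c / n >= min_support]


def naive_bayes_posterior(X, y, x_new, smoothing=1.0):
    """Laplace-smoothed naive Bayes class posterior for a single instance."""
    classes = sorted(set(y))
    log_post = []
    for c in classes:
        Xc = X[y == c]
        logp = np.log((len(Xc) + smoothing) / (len(X) + smoothing * len(classes)))
        for j in range(X.shape[1]):
            n_vals = len(set(X[:, j]))
            match = np.sum(Xc[:, j] == x_new[j])
            logp += np.log((match + smoothing) / (len(Xc) + smoothing * n_vals))
        log_post.append(logp)
    p = np.exp(np.array(log_post) - max(log_post))
    return classes, p / p.sum()


def mutual_info(a, y):
    """Empirical mutual information between a discrete attribute and the class."""
    mi = 0.0
    for va in set(a):
        for vy in set(y):
            p_joint = np.mean((a == va) & (y == vy))
            if p_joint > 0:
                mi += p_joint * np.log(p_joint / (np.mean(a == va) * np.mean(y == vy)))
    return mi


def itemset_weight(X, y, itemset):
    """Illustrative weight: average (unconditional) mutual information between the
    itemset's attributes and the class -- a stand-in for the paper's conditional
    mutual information weighting."""
    return sum(mutual_info(X[:, j], y) for j, _ in itemset) / len(itemset)


def ensemble_predict(X, y, x_new, min_support=0.3):
    """Weighted vote over per-itemset classifiers, as the abstract describes."""
    votes = defaultdict(float)
    for itemset in frequent_itemsets(X, min_support):
        # Only itemsets that the new instance actually contains get a classifier.
        if not all(x_new[j] == v for j, v in itemset):
            continue
        # Train on the instances matching the itemset, then apply naive Bayes --
        # an illustrative stand-in for the paper's two-layer classifier.
        mask = np.all([X[:, j] == v for j, v in itemset], axis=0)
        if mask.sum() < 2:
            continue
        classes, proba = naive_bayes_posterior(X[mask], y[mask], x_new)
        w = itemset_weight(X, y, itemset)
        for c, p in zip(classes, proba):
            votes[c] += w * p
    return max(votes, key=votes.get) if votes else None


if __name__ == "__main__":
    # Tiny categorical toy data: rows are instances, columns are discrete attributes.
    X = np.array([[0, 1], [0, 1], [1, 0], [1, 0], [0, 0], [1, 1]])
    y = np.array([0, 0, 1, 1, 0, 1])
    print(ensemble_predict(X, y, np.array([0, 1])))  # predicts class 0
```

Restricting the vote to itemsets that the new instance actually contains is what makes the ensemble instance-specific, which is the core idea conveyed by the abstract.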
Source: International Journal of Automation and Computing (EI), 2012, No. 2, pp. 215-220 (6 pages)
Funding: Supported by the National Natural Science Foundation of China (Nos. 61073133, 60973067, and 61175053) and the Fundamental Research Funds for the Central Universities of China (No. 2011ZD010)
Keywords: double-layer Bayesian, classifier, frequent itemsets, conditional mutual information, support