Composing strategy of entropy-based DAG-SVMs
Abstract: To reduce the randomness of node selection when building the traditional directed acyclic graph support vector machine (DAG-SVMs) multi-class classification model, and to improve the accuracy of the final classification, a composing strategy for entropy-based DAG-SVMs (E-DAG-SVMs) is proposed. The strategy computes the change of entropy induced when each binary support vector machine splits the sample set, selects the node according to the principle of information-gain maximization, and thereby constructs the multi-class classification model. Comparison experiments against the other methods mentioned demonstrate the validity of the strategy.
Source: Computer Engineering and Design (CSCD, Peking University Core), 2010, No. 4, pp. 832-835 (4 pages).
Keywords: support vector machine (SVM); directed acyclic graph (DAG); information gain; entropy; multi-class classification
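The node-selection rule described in the abstract can be illustrated with a short sketch. This is not the paper's implementation; it is a minimal example, assuming each candidate DAG node corresponds to a binary partition of the training labels, and that "entropy change" is the usual Shannon information gain of that partition. The helper names (`entropy`, `information_gain`, `best_split`) are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, left, right):
    """Entropy reduction when `labels` is split into `left` and `right`."""
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def best_split(labels, candidate_splits):
    """Pick the (left, right) partition with maximal information gain,
    i.e. the binary SVM whose split causes the largest entropy change."""
    return max(candidate_splits,
               key=lambda s: information_gain(labels, s[0], s[1]))
```

Under this sketch, each candidate partition would come from the class grouping induced by one pairwise SVM at the current DAG node; the node whose split maximizes the gain is fixed first, and the procedure recurses on each branch.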


