
The weighted maximum vector angular margin core vector machine for imbalanced data classification
Abstract: A new classification approach is proposed to handle large-scale and imbalanced data classification. It uses the maximum vector angular margin core vector machine to classify large datasets, and it addresses class imbalance by assigning different weights to different samples, which improves the classification performance of the algorithm. The weighted maximum vector angular margin core vector machine therefore both solves the imbalanced data classification problem effectively and trains quickly on large datasets.
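This record does not reproduce the paper's optimization problem. Purely as an illustration of the weighting idea described in the abstract, a generic class-weighted soft-margin objective is sketched below; the specific weighting rule for C_i is an assumption made for this sketch, and the paper's actual formulation replaces the hyperplane margin with the maximum vector angular margin and is trained with a core vector machine.

\[
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\ \frac{1}{2}\lVert\mathbf{w}\rVert^{2} + \sum_{i=1}^{n} C_{i}\,\xi_{i}
\quad\text{s.t.}\quad y_{i}\bigl(\mathbf{w}^{\top}\phi(\mathbf{x}_{i}) + b\bigr) \ge 1 - \xi_{i},\ \ \xi_{i}\ge 0,
\qquad
C_{i} =
\begin{cases}
C\,n/(2n_{+}), & y_{i}=+1,\\
C\,n/(2n_{-}), & y_{i}=-1,
\end{cases}
\]

where n_{+} and n_{-} are the class sizes; slack variables of the minority class receive the larger penalty, which is one standard way to realize the per-sample weighting the abstract refers to.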
Source: Journal of Shandong University (Engineering Science) (《山东大学学报(工学版)》), 2014, No. 3, pp. 1-7 (7 pages); indexed in CAS and the Peking University Core Journals list (北大核心).
Funding: National Natural Science Foundation of China (61170040); Natural Science Foundation of Hebei Province (F2011201063).
Keywords: maximum vector angular margin; core set; core vector machine; minimum enclosing ball; imbalanced data; weight
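The keywords point to the core set and minimum enclosing ball (MEB) machinery that core vector machines are built on. As background only, here is a minimal sketch of the classical Badoiu-Clarkson (1+eps)-approximation of an MEB, not the paper's training procedure; the helper name meb_approx is invented for this sketch.

import numpy as np

def meb_approx(points, eps=0.1):
    # Badoiu-Clarkson iteration: pull the centre towards the point that is
    # currently farthest away; those farthest points form the core set.
    points = np.asarray(points, dtype=float)
    c = points[0].copy()                    # start at an arbitrary point
    steps = int(np.ceil(1.0 / eps ** 2))    # about 1/eps^2 steps suffice
    for i in range(1, steps + 1):
        dists = np.linalg.norm(points - c, axis=1)
        p = points[np.argmax(dists)]        # farthest point (core-set candidate)
        c += (p - c) / (i + 1)              # step size shrinks as 1/(i+1)
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius

# usage sketch: approximate MEB of 10,000 random 5-dimensional points
rng = np.random.default_rng(0)
centre, radius = meb_approx(rng.normal(size=(10000, 5)), eps=0.05)

A core vector machine reformulates (kernelized) SVM training as such an MEB problem, so only the small core set has to be optimized over exactly; this is what enables the fast training on large datasets mentioned in the abstract.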

References (25)

  • 1. CHAWLA N V, BOWYER K W, HALL L O, et al. Synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16(3): 321-357.
  • 2. KUBAT M, MATWIN S. Addressing the curse of imbalanced data sets: one-sided sampling[C]//Proceedings of the Fourteenth International Conference on Machine Learning. Nashville, Tennessee, USA: IEEE, 1997: 178-186.
  • 3. SUN Y M, KAMEL M S, WONG K C. Cost-sensitive boosting for classification of imbalanced data[J]. Pattern Recognition, 2007, 40(2): 3358-3378.
  • 4. SUN Y M, WONG K C, KAMEL M S. Classification of imbalanced data: a review[J]. International Journal of Pattern Recognition and Artificial Intelligence, 2009, 23(4): 687-719.
  • 5. TAO Q, WU G W, WANG F Y, et al. Posterior probability support vector machines for unbalanced data[J]. IEEE Transactions on Neural Networks, 2005, 16(6): 1561-1573.
  • 6. ZONG Weiwei, HUANG Guangbin, CHEN Yiqiang. Weighted extreme learning machine for imbalance learning[J]. Neurocomputing, 2013, 101: 229-242.
  • 7. TANG Y C, ZHANG Y Q, CHAWLA N V, et al. SVMs modeling for highly imbalanced classification[J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2009, 39(1): 281-288.
  • 8. HE H, GARCIA E A. Learning from imbalanced data[J]. IEEE Transactions on Knowledge and Data Engineering, 2009, 21(9): 1263-1284.
  • 9. VAPNIK V N. Statistical learning theory[M]. New York: Wiley, 1998.
  • 10. TAX D M J, DUIN R P W. Support vector data description[J]. Machine Learning, 2004, 54(1): 45-66.

Secondary references (22)

  • 1. CORTES C, VAPNIK V. Support vector networks[J]. Machine Learning, 1995, 20(3): 273-297.
  • 2. SCHOLKOPF B, SMOLA A, WILLIAMSON R C, et al. New support vector algorithms[J]. Neural Computation, 2000, 12: 1207-1245.
  • 3. HU M, CHEN Y, KWOK J T. Building sparse multi-kernel SVM classifiers[J]. IEEE Transactions on Neural Networks, 2009, 20(5): 827-839.
  • 4. CHUNG F L, WANG S T, DENG Z H, et al. Fuzzy kernel hyperball perceptron[J]. Applied Soft Computing, 2004, 5: 67-74.
  • 5. SHIVASWAMY P, JEBARA T. Ellipsoidal kernel machines[C]//Artificial Intelligence and Statistics (AISTATS), 2007.
  • 6. TAX D M J, DUIN R P W. Support vector data description[J]. Machine Learning, 2004, 54(1): 45-66.
  • 7. WU M R, YE J P. A small sphere and large margin approach for novelty detection using training data with outliers[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2009, 31: 1-5.
  • 8. COLLOBERT R, BENGIO S, BENGIO Y. A parallel mixture of SVMs for very large scale problems[J]. Neural Computation, 2002, 14(5): 1105-1114.
  • 9. TSANG I W, KWOK J T, et al. Core vector machines: fast SVM training on very large data sets[J]. Journal of Machine Learning Research, 2005, 6: 363-392.
  • 10. WILLIAMS C, SEEGER M. Using the Nyström method to speed up kernel machines[C]//Advances in Neural Information Processing Systems 13. Cambridge, MA: MIT Press, 2001: 682-688.

Co-cited literature (7)
