
Feature-Weighted Kernel Learning Method

Combination multiple kernels with feature weight learning
Abstract: A feature-weighted kernel learning method is proposed to address a shortcoming of current kernel methods in classification tasks: all features of the data are treated equally. In practice, the features of a data sample do not contribute equally to classification; some features are strongly related to the task and deserve more attention. The proposed algorithm integrates the advantages of multiple kernel learning by combining different kernel functions in a weighted way, while requiring lower computational complexity. Experimental results show that, compared with support vector machines and multiple kernel learning, the proposed algorithm achieves higher classification accuracy; its computational complexity is slightly higher than that of support vector machines but far lower than that of multiple kernel learning.
Published in: Computer Engineering and Applications (《计算机工程与应用》, CSCD, Peking University Core), 2015, No. 14, pp. 104-107, 119 (5 pages).
Keywords: feature weight learning; support vector machine; kernel learning
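The abstract describes weighting features so that discriminative dimensions contribute more to the kernel. As a minimal illustrative sketch (not the authors' exact formulation; the function name, weights, and `gamma` value below are hypothetical), a feature-weighted RBF kernel scales each dimension by a weight before computing the squared distance:

```python
import numpy as np

def feature_weighted_rbf(X, Y, w, gamma=1.0):
    """RBF kernel K(x, y) = exp(-gamma * sum_i w_i * (x_i - y_i)^2).

    X: (n, d) array, Y: (m, d) array, w: (d,) non-negative feature weights.
    Returns the (n, m) kernel matrix.
    """
    # Absorb the weights into the data: w_i * (x_i - y_i)^2
    # equals (sqrt(w_i) x_i - sqrt(w_i) y_i)^2.
    Xw = X * np.sqrt(w)
    Yw = Y * np.sqrt(w)
    # Pairwise weighted squared distances via the expansion
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b; clip guards float round-off.
    sq = (Xw**2).sum(1)[:, None] + (Yw**2).sum(1)[None, :] - 2 * Xw @ Yw.T
    return np.exp(-gamma * np.clip(sq, 0.0, None))

# Hypothetical example: the first feature is assumed informative, so it
# receives most of the weight; the second is nearly ignored.
X = np.array([[0.0, 5.0], [1.0, -3.0], [0.2, 4.0]])
w = np.array([0.9, 0.1])
K = feature_weighted_rbf(X, X, w, gamma=0.5)
```

A matrix like `K` can be passed to any kernel classifier that accepts a precomputed Gram matrix (e.g. scikit-learn's `SVC(kernel="precomputed")`); the per-feature weights `w` are what the paper's method learns rather than fixes by hand.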
