Abstract
A feature-weighted kernel learning algorithm is proposed to address the shortcoming that current kernel methods treat all data features equally in classification tasks. In a classification task the features of a data sample do not contribute equally: some features are strongly related to the task and should receive more attention. The proposed algorithm integrates the advantage of multiple kernel learning by combining different kernel functions in a weighted way, but at a lower computational complexity. Experimental results show that, compared with support vector machines and multiple kernel learning, the proposed algorithm achieves higher classification accuracy; its computational complexity is slightly higher than that of support vector machines, but far lower than that of multiple kernel learning.
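The abstract does not give the exact form of the weighted kernel or how the feature weights are obtained, so the following is only a minimal sketch of the feature-weighting idea, assuming an RBF base kernel, fixed illustrative weights, and scikit-learn's SVC with a callable kernel; the names feature_weights and weighted_rbf_kernel are hypothetical and not from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Hypothetical per-feature weights. In the paper these would be learned
# from the data (e.g. from each feature's relevance to the labels);
# here they are fixed constants purely for illustration.
feature_weights = np.array([2.0, 0.5, 1.5, 1.0])

def weighted_rbf_kernel(X, Y, weights=feature_weights, gamma=0.5):
    """RBF kernel on feature-wise weighted inputs:
    k(x, y) = exp(-gamma * sum_d w_d * (x_d - y_d)^2)."""
    Xw = X * np.sqrt(weights)   # scale each feature by sqrt(w_d)
    Yw = Y * np.sqrt(weights)
    sq_dists = (
        np.sum(Xw**2, axis=1)[:, None]
        + np.sum(Yw**2, axis=1)[None, :]
        - 2.0 * Xw @ Yw.T
    )
    return np.exp(-gamma * sq_dists)

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVC accepts a callable kernel that returns the Gram matrix between X and Y.
clf = SVC(kernel=weighted_rbf_kernel).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

Because only one Gram matrix is computed (on weighted features) instead of one per candidate kernel, such a scheme stays close to the cost of a standard SVM, which is consistent with the complexity claim in the abstract.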
Source
Computer Engineering and Applications (《计算机工程与应用》)
Indexed in: CSCD; Peking University Core Journals (北大核心)
2015, No. 14, pp. 104-107, 119 (5 pages)
Keywords
feature weighting
support vector machine
kernel learning