Abstract
Support vector machines (SVMs) have widespread use in various classification problems. Although SVMs are often used as an off-the-shelf tool, there are still some important issues which require improvement, such as feature rescaling. Standardization is the most commonly used feature rescaling method. However, standardization does not always improve classification accuracy. This paper describes two feature rescaling methods: multiple kernel learning-based rescaling (MKL-SVM) and kernel-target alignment-based rescaling (KTA-SVM). MKL-SVM makes use of the framework of multiple kernel learning (MKL), and KTA-SVM is built upon the concept of kernel alignment, which measures the similarity between kernels. The proposed methods were compared with three other methods: an SVM method without rescaling, an SVM method with standardization, and SCADSVM. Test results demonstrate that different rescaling methods apply to different situations and that the proposed methods outperform the others in general.
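The kernel alignment mentioned above is commonly computed as a normalized Frobenius inner product between a kernel matrix and the "target" kernel built from the labels (following the standard kernel-target alignment definition; the paper's exact formulation may differ). A minimal sketch, where the linear toy kernel and the data are illustrative assumptions:

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Standard kernel-target alignment: <K, yy^T>_F normalized by
    the Frobenius norms of K and yy^T. Values lie in (0, 1] for
    positive semidefinite K; 1 means perfect alignment with labels."""
    Y = np.outer(y, y)                          # ideal target kernel from labels
    num = np.sum(K * Y)                         # Frobenius inner product <K, Y>
    den = np.sqrt(np.sum(K * K) * np.sum(Y * Y))
    return num / den

# Toy example (hypothetical data): linear kernel on two separable classes
X = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0], [-0.8, -0.2]])
y = np.array([1, 1, -1, -1])
K = X @ X.T                                     # linear kernel matrix
a = kernel_target_alignment(K, y)
```

In a rescaling setting such as KTA-SVM, each feature's scaling factor would be chosen to increase this alignment score; the sketch above only shows the alignment measure itself, not the paper's optimization procedure.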
Funding
Supported by the National Natural Science Foundation of China (Nos. 30625012 and 60721003).