
Doubly feature-weighted fuzzy support vector machine

Cited by: 4
Abstract: Current feature-weighted Fuzzy Support Vector Machines (FSVM) consider only the influence of feature weights on the membership function, while ignoring the application of feature weights to the kernel function computation during training. To address this shortcoming, a new FSVM algorithm that applies feature weighting to both the membership function and the kernel function, called Doubly Feature-Weighted FSVM (DFW-FSVM), was proposed. First, the weight of each feature is computed using Information Gain (IG). Second, the feature-weighted Euclidean distance between each sample and its class center is computed in the original space; this weighted distance is used to construct the membership function, and the feature weights are also applied to the kernel function computation during training. Finally, the DFW-FSVM algorithm is constructed from the weighted membership function and the weighted kernel function. In this way, DFW-FSVM avoids being dominated by weakly relevant or irrelevant features. Comparative experiments on eight UCI datasets show that, compared with the best results of five baseline algorithms (SVM, FSVM, Feature-Weighted SVM (FWSVM), Feature-Weighted FSVM (FWFSVM), and FSVM based on Centered Kernel Alignment (CKA-FSVM)), DFW-FSVM improves accuracy and F1 score by 2.33 and 5.07 percentage points respectively, demonstrating good classification performance.
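The pipeline described in the abstract can be sketched as follows. This is an illustrative approximation, not the authors' implementation: information gain is approximated here by scikit-learn's `mutual_info_classif`, the fuzzy membership enters the SVM through per-sample penalty weights (`sample_weight` scales the penalty C per sample), and the dataset and hyperparameters are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC

# Binary toy problem (two Iris classes) standing in for a UCI dataset.
X, y = load_iris(return_X_y=True)
mask = y < 2
X, y = X[mask], y[mask]

# Step 1: per-feature weights. IG is approximated by mutual information,
# normalized so the weights sum to 1.
w = mutual_info_classif(X, y, random_state=0)
w = w / w.sum()

def membership(X, y, w, delta=1e-6):
    """Fuzzy membership s_i = 1 - d_i / (d_max + delta), where d_i is the
    feature-weighted Euclidean distance from sample i to its class center."""
    s = np.empty(len(y))
    for c in np.unique(y):
        idx = (y == c)
        center = X[idx].mean(axis=0)
        d = np.sqrt((((X[idx] - center) ** 2) * w).sum(axis=1))
        s[idx] = 1.0 - d / (d.max() + delta)
    return s

# Step 2: membership from the weighted distance in the original space.
s = membership(X, y, w)

# Step 3: feature-weighted RBF kernel. Scaling each feature by sqrt(w_k)
# makes the standard RBF kernel compute exp(-gamma * sum_k w_k (x_k - z_k)^2).
Xw = X * np.sqrt(w)

# Step 4: fuzzy SVM — memberships act as per-sample penalty weights.
clf = SVC(kernel="rbf", gamma="scale", C=1.0)
clf.fit(Xw, y, sample_weight=s)
acc = clf.score(Xw, y)
```

The key point of the "doubly weighted" idea is that the same IG-derived weights appear twice: once in the membership function (Step 2) and once inside the kernel's distance computation (Step 3), rather than in the membership function alone.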
Authors: QIU Yunzhi; WANG Tinghua; DAI Xiaolu (School of Mathematics and Computer Science, Gannan Normal University, Ganzhou, Jiangxi 341000, China)
Source: Journal of Computer Applications (《计算机应用》), indexed in CSCD and the Peking University Core list, 2022, Issue 3, pp. 683-687 (5 pages)
Funding: National Natural Science Foundation of China (61966002)
Keywords: Fuzzy Support Vector Machine (FSVM); feature weighting; Information Gain (IG); kernel function; membership function
