Journal articles
Found 2 articles
1. One-against-all-based Hellinger distance decision tree for multiclass imbalanced learning
Authors: Minggang DONG, Ming LIU, Chao JING. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2022, No. 2, pp. 278-290 (13 pages)
Since traditional machine learning methods are sensitive to skewed distribution and do not consider the characteristics in multiclass imbalance problems, the skewed distribution of multiclass data poses a major challenge to machine learning algorithms. To tackle such issues, we propose a new splitting criterion of the decision tree based on the one-against-all-based Hellinger distance (OAHD). Two crucial elements are included in OAHD. First, the one-against-all scheme is integrated into the process of computing the Hellinger distance in OAHD, thereby extending the Hellinger distance decision tree to cope with the multiclass imbalance problem. Second, for the multiclass imbalance problem, the distribution and the number of distinct classes are taken into account, and a modified Gini index is designed. Moreover, we give theoretical proofs for the properties of OAHD, including skew insensitivity and the ability to seek a purer node in the decision tree. Finally, we collect 20 public real-world imbalanced data sets from the Knowledge Extraction based on Evolutionary Learning (KEEL) repository and the University of California, Irvine (UCI) repository. Experimental and statistical results show that OAHD significantly improves performance compared with five other well-known decision trees in terms of Precision, F-measure, and multiclass area under the receiver operating characteristic curve (MAUC). Moreover, the Friedman and Nemenyi tests are used to confirm the advantage of OAHD over the five other decision trees.
Keywords: decision trees; multiclass imbalanced learning; node splitting criterion; Hellinger distance; one-against-all scheme
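The abstract describes integrating the one-against-all scheme into the Hellinger-distance computation of each candidate split. The Python sketch below illustrates that idea for a binary split; the unweighted average over classes and all other implementation details are assumptions made for illustration, not the OAHD criterion as specified in the paper.

```python
# A minimal sketch of a one-against-all (OVA) Hellinger split score, assuming
# a binary split and an unweighted average over classes; not the paper's
# exact OAHD criterion.
import numpy as np

def ova_hellinger_score(y_left, y_right, classes):
    """Average, over all classes, the Hellinger distance between the
    "class c" and "rest" distributions across the two child nodes."""
    y_left, y_right = np.asarray(y_left), np.asarray(y_right)
    scores = []
    for c in classes:
        # One-against-all partition: class c vs. every other class.
        n_pos = np.sum(y_left == c) + np.sum(y_right == c)
        n_neg = np.sum(y_left != c) + np.sum(y_right != c)
        if n_pos == 0 or n_neg == 0:
            continue  # one side of the OVA partition is empty
        dist = 0.0
        for branch in (y_left, y_right):
            p_pos = np.sum(branch == c) / n_pos   # P(branch | class c)
            p_neg = np.sum(branch != c) / n_neg   # P(branch | rest)
            dist += (np.sqrt(p_pos) - np.sqrt(p_neg)) ** 2
        scores.append(np.sqrt(dist))
    return float(np.mean(scores)) if scores else 0.0

# A split with a larger score separates each class from the rest more cleanly.
print(ova_hellinger_score([0, 0, 1, 2], [1, 1, 2, 2, 2], classes=[0, 1, 2]))
```

Because the per-class distances depend only on relative branch proportions, a score of this form stays comparatively insensitive to how skewed the class counts are, which is the property the paper proves for OAHD.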
2. An improved multi-label classification method based on fuzzy support vector machines (cited by 1)
Authors: GUO Chenchen, ZHU Hongkang. Journal of Gansu Sciences, 2017, No. 6, pp. 6-10 (5 pages)
Multi-label classification with one-against-all support vector machines suffers from samples being assigned to "undefined" regions, for which no label can be obtained from the training set, and to ambiguous regions that lack a clear decision function. To address this, an improved multi-label classification method based on fuzzy support vector machines (FSVMi) is proposed. The method merges multiple decision boundaries and assigns a corresponding membership function to each label class. Experimental results show that the proposed method outperforms existing methods.
Keywords: one-against-all; fuzzy support vector machine; multi-label classification; decision boundary; membership function
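The abstract notes that a hard one-against-all SVM vote leaves "undefined" regions (no classifier claims the sample) and ambiguous regions (several do). The sketch below illustrates the general idea of attaching a membership degree to each label; the sigmoid membership function, the 0.5 threshold, and the highest-membership fallback are illustrative assumptions and do not reproduce the paper's FSVMi construction.

```python
# A minimal sketch (assumed details, not the FSVMi algorithm from the paper)
# of one-against-all SVMs whose signed decision values are mapped to
# per-label membership degrees, so samples falling in "undefined" or
# ambiguous regions still receive graded label scores.
import numpy as np
from sklearn.svm import SVC

class FuzzyOvaMultiLabel:
    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.models = []

    def fit(self, X, Y):
        # Y is a binary indicator matrix of shape (n_samples, n_labels):
        # one one-against-all SVM is trained per label column.
        self.models = [SVC(kernel="rbf").fit(X, Y[:, j])
                       for j in range(Y.shape[1])]
        return self

    def membership(self, X):
        # Map each label's signed distance to its decision boundary through a
        # sigmoid; this particular membership function is an assumption.
        d = np.column_stack([m.decision_function(X) for m in self.models])
        return 1.0 / (1.0 + np.exp(-d))

    def predict(self, X):
        mu = self.membership(X)
        Y_hat = (mu >= self.threshold).astype(int)
        # Guarantee at least one label by falling back to the highest
        # membership, which resolves the "undefined" region left by a
        # hard one-against-all vote.
        empty = Y_hat.sum(axis=1) == 0
        Y_hat[empty, mu[empty].argmax(axis=1)] = 1
        return Y_hat
```

In this sketch, ambiguous regions where several classifiers fire are handled by comparing membership degrees rather than contradictory hard votes, which is the behaviour the abstract attributes to assigning a membership function per label class.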