Journal Articles
2 articles found
1. One-against-all-based Hellinger distance decision tree for multiclass imbalanced learning
Authors: Minggang DONG, Ming LIU, Chao JING. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2022, No. 2, pp. 278-290.
Abstract: Since traditional machine learning methods are sensitive to skewed distributions and do not consider the characteristics of multiclass imbalance problems, the skewed distribution of multiclass data poses a major challenge to machine learning algorithms. To tackle such issues, we propose a new decision-tree splitting criterion based on the one-against-all-based Hellinger distance (OAHD). Two crucial elements are included in OAHD. First, the one-against-all scheme is integrated into the computation of the Hellinger distance, extending the Hellinger distance decision tree to cope with the multiclass imbalance problem. Second, a modified Gini index is designed for the multiclass imbalance problem that takes into account the distribution and the number of distinct classes. Moreover, we give theoretical proofs of the properties of OAHD, including skew insensitivity and the ability to seek purer nodes in the decision tree. Finally, we collect 20 public real-world imbalanced data sets from the Knowledge Extraction based on Evolutionary Learning (KEEL) repository and the University of California, Irvine (UCI) repository. Experimental and statistical results show that OAHD significantly outperforms five other well-known decision trees in terms of precision, F-measure, and multiclass area under the receiver operating characteristic curve (MAUC); the Friedman and Nemenyi tests confirm its advantage over the five other decision trees.
Keywords: decision trees; multiclass imbalanced learning; node splitting criterion; Hellinger distance; one-against-all scheme
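The splitting criterion sketched below follows the standard Hellinger distance decision tree (HDDT) formulation for a binary problem, with a one-against-all extension in the spirit of the abstract. The aggregation step (a simple average of the per-class distances) is an illustrative assumption, not necessarily the exact OAHD formula from the paper.

```python
import math

def hellinger_split_value(pos_counts, neg_counts):
    """HDDT-style Hellinger distance between the class-conditional
    branch distributions of a candidate split (binary problem).

    pos_counts[j] / neg_counts[j]: number of positive / negative
    training examples that fall into branch j of the split.
    """
    total_pos = sum(pos_counts)
    total_neg = sum(neg_counts)
    value = 0.0
    for p, n in zip(pos_counts, neg_counts):
        value += (math.sqrt(p / total_pos) - math.sqrt(n / total_neg)) ** 2
    return math.sqrt(value)

def one_against_all_value(branch_counts_per_class):
    """One-against-all extension: treat each class in turn as the
    positive class and the union of the remaining classes as the
    negative class, then aggregate the per-class Hellinger distances.
    Averaging is an assumed aggregation, chosen for illustration.
    """
    classes = list(branch_counts_per_class)
    scores = []
    for c in classes:
        pos = branch_counts_per_class[c]
        neg = [sum(branch_counts_per_class[o][j] for o in classes if o != c)
               for j in range(len(pos))]
        scores.append(hellinger_split_value(pos, neg))
    return sum(scores) / len(scores)
```

A perfectly separating split (all positives in one branch, all negatives in the other) reaches the maximum value of sqrt(2); a split that leaves both branches with identical class proportions scores 0, which is what makes the criterion insensitive to how skewed the overall class counts are.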
2. Derivation of quantum Chernoff metric with perturbation expansion method
Authors: Wei ZHONG, Jian MA, Jing LIU, Xiaoguang WANG. Chinese Physics B (SCIE, EI, CAS, CSCD), 2014, No. 9, pp. 81-86.
Abstract: We investigate a measure of distinguishability defined by the quantum Chernoff bound, which naturally induces the quantum Chernoff metric over a manifold of quantum states. Based on a quantum statistical model, we alternatively derive this metric by means of a perturbation expansion. Moreover, we show that the quantum Chernoff metric coincides with the infinitesimal form of the quantum Hellinger distance and reduces to a variant of the quantum Fisher information in the single-parameter case. We also give the exact form of the quantum Chernoff metric for a qubit system containing a single parameter.
Keywords: quantum Chernoff metric; Hellinger distance; perturbation expansion
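For reference, the two quantities the abstract relates take the following standard forms for density operators $\rho$ and $\sigma$ (the notation below is assumed for illustration, not quoted from the paper):

```latex
% Quantum Chernoff bound exponent between states \rho and \sigma:
\xi_{\mathrm{QCB}}(\rho,\sigma)
  = -\ln \min_{0 \le s \le 1} \operatorname{Tr}\!\left(\rho^{s}\sigma^{1-s}\right)

% Quantum Hellinger distance, the s = 1/2 point of the same trace functional:
d_{\mathrm{H}}^{2}(\rho,\sigma)
  = 2\left(1 - \operatorname{Tr}\sqrt{\rho}\,\sqrt{\sigma}\right)
```

Expanding either expression for two infinitesimally close states $\rho$ and $\rho + \mathrm{d}\rho$ yields a quadratic form in $\mathrm{d}\rho$, which is the metric the paper derives via perturbation expansion.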