Abstract: A Kullback-Leibler (KL) distance-based algorithm is presented to find matches between concepts from different ontologies. First, each concept is represented as a probability distribution estimated from its own instances. Then, the similarity of two concepts from different ontologies is measured by the KL distance between the corresponding distributions. Finally, the concept-mapping relationship between the ontologies is obtained. Compared with traditional instance-based algorithms, the computational complexity of the proposed algorithm is greatly reduced. Moreover, because it provides different estimation and smoothing methods for the concept distribution depending on the data type, it is suitable for mapping concepts with various data types. Experimental results on real-world ontology mapping illustrate the effectiveness of the proposed algorithm.
Funding: Partially supported by the National Natural Science Foundation of China (No. 60302009).
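The estimate-then-compare pipeline above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy vocabulary, the concept names, and the use of additive (Laplace) smoothing are all assumptions introduced here; the abstract only states that some smoothing method is chosen per data type.

```python
import math

def smoothed_distribution(counts, vocab, alpha=1.0):
    """Estimate a concept's term distribution from instance counts,
    with additive (Laplace) smoothing so the KL distance stays finite."""
    total = sum(counts.get(t, 0) for t in vocab) + alpha * len(vocab)
    return {t: (counts.get(t, 0) + alpha) / total for t in vocab}

def kl_distance(p, q):
    """D(p || q) = sum_t p(t) * log(p(t) / q(t))."""
    return sum(p[t] * math.log(p[t] / q[t]) for t in p)

# Hypothetical instance statistics for three concepts over a shared vocabulary.
vocab = {"engine", "wheel", "petal", "leaf"}
car   = smoothed_distribution({"engine": 40, "wheel": 60}, vocab)
plant = smoothed_distribution({"petal": 30, "leaf": 70}, vocab)
auto  = smoothed_distribution({"engine": 35, "wheel": 55}, vocab)

# A smaller KL distance indicates a better concept match.
assert kl_distance(car, auto) < kl_distance(car, plant)
```

Only the two estimated distributions are compared, which is why the cost is lower than traditional instance-based matchers that compare instance sets pairwise.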
Abstract: To relax target-aspect sensitivity and use more of the statistical information in High Range Resolution Profiles (HRRPs), the average range profile and the variance range profile are extracted together as feature vectors for representing both training and test data. A decision rule is then established for Automatic Target Recognition (ATR) based on the minimum Kullback-Leibler Distance (KLD) criterion. The recognition performance of the proposed method is comparable with that of the Adaptive Gaussian Classifier (AGC) with multiple test HRRPs, but the proposed method is much more computationally efficient. Experimental results on measured data show that the minimum-KLD classifier is effective.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61372172 and 61601518).
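The minimum-KLD decision rule can be sketched as below, under the assumption that each class is modeled per range cell by an independent Gaussian whose mean and variance come from the average and variance range profiles; the closed-form diagonal-Gaussian KLD and the 5-cell toy templates are illustrative assumptions, not the paper's measured data.

```python
import numpy as np

def gaussian_kld(mu_p, var_p, mu_q, var_q):
    """Closed-form KLD between two diagonal Gaussians, summed over range cells:
    0.5 * sum( log(var_q/var_p) + (var_p + (mu_p-mu_q)^2)/var_q - 1 )."""
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )

def classify(test_mu, test_var, templates):
    """Assign the class whose (average, variance) profile template
    minimizes the KLD to the test profiles."""
    return min(
        templates,
        key=lambda c: gaussian_kld(test_mu, test_var, *templates[c]),
    )

# Hypothetical 5-cell (average, variance) templates for two target classes.
templates = {
    "A": (np.array([1.0, 2.0, 3.0, 2.0, 1.0]), np.full(5, 0.2)),
    "B": (np.array([3.0, 1.0, 1.0, 1.0, 3.0]), np.full(5, 0.2)),
}
test_mu  = np.array([1.1, 1.9, 2.8, 2.1, 1.0])
test_var = np.full(5, 0.25)
assert classify(test_mu, test_var, templates) == "A"
```

The rule needs only one KLD evaluation per stored class template, which is where the computational saving over the AGC with multiple test HRRPs would come from.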
Abstract: Accurate reconstruction from a reduced data set is essential for computed tomography in fast and/or low-dose imaging applications. Conventional total variation (TV)-based algorithms apply L1 norm-based penalties, which are not as efficient as Lp (0&lt;p&lt;1) quasi-norm-based penalties. TV with a p-th power-based norm can serve as a feasible alternative to conventional TV, referred to as total p-variation (TpV). This paper proposes a TpV-based reconstruction model and develops an efficient algorithm. The reconstruction model combines the total p-variation with the Kullback-Leibler (KL) data divergence, which has better noise-suppression capability than the often-used quadratic term. The proposed algorithm is derived by the alternating direction method (ADM), which offers a stable, efficient, and easily coded implementation. We apply the proposed method to reconstructions from very few projection views (7 views evenly acquired within 180°). The images reconstructed by the new method show clearer edges and higher numerical accuracy than those of the conventional TV method. Both simulations and real CT data experiments indicate that the proposed method is promising for practical applications.
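The two terms of the reconstruction objective can be sketched in isolation as below. This is a toy sketch only: the generalized KL form D(g, Af) = Σ(Af − g + g·log(g/Af)), the forward-difference gradient, the ε guards, and p = 0.5 are assumptions for illustration; the paper's ADM solver that actually minimizes TpV(f) + KL data divergence is not reproduced here.

```python
import numpy as np

def kl_data_divergence(g, Af):
    """Generalized KL data-fidelity term D(g, Af) = sum(Af - g + g*log(g/Af));
    zero when Af == g, and better matched to count-type noise than a quadratic."""
    g, Af = np.asarray(g, float), np.asarray(Af, float)
    terms = Af - g + np.where(g > 0, g * np.log(g / np.maximum(Af, 1e-12)), 0.0)
    return float(np.sum(terms))

def total_p_variation(f, p=0.5):
    """TpV of a 2-D image: sum of p-th powers of gradient magnitudes
    (forward differences, replicated boundary)."""
    gx = np.diff(f, axis=1, append=f[:, -1:])
    gy = np.diff(f, axis=0, append=f[-1:, :])
    return float(np.sum((np.hypot(gx, gy) + 1e-12) ** p))

g = np.array([4.0, 2.0, 6.0])                 # measured projection data
assert kl_data_divergence(g, g) < 1e-9        # exact match: zero divergence
assert kl_data_divergence(g, g + 1.0) > 0.0   # mismatch: positive divergence
```

With 0 &lt; p &lt; 1 the gradient-magnitude penalty grows sublinearly, so large edges are penalized relatively less than under the L1-based TV, which is consistent with the sharper edges reported in the abstract.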
Abstract: This paper considers a Kullback-Leibler distance (KLD) which is asymptotically equivalent to the KLD of Goutis and Robert [1] when the reference model (in comparison to a competing fitted model) is correctly specified and certain regularity conditions hold (cf. Akaike [2]). We derive the asymptotic property of this Goutis-Robert-Akaike KLD under those regularity conditions, and examine the impact on this asymptotic property when the conditions are only partially satisfied. Furthermore, a connection between the Goutis-Robert-Akaike KLD and a weighted posterior predictive p-value (WPPP) is established. Finally, both the Goutis-Robert-Akaike KLD and the WPPP are applied to compare models in various simulated examples as well as two cohort studies of diabetes.
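The underlying model-comparison idea, ranking competing fits by their KLD to a reference model, can be illustrated in the simplest closed-form case. This is a generic normal-vs-normal example, not the Goutis-Robert-Akaike construction itself; the reference and fitted parameter values are hypothetical.

```python
import math

def normal_kld(mu0, sd0, mu1, sd1):
    """Closed-form KLD D(N(mu0, sd0^2) || N(mu1, sd1^2)); a smaller value
    means the fitted model (mu1, sd1) is closer to the reference."""
    return (math.log(sd1 / sd0)
            + (sd0 ** 2 + (mu0 - mu1) ** 2) / (2 * sd1 ** 2) - 0.5)

# Reference model vs. two competing fitted models: prefer the smaller KLD.
ref = (0.0, 1.0)
fit_a, fit_b = (0.1, 1.05), (1.0, 2.0)
assert normal_kld(*ref, *fit_a) < normal_kld(*ref, *fit_b)
```

In the paper's setting the distance is instead evaluated between posterior quantities, which is what ties it to the weighted posterior predictive p-value; that machinery is beyond this sketch.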