A Kullback-Leibler Divergence for Bayesian Model Diagnostics
Authors: Chen-Pin Wang, Malay Ghosh. Open Journal of Statistics, 2011, No. 3, pp. 172-184 (13 pages).
This paper considers a Kullback-Leibler distance (KLD) which is asymptotically equivalent to the KLD of Goutis and Robert [1] when the reference model (in comparison to a competing fitted model) is correctly specified and certain regularity conditions hold (ref. Akaike [2]). We derive the asymptotic property of this Goutis-Robert-Akaike KLD under these regularity conditions. We also examine the impact of this asymptotic property when the regularity conditions are only partially satisfied. Furthermore, the connection between the Goutis-Robert-Akaike KLD and a weighted posterior predictive p-value (WPPP) is established. Finally, both the Goutis-Robert-Akaike KLD and the WPPP are applied to compare models using various simulated examples as well as two cohort studies of diabetes.
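For intuition about the quantity underlying the abstract, the sketch below computes the generic Kullback-Leibler divergence between two Gaussian models in closed form. This is not the paper's Goutis-Robert-Akaike KLD (which involves posterior quantities and regularity conditions not reproduced here); it is only a minimal illustration of how a KLD penalizes a mis-specified competing model relative to a reference model. The function name and the Gaussian setting are assumptions for illustration.

```python
import numpy as np

def kl_divergence_normal(mu0, sigma0, mu1, sigma1):
    """Closed-form KL( N(mu0, sigma0^2) || N(mu1, sigma1^2) ).

    Illustrative only: treats the first normal as the reference
    model and the second as the competing fitted model.
    """
    return (np.log(sigma1 / sigma0)
            + (sigma0**2 + (mu0 - mu1)**2) / (2 * sigma1**2)
            - 0.5)

# The divergence of a model from itself is zero.
print(kl_divergence_normal(0.0, 1.0, 0.0, 1.0))  # 0.0
# A competing model with a shifted mean incurs a positive divergence.
print(kl_divergence_normal(0.0, 1.0, 1.0, 1.0))  # 0.5
```

A larger divergence flags a worse-fitting competing model, which is the diagnostic role the KLD plays in the paper.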
Keywords: Kullback-Leibler distance; model diagnostic; weighted posterior predictive p-value