Abstract
This paper is concerned with three procedures for deriving noninformative prior distributions. By means of the entropy inequality, we simplify the proof of a theorem of Zellner (1984) on the maximal data information prior and establish its uniqueness. On this basis, a generalized maximal entropy prior is proposed, which improves the classical maximal entropy principle in some respects. For the maximal relative prior (i.e., the Lindley criterion), an intermediate solution is developed, from which the maximal relative prior densities for a great number of common distribution families are derived.
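For reference alongside the abstract, the following recalls the usual form of Zellner's maximal data information criterion; this is a standard statement in our own notation (f, π, I, G) and is not necessarily the formulation used in the paper. The prior $\pi$ on the parameter space $\Theta$ is chosen to maximize

\[
G[\pi] \;=\; \int_\Theta I(\theta)\,\pi(\theta)\,d\theta \;-\; \int_\Theta \pi(\theta)\,\ln\pi(\theta)\,d\theta,
\qquad
I(\theta) = \int f(x\mid\theta)\,\ln f(x\mid\theta)\,dx,
\]

and, subject to $\int_\Theta \pi(\theta)\,d\theta = 1$, the maximizing prior is $\pi(\theta)\propto\exp\{I(\theta)\}$.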
Source
《应用概率统计》
CSCD
Peking University Core Journal (北大核心)
1991, Issue 2, pp. 192-200 (9 pages)
Chinese Journal of Applied Probability and Statistics