Abstract
To address the computational cost of learning Bayesian network parameters with the EM algorithm on large data sets, a parallel EM algorithm (PL-EM) is proposed to speed up parameter learning for complex Bayesian networks. In the E step, PL-EM computes the posterior probabilities of the hidden variables and the expected sufficient statistics in parallel; in the M step, it exploits the conditional independence encoded in the Bayesian network and the decomposability of the complete-data likelihood to compute each local likelihood function in parallel. Experimental results show that PL-EM is an effective method for Bayesian network parameter learning on large data sets.
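The abstract describes the two levels of parallelism in PL-EM: the E step is parallelized over the data (posterior probabilities and expected sufficient statistics per record), and the M step is parallelized over the network's families (one local likelihood, i.e. one CPT, per node). The sketch below illustrates this structure on a deliberately tiny network with one hidden binary variable H and two observed binary children X1 and X2; the network, the function names (pl_em, e_step_chunk, m_step_family), and the use of Python's standard ProcessPoolExecutor are all illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the PL-EM idea (assumed tiny network H -> X1, H -> X2).
# Not the paper's code: structure and names are illustrative only.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def e_step_chunk(args):
    """E step on one chunk of records: posterior of H and expected counts."""
    chunk, p_h, p_x_given_h = args            # p_x_given_h shape: (2 children, 2 H states, 2 X states)
    counts_h = np.zeros(2)                    # expected counts for P(H)
    counts_x = np.zeros((2, 2, 2))            # expected counts for P(Xi | H)
    for x1, x2 in chunk:
        # unnormalized posterior over H given one observed record
        post = p_h * p_x_given_h[0, :, x1] * p_x_given_h[1, :, x2]
        post /= post.sum()
        counts_h += post
        counts_x[0, :, x1] += post
        counts_x[1, :, x2] += post
    return counts_h, counts_x

def m_step_family(counts):
    """M step for one family (one CPT): normalize its expected counts."""
    return counts / counts.sum(axis=-1, keepdims=True)

def pl_em(data, n_workers=4, n_iters=20, seed=0):
    rng = np.random.default_rng(seed)
    p_h = rng.dirichlet(np.ones(2))                       # P(H)
    p_x_given_h = rng.dirichlet(np.ones(2), size=(2, 2))  # P(Xi | H)
    chunks = np.array_split(data, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        for _ in range(n_iters):
            # E step: expected sufficient statistics, parallel over data chunks
            results = list(pool.map(e_step_chunk,
                                    [(c, p_h, p_x_given_h) for c in chunks]))
            counts_h = sum(r[0] for r in results)
            counts_x = sum(r[1] for r in results)
            # M step: each local likelihood (CPT) re-estimated independently,
            # parallel over families
            p_h, cpt_x1, cpt_x2 = pool.map(
                m_step_family, [counts_h, counts_x[0], counts_x[1]])
            p_x_given_h = np.stack([cpt_x1, cpt_x2])
    return p_h, p_x_given_h

if __name__ == "__main__":
    data = np.random.default_rng(1).integers(0, 2, size=(10000, 2))
    print(pl_em(data))
```

In a realistic setting the E step would run exact or approximate inference over the full network for each record, but the division of work stays the same: data chunks in the E step, one normalization per CPT in the M step.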
Source
《小型微型计算机系统》 (Journal of Chinese Computer Systems)
CSCD
Peking University Core Journal (北大核心)
2007, No. 11, pp. 1972-1975 (4 pages)
Funding
Supported by the National Natural Science Foundation of China (No. 60575023)
and the Doctoral Program Foundation of the Ministry of Education of China (No. 20050359012).