Variational learning for finite Beta-Liouville mixture models
Authors: LAI Yu-ping, ZHOU Ya-jian, PING Yuan, GUO Yu-cui, YANG Yi-xian. The Journal of China Universities of Posts and Telecommunications (EI, CSCD), 2014, Issue 2, pp. 98-103.
In this article, an improved variational inference (VI) framework for learning finite Beta-Liouville mixture models (BLM) is proposed for proportional data classification and clustering. Within the VI framework, some non-linear approximation techniques are adopted to obtain approximated variational objective functions, and analytical solutions are derived for the variational posterior distributions. Compared with the expectation maximization (EM) algorithm commonly used for learning mixture models, under-fitting and over-fitting can be prevented; furthermore, the parameters and the complexity of the mixture model (the model order) can be estimated simultaneously. Experiments on both synthetic and real-world data sets demonstrate the feasibility and advantages of the proposed method.
Keywords: variational inference, model selection, factorized approximation, Beta-Liouville distribution, mixture modeling
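For reference, a minimal sketch of the model family the abstract refers to, assuming the standard Beta-Liouville parameterization used in the mixture-modeling literature; the notation below is illustrative and not taken from the paper itself. A D-dimensional proportional vector $\vec{X}=(X_1,\dots,X_D)$ with $X_d>0$ and $\sum_{d=1}^{D}X_d<1$ follows a Beta-Liouville distribution with parameters $(\alpha_1,\dots,\alpha_D,\alpha,\beta)$:

\[
\mathrm{BL}(\vec{X}\mid\vec{\alpha},\alpha,\beta)=
\frac{\Gamma\!\left(\sum_{d=1}^{D}\alpha_d\right)\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}
\prod_{d=1}^{D}\frac{X_d^{\alpha_d-1}}{\Gamma(\alpha_d)}
\left(\sum_{d=1}^{D}X_d\right)^{\alpha-\sum_{d=1}^{D}\alpha_d}
\left(1-\sum_{d=1}^{D}X_d\right)^{\beta-1}.
\]

A finite mixture with $M$ components and mixing weights $\pi_j$ (with $\pi_j\ge 0$, $\sum_{j=1}^{M}\pi_j=1$) is then

\[
p(\vec{X}\mid\vec{\pi},\Theta)=\sum_{j=1}^{M}\pi_j\,\mathrm{BL}(\vec{X}\mid\vec{\alpha}_j,\alpha_j,\beta_j),
\]

and variational learning approximates the posterior over the latent component assignments $Z$, the mixing weights $\vec{\pi}$, and the component parameters $\Theta$ with a factorized distribution

\[
Q(Z,\vec{\pi},\Theta)=Q(Z)\,Q(\vec{\pi})\,Q(\Theta),
\]

chosen to maximize the evidence lower bound $\mathcal{L}(Q)=\mathbb{E}_{Q}[\ln p(\mathcal{X},Z,\vec{\pi},\Theta)]-\mathbb{E}_{Q}[\ln Q(Z,\vec{\pi},\Theta)]$. In this kind of scheme, components whose expected weights shrink toward zero are pruned, which is how the model order can be estimated jointly with the parameters rather than fixed in advance as in EM.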