
Variational learning for finite Beta-Liouville mixture models

Abstract: In this article, an improved variational inference (VI) framework for learning finite Beta-Liouville mixture models (BLM) is proposed for proportional data classification and clustering. Within the VI framework, non-linear approximation techniques are adopted to obtain approximated variational objective functions, and analytical solutions are derived for the variational posterior distributions. Compared to the expectation maximization (EM) algorithm commonly used for learning mixture models, underfitting and overfitting can be prevented. Furthermore, the parameters and the complexity of the mixture model (model order) can be estimated simultaneously. Experiments on both synthetic and real-world data sets demonstrate the feasibility and advantages of the proposed method.
Source: The Journal of China Universities of Posts and Telecommunications (EI, CSCD), 2014, No. 2: 98-103 (6 pages). Chinese title: 中国邮电高校学报 (English edition).
Funding: supported by the National Natural Science Foundation of China (61303232, 61363085, 61121061, 60972077) and the Hi-Tech Research and Development Program of China (2009AA01Z430).
Keywords: variational inference, model selection, factorized approximation, Beta-Liouville distribution, mixture modeling

References (9)

  • 1 McLachlan G J, Peel D. Finite mixture models. New York, NY, USA: Wiley, 2000.
  • 2 Bouguila N. Hybrid generative/discriminative approaches for proportional data modeling and classification. IEEE Transactions on Knowledge and Data Engineering, 2012, 24(12): 2184-2202.
  • 3 Corduneanu A, Bishop C M. Variational Bayesian model selection for mixture distributions. Proceedings of the 8th International Conference on Artificial Intelligence and Statistics (AISTATS'01), Jan 2001, Key West, FL, USA. San Francisco, CA, USA: Morgan Kaufmann, 2001: 27-34.
  • 4 Ma Z Y, Leijon A. Bayesian estimation of Beta mixture models with variational inference. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(11): 2160-2173.
  • 5 Bouguila N. Bayesian hybrid generative discriminative learning based on finite Liouville mixture models. Pattern Recognition, 2011, 44(6): 1183-1200.
  • 6 Fan W T, Bouguila N, Ziou D. Variational learning for finite Dirichlet mixture models and applications. IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(5): 762-774.
  • 7 Bishop C M. Pattern recognition and machine learning. New York, NY, USA: Springer-Verlag, 2006.
  • 8 Boyd S, Vandenberghe L. Convex optimization. Cambridge, UK: Cambridge University Press, 2004.
  • 9 Blei D M, Ng A Y, Jordan M I. Latent Dirichlet allocation. Journal of Machine Learning Research, 2003, 3: 993-1022.
