Journal Article

Dynamic cluster algorithm for Gaussian mixture parameter estimation
Cited by: 1
Abstract: The probability density of non-Gaussian samples can be well fitted by a Gaussian mixture model. Provided the component densities do not overlap, the dynamic cluster algorithm estimates the mixture parameters quickly and accurately. It is a recursion based on the principle of minimum mean-square deviation: all possible cluster boundaries are derived in a forward pass, and, since the right boundary of the final cluster is determinate, the boundaries of the preceding clusters are then recovered backwards one by one, yielding the Gaussian mixture parameter estimates. After describing the model and the estimation problem, the paper derives the dynamic cluster algorithm, examines its essential ideas and conditions of applicability, and analyzes its estimation performance with a numerical example.
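The paper's exact recursion is not reproduced in this record. As an illustrative sketch only, the forward-derivation/backward-traceback idea described in the abstract can be modeled by a standard O(kn²) dynamic-programming segmentation of sorted one-dimensional samples under the non-overlapping-components assumption: a forward pass tabulates the minimum within-cluster squared deviation for every candidate boundary, and the boundaries are recovered backwards from the fixed final one. All function names here are hypothetical, not from the paper.

```python
import numpy as np

def dynamic_cluster(x, k):
    """Illustrative sketch (not the paper's exact algorithm): partition sorted
    1-D samples into k contiguous clusters minimizing total within-cluster
    squared deviation, via a forward DP pass and a backward boundary
    traceback, then fit one Gaussian component per cluster."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)

    # Prefix sums give O(1) within-segment squared-deviation cost.
    s1 = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x * x)))

    def cost(i, j):
        # Sum of squared deviations of x[i:j] about its mean.
        m = j - i
        s = s1[j] - s1[i]
        return (s2[j] - s2[i]) - s * s / m

    # Forward pass: D[c][j] = best cost of the first j samples in c clusters.
    D = np.full((k + 1, n + 1), np.inf)
    B = np.zeros((k + 1, n + 1), dtype=int)  # traceback of left boundaries
    D[0][0] = 0.0
    for c in range(1, k + 1):
        for j in range(c, n + 1):
            for i in range(c - 1, j):
                v = D[c - 1][i] + cost(i, j)
                if v < D[c][j]:
                    D[c][j], B[c][j] = v, i

    # Backward recursion from the determinate final boundary (index n).
    bounds = [n]
    for c in range(k, 0, -1):
        bounds.append(B[c][bounds[-1]])
    bounds = bounds[::-1]

    # Fit (weight, mean, std) of one Gaussian component per cluster.
    return [((b - a) / n, x[a:b].mean(), x[a:b].std())
            for a, b in zip(bounds[:-1], bounds[1:])]
```

For well-separated components the recovered boundaries fall in the gaps between clusters, so the per-cluster sample moments approximate the mixture parameters; with overlapping components this simple segmentation degrades, consistent with the non-overlap condition stated in the abstract.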
Source: Technical Acoustics (《声学技术》), CSCD, Peking University Core Journal, 2007, No. 4: 741-746 (6 pages)
Funding: National 973 Program project (5132102ZZT32)
Keywords: Gaussian mixture; cumulative variance; dynamic cluster algorithm

References (3)

  • 1 Redner R A, Walker H F. Mixture densities, maximum likelihood, and the EM algorithm[J]. SIAM Review, 1984, 26: 195-209.
  • 2 D'Souza A A. Using EM to estimate a probability density with a mixture of Gaussians[DB/OL]. http://citeseer.ist.psu.edu, 2000-1.
  • 3 Zhao Yunxin, Zhuang Xinhua, Ting Shenjin. Gaussian mixture density modeling of non-Gaussian source for autoregressive process[J]. IEEE Transactions on Signal Processing, 1995, 43(4): 894-903.

Co-cited References (15)

  • 1 Nasri A, Schober R. Performance of BICM-SC and BICM-OFDM systems with diversity reception in non-Gaussian noise and interference[J]. IEEE Transactions on Communications, 2009, 57(11): 3316-3327.
  • 2 He J, Liu Z. Underwater acoustic azimuth and elevation angle estimation using spatial invariance of two identically oriented vector hydrophones at unknown locations in impulsive noise[J]. Digital Signal Processing, 2009, 19(3): 452-462.
  • 3 Bouguila N. Bayesian hybrid generative discriminative learning based on finite Liouville mixture models[J]. Pattern Recognition, 2011, 44(6): 1183-1200.
  • 4 Shen Y, Cornford D. Variational Markov chain Monte Carlo for Bayesian smoothing of non-linear diffusions[J]. Computational Statistics, 2012, 27(1): 149-176.
  • 5 Emtiyaz K M. An expectation-maximization algorithm for learning the latent Gaussian model with Gaussian likelihood[EB/OL]. (2011-04-22) [2012-06-13]. http://www.cs.ubc.ca/~emtiyaz/Writings/FA1.pdf.
  • 6 Attias H. A variational Bayesian framework for graphical models[C]// Advances in Neural Information Processing Systems 12. Cambridge, MA: [s.n.], 2000: 209-215.
  • 7 Vrettas M D, Cornford D, Opper M. Estimating parameters in stochastic systems: a variational Bayesian approach[J]. Physica D, 2011, 240(23): 1877-1900.
  • 8 Sun S J, Peng C L, Hou W S. Blind source separation with time series variational Bayes expectation maximization algorithm[J]. Digital Signal Processing, 2012, 22(1): 17-33.
  • 9 Huang Q H, Yang J, Zhou Y. Variational Bayesian method for speech enhancement[J]. Neurocomputing, 2007, 70(16-18): 3063-3067.
  • 10 Armagan A, Zaretzki R L. A note on mean-field variational approximations in Bayesian probit models[J]. Computational Statistics & Data Analysis, 2011, 55(1): 641-643.

Citing Articles (1)

Secondary Citing Articles (13)
