Abstract: This paper presents a new online incremental training algorithm for the Gaussian mixture model (GMM). The algorithm performs expectation-maximization (EM) training incrementally, updating the GMM parameters online, sample by sample, instead of waiting for a block of data of sufficient size before training begins, as in the traditional EM procedure. The proposed method extends the split-and-merge EM procedure, so it inherently retains the ability to escape from local maxima and to reduce the chance of singularities. The algorithm is further optimized for speech processing applications. Experiments on synthetic data demonstrate the advantages and efficiency of the new method, and results on a speech processing task confirm the improvement in system performance.
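The abstract describes updating GMM parameters online, sample by sample, rather than over a full data block. The paper's exact update equations and its split-and-merge extension are not reproduced here, so the sketch below shows only one standard way such an incremental EM update can be realized: each incoming sample contributes its responsibilities to per-component sufficient statistics, from which the parameters are re-estimated immediately. The class name OnlineGMM, its methods, and the diagonal-loading regularizer are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

class OnlineGMM:
    """Sketch of sample-by-sample (incremental) EM for a Gaussian mixture.

    Each call to update() performs an E-step for one sample and folds the
    resulting sufficient statistics into the model immediately, so the
    parameters evolve online rather than after a full batch pass.
    (Illustrative only; not the paper's algorithm.)
    """

    def __init__(self, means, covs, weights):
        self.means = np.asarray(means, dtype=float)      # (K, D)
        self.covs = np.asarray(covs, dtype=float)        # (K, D, D)
        self.weights = np.asarray(weights, dtype=float)  # (K,)
        K, D = self.means.shape
        # Accumulated sufficient statistics, one slot per component.
        self.n = np.zeros(K)            # total responsibility mass
        self.sx = np.zeros((K, D))      # sum of r * x
        self.sxx = np.zeros((K, D, D))  # sum of r * x x^T

    def _responsibilities(self, x):
        # Single-sample E-step: posterior p(component k | x).
        K, D = self.means.shape
        logp = np.empty(K)
        for k in range(K):
            diff = x - self.means[k]
            _, logdet = np.linalg.slogdet(self.covs[k])
            maha = diff @ np.linalg.solve(self.covs[k], diff)
            logp[k] = (np.log(self.weights[k])
                       - 0.5 * (D * np.log(2.0 * np.pi) + logdet + maha))
        logp -= logp.max()  # subtract max log-density for numerical stability
        r = np.exp(logp)
        return r / r.sum()

    def update(self, x, reg=1e-6):
        # Incremental M-step: fold one sample into the statistics, then
        # re-estimate weights, means, and covariances in closed form.
        x = np.asarray(x, dtype=float)
        r = self._responsibilities(x)
        self.n += r
        self.sx += r[:, None] * x
        self.sxx += r[:, None, None] * np.outer(x, x)
        for k in range(len(self.n)):
            if self.n[k] < 1e-12:
                continue  # component has seen essentially no data yet
            self.weights[k] = self.n[k] / self.n.sum()
            self.means[k] = self.sx[k] / self.n[k]
            cov = self.sxx[k] / self.n[k] - np.outer(self.means[k], self.means[k])
            # Diagonal loading: a common guard against the singular
            # covariances the abstract mentions.
            self.covs[k] = cov + reg * np.eye(self.means.shape[1])

# Toy usage: stream samples from a two-mode source into the model.
rng = np.random.default_rng(0)
gmm = OnlineGMM(means=[[-1.0], [1.0]],
                covs=[np.eye(1), np.eye(1)],
                weights=[0.5, 0.5])
for _ in range(2000):
    center = 3.0 if rng.random() < 0.5 else -3.0
    gmm.update(rng.normal(center, 1.0, size=1))
```

This plain accumulation of statistics is the simplest incremental-EM variant; a decaying step size (stepwise EM) or the split-and-merge moves the paper builds on would modify it. Those details are specific to the paper and are omitted here.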