Vector Approximate Message Passing with Sparse Bayesian Learning for Gaussian Mixture Prior (cited 2 times)
Authors: Chengyao Ruan, Zaichen Zhang, Hao Jiang, Jian Dang, Liang Wu, Hongming Zhang. China Communications, SCIE, CSCD, 2023, Issue 5, pp. 57-69 (13 pages).
Abstract: Compressed sensing (CS) seeks algorithms that recover a sparse vector from noisy linear observations. Various Bayesian algorithms have been proposed for this task, such as sparse Bayesian learning (SBL) and approximate message passing (AMP) based methods. SBL is accurate and robust, but its computational complexity is high because it requires a matrix inversion. AMP is fast, but its performance is guaranteed only under strict conditions on the measurement matrix, which limits its applicability to CS problems. To overcome these drawbacks, this paper presents a low-complexity algorithm for the single linear model that incorporates vector AMP (VAMP) into the SBL framework with expectation maximization (EM). Specifically, variance auto-tuning is applied within VAMP to implement the E step of SBL, which reduces the number of iterations required to converge compared with the VAMP-EM algorithm when a Gaussian mixture (GM) prior is used. Simulation results show that the proposed algorithm performs better and remains robust across various difficult measurement matrices.
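For orientation, the sketch below shows the building block the abstract refers to: plain VAMP iterations in which the prior (denoising) step uses a Gaussian-mixture MMSE denoiser. This is a minimal illustration under assumed settings, not the authors' implementation; the mixture parameters (pi, mu, tau), noise precision gamma_w, iteration count, and clipping constants are all assumptions, and the paper's EM loop with variance auto-tuning for learning the hyperparameters is omitted.

```python
# Minimal sketch: VAMP with a Gaussian-mixture (GM) prior denoiser.
# Illustrative only; hyperparameters are assumed, and the EM/auto-tuning
# layer described in the abstract is not included.
import numpy as np

def gm_denoiser(r, sigma2, pi, mu, tau):
    """MMSE denoiser for x ~ sum_k pi_k N(mu_k, tau_k), observed as r = x + N(0, sigma2).

    Returns the posterior mean and the average posterior variance."""
    r = r[:, None]                                   # (N, 1) against K components
    var_k = tau + sigma2                             # marginal variance of r per component
    log_w = np.log(pi) - 0.5 * np.log(2 * np.pi * var_k) - 0.5 * (r - mu) ** 2 / var_k
    log_w -= log_w.max(axis=1, keepdims=True)        # numerical stabilization
    w = np.exp(log_w)
    w /= w.sum(axis=1, keepdims=True)                # responsibilities per component
    post_var_k = tau * sigma2 / (tau + sigma2)       # per-component posterior variance
    post_mean_k = post_var_k * (r / sigma2 + mu / tau)
    x_hat = (w * post_mean_k).sum(axis=1)
    var = (w * (post_var_k + post_mean_k ** 2)).sum(axis=1) - x_hat ** 2
    return x_hat, var.mean()

def vamp_gm(y, A, gamma_w, pi, mu, tau, n_iter=30):
    """Standard VAMP iterations with the GM denoiser as the prior step."""
    M, N = A.shape
    AtA, Aty = A.T @ A, A.T @ y
    r1, gamma1 = np.zeros(N), 1e-3
    for _ in range(n_iter):
        # Denoising step under the GM prior
        x1, avg_var = gm_denoiser(r1, 1.0 / gamma1, pi, mu, tau)
        alpha1 = np.clip(gamma1 * avg_var, 1e-6, 1 - 1e-6)
        eta1 = gamma1 / alpha1
        gamma2 = eta1 - gamma1
        r2 = (eta1 * x1 - gamma1 * r1) / gamma2
        # LMMSE step using the linear model y = A x + w
        C = np.linalg.inv(gamma_w * AtA + gamma2 * np.eye(N))
        x2 = C @ (gamma_w * Aty + gamma2 * r2)
        alpha2 = np.clip(gamma2 * np.trace(C) / N, 1e-6, 1 - 1e-6)
        eta2 = gamma2 / alpha2
        gamma1 = eta2 - gamma2
        r1 = (eta2 * x2 - gamma2 * r2) / gamma1
    return x1

# Toy usage: a signal drawn from an assumed two-component GM prior
# (a near-zero "spike" component plus a broad active component).
rng = np.random.default_rng(0)
N, M = 200, 100
pi = np.array([0.9, 0.1]); mu = np.array([0.0, 0.0]); tau = np.array([1e-4, 1.0])
comp = rng.choice(2, size=N, p=pi)
x = rng.normal(mu[comp], np.sqrt(tau[comp]))
A = rng.normal(size=(M, N)) / np.sqrt(M)
gamma_w = 1e4
y = A @ x + rng.normal(scale=np.sqrt(1.0 / gamma_w), size=M)
x_hat = vamp_gm(y, A, gamma_w, pi, mu, tau)
print("NMSE (dB):", 10 * np.log10(np.sum((x_hat - x) ** 2) / np.sum(x ** 2)))
```

In the proposed method, this VAMP-style E step replaces the matrix-inversion posterior computation of SBL, while an outer EM loop (not shown) updates the GM hyperparameters.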
Keywords: sparse Bayesian learning; approximate message passing; compressed sensing; expectation propagation