Abstract
It is known that, if the frequency intervals of line spectral signals fall within c-fold (1<c<2) the main-lobe width of the observation data window, the estimation accuracy of the gridless sparse iterative covariance-based estimation (SPICE) algorithm, also known as GLS, degrades. To solve this problem, a new outlier-robust, grid-free spectral estimation (ORGFSE) algorithm for line spectral estimation is proposed in this paper. The algorithm first describes the frequency estimation errors of GLS as an error vector. Exploiting the robustness of the l1 norm to outliers, l1-norm constraints are then applied to the fitting errors of the signal amplitudes and of the error vector, respectively; both are minimized by alternating iterations, yielding a joint robust estimate of the signal amplitudes and the error vector. The convergence of the proposed algorithm is proved and its computational complexity is analyzed. Numerical simulation results verify the effectiveness of the proposed algorithm.
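The core robustness mechanism described in the abstract is the use of the l1 norm on fitting residuals, so that a few gross outliers do not bias the amplitude estimates. The sketch below is a hypothetical illustration of that principle only, not the paper's ORGFSE implementation: it fits the in-phase/quadrature amplitudes of a single tone under an l1 residual norm via iteratively reweighted least squares (IRLS), and compares the result with an ordinary l2 fit when the data contain gross outliers. The tone frequency, sample count, and outlier magnitudes are all made up for the demonstration.

```python
import numpy as np

def irls_l1_fit(A, y, n_iter=100, eps=1e-8):
    """Approximately minimize ||y - A x||_1 via iteratively reweighted
    least squares: weights ~ 1/|residual| so large (outlier) residuals
    are down-weighted, unlike in an l2 fit."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]      # l2 warm start
    for _ in range(n_iter):
        r = y - A @ x
        w = 1.0 / np.maximum(np.abs(r), eps)      # IRLS weights for the l1 norm
        sw = np.sqrt(w)
        x = np.linalg.lstsq(A * sw[:, None], sw * y, rcond=None)[0]
    return x

rng = np.random.default_rng(0)
N, f = 128, 0.1                                   # samples, tone frequency (made up)
t = np.arange(N)
A = np.column_stack([np.cos(2 * np.pi * f * t),   # in-phase basis
                     np.sin(2 * np.pi * f * t)])  # quadrature basis
x_true = np.array([2.0, -1.0])                    # true amplitudes
y = A @ x_true
y[rng.choice(N, 6, replace=False)] += 15.0        # inject a few gross outliers

x_l2 = np.linalg.lstsq(A, y, rcond=None)[0]       # plain least squares (biased)
x_l1 = irls_l1_fit(A, y)                          # l1 fit (robust to outliers)
```

On this toy problem the l1 fit essentially ignores the corrupted samples and recovers the true amplitudes, while the l2 fit is pulled toward the outliers; the paper applies the same norm choice to both the amplitude and the frequency-error fitting terms, alternating between the two.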
Source
Journal of Fudan University: Natural Science (《复旦学报(自然科学版)》)
Indexed in: CAS, CSCD, Peking University Core (北大核心)
2018, No. 1, pp. 92-99, 106 (9 pages)
Funding
National Natural Science Foundation of China (61571131)
Keywords
gridless spectral estimation
outliers
l1 norm