Application of sparse subspace based particle filtering in target tracking

Cited by: 3
Abstract  To address the low robustness of target tracking algorithms to optical-flow changes, intrinsic deformation of the target, and appearance variation, compressed sensing theory and sparse subspace representation were applied to target tracking. The tracked target is compressively sampled; since, by the principles of compressed sensing, the sampled signal retains the information of the original with high probability, the measurements provide a sufficient basis for tracking. Robust principal component analysis (robust PCA, RPCA) is used to extract the sparse principal components of the compressively sampled target. The tracking process is carried out within the particle filtering framework, estimating the current target state from its state estimates over the preceding time sequence. The sparse subspace in which the target lies is updated according to the similarity between the subspace and the candidate target. Experimental results show that the proposed algorithm is robust and satisfies real-time requirements.
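The pipeline summarized in the abstract (compressive sampling of the target patch, a sparse subspace obtained via RPCA, particle-filter state estimation, and similarity-driven subspace updating) can be illustrated with a minimal Python sketch. This is not the authors' implementation: the patch size, measurement dimension, particle count, random-walk motion model, and the helper names compress, subspace_likelihood and particle_filter_step are illustrative assumptions, and the RPCA step that would supply the subspace basis U and mean vector is assumed to be computed elsewhere.

import numpy as np

# Minimal illustrative sketch (not the authors' code). Assumed values: a 32x32
# target patch, 128 compressive measurements, 300 particles, and a random-walk
# motion model. The sparse subspace basis U and mean vector are assumed to come
# from an RPCA decomposition computed elsewhere.

rng = np.random.default_rng(0)

PATCH_DIM = 32 * 32        # vectorized target patch (assumption)
COMPRESSED_DIM = 128       # measurement dimension after compressive sampling (assumption)
N_PARTICLES = 300          # particle count (assumption)

# Compressed sensing: a random Gaussian measurement matrix preserves the
# information content of the patch with high probability.
Phi = rng.standard_normal((COMPRESSED_DIM, PATCH_DIM)) / np.sqrt(COMPRESSED_DIM)

def compress(patch_vec):
    # Compressively sample a vectorized image patch.
    return Phi @ patch_vec

def subspace_likelihood(y, U, mean, sigma2=0.1):
    # Score a compressed candidate by its reconstruction error in the sparse
    # subspace spanned by the columns of U, converted to a likelihood.
    d = y - mean
    residual = d - U @ (U.T @ d)
    return np.exp(-np.dot(residual, residual) / sigma2)

def particle_filter_step(particles, observe, U, mean, motion_std=4.0):
    # One predict-update-resample cycle of a bootstrap particle filter.
    # particles: (N, 2) candidate target centres; observe: callable mapping a
    # centre to the compressed feature vector of the patch at that location.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)   # predict
    weights = np.array([subspace_likelihood(observe(p), U, mean) for p in particles])
    weights = (weights + 1e-12) / (weights + 1e-12).sum()                  # update
    idx = rng.choice(len(particles), size=len(particles), p=weights)       # resample
    state = particles[idx].mean(axis=0)                                    # state estimate
    return particles[idx], state

A caller would run particle_filter_step once per frame and, as described in the abstract, refresh or re-estimate U whenever the similarity between the current subspace and the accepted candidate falls, so that the model follows appearance changes.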
Authors  贺若彬 (HE Ruo-bin), 武德安 (WU De-an), 吴磊 (WU Lei), 岳翰 (YUE Han) (College of Mathematical Science, University of Electronic Science and Technology of China, Chengdu 611731, China)
Source  Computer Engineering and Design (《计算机工程与设计》, Peking University core journal list), 2016, No. 11, pp. 3080-3085 (6 pages)
Funding  National Natural Science Foundation of China (71501025)
Keywords  target tracking; compressed sensing; sparse subspace; sparse representation; particle filtering; invariant features; robust principal component analysis (RPCA)