
Particle Filter Tracking Algorithm Based on Corner Points for Sparse Subspace
(基于角点的稀疏子空间粒子滤波跟踪算法)
Cited by: 1
Abstract: Most existing sparse-representation tracking methods are computationally expensive; to build the trivial templates they simply divide the whole image patch into a sequence of small blocks, which introduces background information and easily leads to drift or even loss of the target. To address this, a corner-based sparse subspace particle filter tracking method is proposed. Feature points are extracted with the Shi-Tomasi corner detector, and the image block containing the most feature points is used to build the trivial template. For candidates with high similarity, the reconstruction error is computed with the basis vectors alone and the basis-vector subspace is updated incrementally; otherwise the orthogonal basis vectors are combined with the trivial templates to form a dictionary, weights descending from high to low are applied in the error computation, and the trivial templates are updated at the same time. Qualitative and quantitative experiments show that, compared with several current tracking algorithms, the proposed algorithm tracks better and is more practical in complicated situations such as occlusion, rotation, scale change, fast motion and illumination change.
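The abstract describes two concrete steps: selecting the sub-block with the most Shi-Tomasi corners as the trivial template, and computing a reconstruction error either against the basis alone (high similarity) or against a dictionary of basis vectors plus trivial templates with descending weights. The sketch below, in Python with OpenCV and NumPy, is a minimal illustration under stated assumptions; the grid size, detector parameters, weighting scheme, and the plain least-squares stand-in for the sparse coding step are illustrative, not taken from the paper.

```python
# Minimal sketch (not the authors' code) of two steps described in the abstract:
# (1) picking the sub-block with the most Shi-Tomasi corners as the trivial
#     template, and (2) the reconstruction-error computation.
import cv2
import numpy as np

def select_trivial_template(patch, grid=(4, 4)):
    """Split a grayscale target patch into grid sub-blocks and return the
    sub-block containing the most Shi-Tomasi corners."""
    h, w = patch.shape
    bh, bw = h // grid[0], w // grid[1]
    best_block, best_count = None, -1
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = patch[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            corners = cv2.goodFeaturesToTrack(block, maxCorners=50,
                                              qualityLevel=0.01, minDistance=3)
            count = 0 if corners is None else len(corners)
            if count > best_count:
                best_block, best_count = block, count
    return best_block

def basis_error(y, U):
    """Reconstruction error of candidate y against an orthonormal basis U,
    used when the similarity to the subspace is high."""
    return np.linalg.norm(y - U @ (U.T @ y)) ** 2

def dictionary_error(y, U, trivial, weights):
    """Error against the dictionary [basis | trivial templates]; descending
    weights penalize the trivial coefficients. The abstract does not give the
    exact solver or weighting, so ordinary least squares is used here."""
    D = np.hstack([U, trivial])
    c, *_ = np.linalg.lstsq(D, y, rcond=None)
    k = U.shape[1]
    return np.linalg.norm(y - D @ c) ** 2 + np.sum(weights * np.abs(c[k:]))
```

In use, a candidate patch would be flattened to a column vector y; when the similarity test passes, only basis_error is evaluated and the subspace U is updated incrementally, otherwise dictionary_error is used and the trivial template is refreshed from the corner-richest block.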
Authors: Wang Xuyang; Zhu Zhilin (College of Computer and Communication, Lanzhou University of Technology, Lanzhou 730050, Gansu, China)
Source: Computer Applications and Software (《计算机应用与软件》, Peking University core journal), 2018, Issue 9, pp. 236-241 (6 pages)
Funding: National Natural Science Foundation of China (61563030)
Keywords: Object tracking; Sparse representation; Subspace; Particle filter; Feature points

