
Class-agnostic Counting Based on the Transformer
Abstract: To address the problems of complex models, large parameter counts, and inconsistent feature spaces in traditional class-agnostic counting (CAC) methods, a Transformer-based CAC method is proposed that performs feature extraction and similarity matching simultaneously. Three modules are introduced: an Extract-and-Match module, a Patch Embedding module, and a Scale Embedding module. The Extract-and-Match module uses integrated self-attention to simplify the pipeline and unify the feature space of the query image and the exemplars; the Patch Embedding module improves counting efficiency and localization ability; and the Scale Embedding module compensates for the size information lost when the given exemplars are resized. Experiments show that, compared with CounTR, the proposed method reduces the mean absolute error (MAE) by 8.45% and the root mean square error (RMSE) by 5.38% on the FSC-147 validation set, improving both effectiveness and generalization.
Authors: CHENG Ziyuan; WANG Guodong (College of Computer Science & Technology, Qingdao University, Qingdao 266071, China)
Source: Journal of Qingdao University (Engineering & Technology Edition), CAS, 2024, No. 2, pp. 17-23 (7 pages)
Funding: Qingdao Natural Science Foundation project (23-2-1-163-zyyd-jch).
Keywords: class-agnostic counting; Transformer; self-attention mechanism; extract-and-match paradigm
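As a rough illustration of the extract-and-match idea described in the abstract (not the paper's actual implementation), one self-attention pass can be run over the concatenated query-patch and exemplar tokens, so that feature extraction and query-exemplar similarity matching happen jointly in a shared feature space, while a scale embedding re-injects each exemplar's original size lost to resizing. All dimensions, the random projections, and the linear scale encoding below are assumptions for the sketch:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    """Single-head self-attention over the full token sequence.

    Because query-image patches and exemplar tokens sit in the same
    sequence, attention scores double as query-exemplar similarities:
    extraction and matching happen in one pass, in one feature space.
    """
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d), axis=-1)
    return attn @ V

rng = np.random.default_rng(0)
d = 32                                         # hypothetical embedding dim
query_tokens = rng.standard_normal((196, d))   # e.g. 14x14 patch embeddings
exemplar_tokens = rng.standard_normal((3, d))  # 3 exemplar crops, resized

# Scale embedding (assumed form): project each exemplar's original
# (width, height) in pixels and add it, restoring size information
# that resizing to a fixed input resolution destroyed.
orig_sizes = np.array([[24.0, 30.0], [50.0, 48.0], [12.0, 14.0]])
W_scale = rng.standard_normal((2, d)) * 0.1
exemplar_tokens = exemplar_tokens + (orig_sizes / 100.0) @ W_scale

# Extract-and-Match: one joint self-attention pass over the
# concatenated sequence of query patches and exemplars.
tokens = np.concatenate([query_tokens, exemplar_tokens], axis=0)
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
out = self_attention(tokens, Wq, Wk, Wv)
print(out.shape)  # (199, 32): 196 query tokens + 3 exemplar tokens
```

In a real model the output query tokens would feed a decoder that regresses a density map, which is summed to obtain the count; the sketch only shows why joint attention keeps query and exemplar features in one space.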