Abstract
To address the high computational complexity and low detection efficiency of Transformer-based detection models, a lightweight dynamic Transformer object detection algorithm is proposed. First, a dynamic gate strategy is introduced into the self-attention module to filter out the important attention regions, and a local-to-global dynamic sparse self-attention mechanism is designed, which enhances the multi-scale generalization capability of the model while reducing the computational load. Second, a dynamic layer-skipping mechanism is introduced at the structural level of the model, so that during inference the model can adaptively adjust its parameters and structure according to the input, achieving a better trade-off between detection speed and accuracy. Experimental results show that the improved detection model effectively reduces computational redundancy, is more efficient than existing baseline models, and offers broader prospects for practical application.
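The paper itself is not reproduced in this record, but the two ideas named in the abstract (a dynamic gate that keeps only the most relevant attention regions, and input-dependent layer skipping) can be sketched roughly as follows. This is an illustrative NumPy sketch only: the function names, the top-k gating rule, and the sigmoid skip criterion are assumptions standing in for the authors' actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_sparse_attention(q, k, v, keep=4):
    """Sparse self-attention with a dynamic gate (illustrative only).

    For each query, only the `keep` highest-scoring key positions pass
    the gate; all other attention scores are masked to -inf, so they
    receive zero attention weight and need not be computed in a real
    sparse implementation.
    """
    scores = q @ k.T / np.sqrt(q.shape[-1])            # (Nq, Nk) scaled dot-product
    thresh = np.sort(scores, axis=-1)[:, -keep][:, None]  # keep-th largest per row
    mask = scores >= thresh                            # dynamic gate: top-k regions
    scores = np.where(mask, scores, -np.inf)           # drop the rest
    return softmax(scores) @ v

def skip_layer(x, layer_fn, gate_w, tau=0.5):
    """Input-dependent layer skipping (illustrative only).

    A scalar sigmoid gate computed from input statistics decides whether
    to execute the layer or pass the input through unchanged.
    """
    g = 1.0 / (1.0 + np.exp(-(x.mean() * gate_w)))     # gate in (0, 1)
    return layer_fn(x) if g > tau else x               # run layer only if gate opens
```

In a trained model the gate parameters would be learned end-to-end (e.g. with a straight-through or Gumbel-softmax estimator so the hard top-k/skip decisions remain differentiable); here they are fixed constants for clarity.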
Authors
FANG Sikai, SUN Guangling, LU Xiaofeng, LIU Xuefeng (Shanghai University, Shanghai 200000, China)
Source
Electronics Optics & Control (《电光与控制》)
Indexed in: CSCD; Peking University Core Journals Index (北大核心)
2024, No. 2, pp. 52-57 (6 pages)
Funding
Natural Science Foundation of the Shanghai "Science and Technology Innovation Action Plan" (21511102605).