
An Efficient Indoor Localization Based on Deep Attention Learning Model

Abstract: Indoor localization methods can help many sectors, such as healthcare centers, smart homes, museums, warehouses, and retail malls, improve their service areas. As a result, it is crucial to look for low-cost methods that can provide exact localization in indoor locations. In this context, image-based localization methods can play an important role in estimating both the position and the orientation of a camera with respect to an object. Image-based localization faces many issues, such as image scale and rotation variance; accuracy and speed (latency) are also two critical factors. This paper proposes an efficient 6-DoF deep-learning model for image-based localization. The model incorporates a channel attention module and a Scale Pyramid Module (SPM), which not only enhances accuracy but also ensures real-time performance. In complex scenes, the channel attention module is employed to distinguish between the textures of the foreground and the background. The model adapts an SPM, a feature pyramid module, to deal with image scale and rotation variance. Furthermore, the proposed model employs two regressors (two fully connected layers), one for position and the other for orientation, which increases outcome accuracy. Experiments on standard indoor and outdoor datasets show that the proposed model has a significantly lower Mean Squared Error (MSE) for both position and orientation. On the indoor 7-Scenes dataset, the MSE is reduced to 0.19 m for position and 6.25° for orientation. On the outdoor Cambridge Landmarks dataset, the MSE is reduced to 0.63 m for position and 2.03° for orientation. According to these findings, the proposed approach is superior to and more successful than the baseline methods.
Source: Computer Systems Science & Engineering (SCIE, EI), 2023, Issue 8, pp. 2637-2650 (14 pages).
Funding: This work was funded by the Deanship of Scientific Research at Jouf University under grant No. DSR-2021-02-0379.
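As a rough illustration of the design described in the abstract, the following PyTorch sketch shows a channel attention block feeding pooled backbone features into two separate fully connected regressors, one for the 3-D position and one for a 4-D quaternion orientation. The squeeze-and-excitation style attention, layer sizes, and class names are assumptions made for illustration, not the authors' released implementation (the SPM and backbone are omitted).

```python
# Hedged sketch: channel attention over CNN features, then two fully
# connected regression heads for position (x, y, z) and orientation
# (unit quaternion). All dimensions are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed variant)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) -> per-channel weights in [0, 1]
        w = self.fc(x.mean(dim=(2, 3)))            # global average pooling
        return x * w.unsqueeze(-1).unsqueeze(-1)   # reweight channels


class PoseRegressor(nn.Module):
    """Two regression heads: 3-D position and 4-D quaternion orientation."""
    def __init__(self, in_channels: int = 512, hidden: int = 1024):
        super().__init__()
        self.attention = ChannelAttention(in_channels)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc_pos = nn.Sequential(nn.Linear(in_channels, hidden),
                                    nn.ReLU(inplace=True),
                                    nn.Linear(hidden, 3))
        self.fc_ori = nn.Sequential(nn.Linear(in_channels, hidden),
                                    nn.ReLU(inplace=True),
                                    nn.Linear(hidden, 4))

    def forward(self, features: torch.Tensor):
        f = self.pool(self.attention(features)).flatten(1)
        position = self.fc_pos(f)
        # Normalize so the orientation output is a valid unit quaternion.
        orientation = F.normalize(self.fc_ori(f), dim=1)
        return position, orientation


if __name__ == "__main__":
    # Fake backbone feature map: batch of 2, 512 channels, 7x7 spatial grid.
    feats = torch.randn(2, 512, 7, 7)
    pos, ori = PoseRegressor()(feats)
    print(pos.shape, ori.shape)  # torch.Size([2, 3]) torch.Size([2, 4])
```

Keeping the position and orientation branches separate, as the abstract describes, lets each head specialize on its own target rather than sharing a single regression layer for all six degrees of freedom.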