Funding: Supported by the Sejong Fellowship Program, South Korea (No. NRF-2022R1C1C2009014) and the Basic Research Program (No. NRF-2022R1A2C1005237) from the Korean National Research Fund.
Abstract: This paper addresses a cooperative relative navigation problem for multiple aerial agents, relying on visual tracking information between vehicles. The research aims to investigate a sensor fusion architecture and algorithm that leverages partially available absolute navigation knowledge while exploiting collaborative visual interaction between vehicles in mission flight areas where satellite navigation-denied regions are irregularly located. To achieve this, the paper introduces a new approach to defining the relative poses of cameras and develops a corresponding process to obtain the relative pose information. This contrasts with previous research, which simply linearized the relative pose information of aircraft cameras into navigation states defined in an absolute coordinate system. Specifically, the target pose in relative navigation is defined, and the poses of the camera and feature points are directly derived using the dual quaternion representation, which compactly encodes both translation and rotation. Furthermore, a mathematical model for the relative pose of the camera is derived through the dual quaternion framework, enabling an explicit pose formulation of relative navigation. The study investigates navigation performance in typical mission flight scenarios using an in-house high-fidelity simulator and quantitatively highlights the contributions of the proposed scheme by comparing navigation error performance. Consequently, the proposed method achieves decimeter-level navigation accuracy even in GNSS-denied environments, with a 3D Root Mean Square (RMS) error 30% smaller than that of the conventional absolute navigation framework.
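The abstract's central tool is the dual quaternion, which packs a rotation (real part) and a translation (dual part) into a single algebraic object so that pose composition becomes a single product. The sketch below is a minimal, generic illustration of that idea, not the paper's actual formulation: the function names and the (w, x, y, z) Hamilton convention are assumptions for illustration.

```python
# Minimal dual quaternion pose sketch (illustrative; not the paper's implementation).
# A pose is stored as (q_real, q_dual) with q_dual = 0.5 * t_quat * q_real,
# where t_quat = (0, tx, ty, tz) is the translation as a pure quaternion.
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    """Quaternion conjugate (negate the vector part)."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def dq_from_pose(q, t):
    """Build a dual quaternion from a unit rotation quaternion q and translation t."""
    tq = np.array([0.0, t[0], t[1], t[2]])
    return q, 0.5 * qmul(tq, q)

def dq_mul(dq_a, dq_b):
    """Compose poses: the result applies dq_b first, then dq_a."""
    r1, d1 = dq_a
    r2, d2 = dq_b
    return qmul(r1, r2), qmul(r1, d2) + qmul(d1, r2)

def dq_to_pose(dq):
    """Recover (rotation quaternion, translation vector) from a dual quaternion."""
    r, d = dq
    t = 2.0 * qmul(d, qconj(r))
    return r, t[1:]

# Composing two pure translations yields their sum, as expected.
q_id = np.array([1.0, 0.0, 0.0, 0.0])
dq1 = dq_from_pose(q_id, [1.0, 0.0, 0.0])
dq2 = dq_from_pose(q_id, [0.0, 2.0, 0.0])
rot, trans = dq_to_pose(dq_mul(dq1, dq2))
```

The appeal of this form for relative navigation is that the relative pose between two cameras is just the product of one vehicle's (inverse) pose dual quaternion with the other's, keeping translation and rotation coupled in one expression rather than linearizing them separately.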