Published online by Cambridge University Press: 04 June 2025
This paper focuses on feature-based visual-inertial odometry (VIO) in dynamic illumination environments. Because dynamic illumination destabilizes feature association and thereby degrades the performance of most existing feature-based VIO methods, we propose a tightly-coupled VIO algorithm, termed RAFT-VINS, which integrates a Lite-RAFT tracker into a visual-inertial navigation system (VINS). The key module of this odometry algorithm is a lightweight optical flow network designed for accurate feature tracking in real time. It provides robust feature association in dynamic illumination environments and thereby preserves the performance of the odometry. In addition, to further improve the accuracy of pose estimation, RAFT-VINS includes a moving consistency check strategy that identifies and removes outlier feature points. A tightly-coupled, optimization-based framework fuses IMU and visual measurements within a sliding window for efficient and accurate pose estimation. Comprehensive experiments on public datasets and in real-world scenarios validate that the proposed RAFT-VINS delivers reliable pose estimates in challenging dynamic illumination environments. Our code is open-sourced at https://github.com/USTC-AIS-Lab/RAFT-VINS.
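To make the track-then-reject idea concrete, the following is a minimal Python/OpenCV sketch, not the authors' implementation: a Farneback dense flow field stands in for the Lite-RAFT network, and a fundamental-matrix RANSAC test stands in for the paper's moving consistency check. File names, feature counts, and thresholds are illustrative placeholders.

```python
# Sketch (assumptions labeled): flow-based feature association between two
# frames followed by a geometric outlier rejection step. In RAFT-VINS the flow
# field would come from the learned Lite-RAFT tracker rather than Farneback,
# and surviving matches would feed the sliding-window IMU-visual optimizer.
import cv2
import numpy as np

def track_features(prev_gray, curr_gray, prev_pts):
    """Propagate feature points from the previous frame via a dense flow field."""
    # Stand-in flow; replace with the learned optical flow network's output.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    x = prev_pts[:, 0].astype(int).clip(0, flow.shape[1] - 1)
    y = prev_pts[:, 1].astype(int).clip(0, flow.shape[0] - 1)
    return prev_pts + flow[y, x]              # displaced feature locations

def consistency_check(prev_pts, curr_pts, ransac_thresh=1.0):
    """Flag matches that violate two-view epipolar geometry as outliers."""
    _, inlier_mask = cv2.findFundamentalMat(prev_pts, curr_pts,
                                            cv2.FM_RANSAC, ransac_thresh, 0.99)
    if inlier_mask is None:                   # degenerate geometry: keep nothing
        return np.zeros(len(prev_pts), dtype=bool)
    return inlier_mask.ravel().astype(bool)

# Usage with two placeholder frames: associate features, then keep inliers only.
prev_gray = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)
curr_gray = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01,
                                   minDistance=10).reshape(-1, 2)
curr_pts = track_features(prev_gray, curr_gray, prev_pts)
inliers = consistency_check(prev_pts, curr_pts)
prev_pts, curr_pts = prev_pts[inliers], curr_pts[inliers]
```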