A fast and accurate tightly coupled Light Detection and Ranging-inertial odometry based on sparse voxel maps and Gauss-Newton method

Published online by Cambridge University Press:  13 October 2025

Wei Xie
Affiliation:
Department of Automation Science and Engineering, South China University of Technology, Guangzhou, China
Zi Ding
Affiliation:
Department of Automation Science and Engineering, South China University of Technology, Guangzhou, China
Langwen Zhang*
Affiliation:
Department of Automation Science and Engineering, South China University of Technology, Guangzhou, China
Pengfei Lyu
Affiliation:
Department of Civil and Environmental Engineering, Kitami Institute of Technology, Kitami, Japan
*Corresponding author: Langwen Zhang; E-mail: aulwzhang@scut.edu.cn

Abstract

This work proposes an optimization approach for the time-consuming parts of Light Detection and Ranging (LiDAR) data processing and IMU-LiDAR data fusion in LiDAR-inertial odometry (LIO). Two key novelties enable faster and more accurate navigation in complex, noisy environments. First, to improve map update and point cloud registration efficiency, we employ a sparse voxel map with a new update function to construct a local map around the mobile robot, and we use an improved Generalized Iterative Closest Point algorithm based on sparse voxels for LiDAR point cloud association, boosting both map updating and computational speed. Second, to enhance real-time accuracy, this paper analyzes the residuals and covariances of both IMU and LiDAR data in a tightly coupled manner and estimates the system state by fusing sensor information through the Gauss-Newton method, effectively mitigating localization deviations by appropriately weighting the LiDAR covariances. The performance of our method is evaluated against state-of-the-art LIO algorithms on eight open datasets and five self-collected campus datasets. Results show a 24.7–60.1% reduction in average processing time per point cloud frame, along with improved robustness and more precise motion trajectory estimation in most cluttered and complex indoor and outdoor environments.
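To illustrate the kind of sparse voxel map the abstract describes, here is a minimal hash-based sketch: points are bucketed by integer voxel index, and nearest-neighbor candidates for registration are gathered from the query point's voxel and its 26 neighbors. The voxel size, per-voxel point cap, and all names below are illustrative assumptions, not the paper's actual data structure or update function.

```python
import numpy as np

def voxel_key(point, size):
    """Map a 3D point to its integer voxel index."""
    return tuple(np.floor(np.asarray(point, dtype=float) / size).astype(int))

class SparseVoxelMap:
    """Hash-based sparse voxel map: each occupied voxel stores a capped
    list of map points, so insertion and lookup avoid tree rebalancing."""

    def __init__(self, size=0.5, max_points=20):
        self.size = size            # voxel edge length (meters), assumed
        self.max_points = max_points  # density cap per voxel, assumed
        self.voxels = {}            # voxel index -> list of points

    def insert(self, points):
        """Add new scan points; full voxels simply skip further points."""
        for p in points:
            bucket = self.voxels.setdefault(voxel_key(p, self.size), [])
            if len(bucket) < self.max_points:
                bucket.append(np.asarray(p, dtype=float))

    def neighbors(self, point):
        """Collect map points from the query voxel and the 26 adjacent
        voxels, the candidate set for point cloud association."""
        cx, cy, cz = voxel_key(point, self.size)
        out = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    out.extend(self.voxels.get((cx + dx, cy + dy, cz + dz), []))
        return out
```

A constant-time hash lookup over a fixed neighborhood is what makes this kind of structure attractive against k-d trees for incremental mapping: updates never trigger a rebuild.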
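The tightly coupled state update the abstract mentions can be sketched as a single Gauss-Newton step over stacked, covariance-weighted residuals. In the sketch below, `J` and `r` stand for the stacked IMU and LiDAR Jacobians and residuals and `W` for a block-diagonal weight matrix built from inverse residual covariances; these are placeholders for exposition, not the paper's exact state parameterization or weighting.

```python
import numpy as np

def gauss_newton_step(J, r, W):
    """One Gauss-Newton update for a weighted least-squares problem:
    solve (J^T W J) dx = -J^T W r for the state increment dx.
    Down-weighting noisy LiDAR residuals via W is how covariance
    weighting mitigates localization deviations."""
    H = J.T @ W @ J   # Gauss-Newton approximation of the Hessian
    b = J.T @ W @ r   # weighted gradient term
    return np.linalg.solve(H, -b)
```

For linear residuals r(x) = Ax - y, a single step from x = 0 with W = I recovers the ordinary least-squares solution, which is a convenient sanity check for an implementation.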

Information

Type
Research Article
Copyright
© The Author(s), 2025. Published by Cambridge University Press
