
Minimum Sigma Set SR-UKF for Quadrifocal Tensor-based Binocular Stereo Vision-IMU Tightly-coupled System

Published online by Cambridge University Press: 13 June 2018

Maosong Wang* (National University of Defense Technology, China; The University of Calgary, Canada)
Wenqi Wu (National University of Defense Technology, China)
Naser El-Sheimy (The University of Calgary, Canada)
Zhiwen Xian (Taiyuan Satellite Launch Center, China)

Abstract

This paper presents a binocular vision-IMU (Inertial Measurement Unit) tightly-coupled structure based on a Minimum Sigma Set Square-Root Unscented Kalman Filter (M-SRUKF) for real-time navigation applications. Although the M-SRUKF uses only half the sigma points of the SR-UKF, it achieves the same accuracy as the SR-UKF when applied to the binocular vision-IMU tightly-coupled system. Because the filter propagates the square root of the state covariance rather than the covariance itself, the numerical stability of the system can be guaranteed. The measurement model and the outlier rejection model of the tightly-coupled system not only utilise the epipolar constraint and the trifocal tensor geometry constraint between two consecutive image pairs, but also exploit the quadrifocal tensor geometry among the four views. The system is formulated in error-state form, and the time updates of the state and the state covariance are computed directly, without the Unscented Transformation (UT). Experiments are carried out on an outdoor land vehicle open-source dataset and an indoor Micro Aerial Vehicle (MAV) open-source dataset, and the results clearly show the effectiveness of the proposed mechanisation.
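The computational saving described above comes from generating n + 1 sigma points instead of the 2n + 1 points of the standard unscented transform, while still matching the mean and covariance exactly. The Python sketch below is an illustration of that idea only, not the authors' exact minimum sigma set: it builds an equally weighted (n + 1)-point simplex set from a mean and a square-root covariance factor, and the function name and equal-weight parameterisation are assumptions made here for clarity.

```python
import numpy as np

def minimal_sigma_points(x_mean, S):
    """Illustrative (n+1)-point simplex sigma set (assumed construction,
    not the paper's): equally weighted points whose weighted mean is
    x_mean and whose weighted covariance is exactly P = S @ S.T."""
    n = x_mean.size
    # Columns of U are simplex vertices in the zero-sum hyperplane of
    # R^(n+1): the unit vectors minus their common centroid.
    U = np.eye(n + 1) - np.ones((n + 1, n + 1)) / (n + 1)
    # Orthonormal basis of that n-dimensional hyperplane (QR of the
    # first n columns of U, which span it).
    Q, _ = np.linalg.qr(U[:, :n])
    # n x (n+1) direction matrix: its columns sum to zero and satisfy
    # V @ V.T = (n+1) * I, so equal weights 1/(n+1) reproduce P.
    V = np.sqrt(n + 1) * (Q.T @ U)
    weights = np.full(n + 1, 1.0 / (n + 1))
    sigma = x_mean[:, None] + S @ V  # one sigma point per column
    return sigma, weights

# Quick check that the first two moments are reproduced exactly.
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
P = A @ A.T                           # a valid covariance
S = np.linalg.cholesky(P)             # its square-root factor
x = rng.standard_normal(n)
sigma, w = minimal_sigma_points(x, S)
assert np.allclose(sigma @ w, x)      # weighted mean == x
d = sigma - x[:, None]
assert np.allclose((d * w) @ d.T, P)  # weighted covariance == P
```

In a square-root filter such as the SR-UKF, S is the Cholesky factor the filter already propagates, so drawing the sigma points requires no re-factorisation of the covariance; this is what makes the square-root flow both cheap and numerically stable.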

Type: Research Article
Copyright: © The Royal Institute of Navigation 2018

