
HCCNet: Hybrid Coupled Cooperative Network for Robust Indoor Localization

Published: 08 July 2024

Abstract

Accurate localization of unmanned aerial vehicles (UAVs) is critical for navigation in GPS-denied regions and remains a challenging research problem. This article presents a multi-sensor hybrid coupled cooperative localization network (HCCNet) that fuses a camera, ultra-wideband (UWB) ranging, and an inertial measurement unit (IMU) to address this challenge. The camera and IMU estimate the pose of the UAV from perception of the surrounding environment and their own measurements. An onboard UWB node and a UWB wireless sensor network (WSN) deployed in the indoor environment jointly determine the global position of the UAV, and the proposed dynamic random sample consensus (D-RANSAC) algorithm improves the accuracy of the UWB localization. To fully exploit the UWB localization results, HCCNet combines the local pose estimates of a visual-inertial odometry (VIO) system with global constraints derived from the UWB localization. Experimental results show that the proposed D-RANSAC algorithm achieves better accuracy than other UWB-based algorithms, and the effectiveness of HCCNet is further verified with a real-world mobile robot and simulation experiments in indoor environments.
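To make the RANSAC step in the abstract concrete, the following is a minimal, hypothetical Python sketch of classic RANSAC-style outlier rejection over UWB anchor ranges followed by least-squares trilateration. It is not the paper's D-RANSAC: the anchor geometry, the 0.3 m inlier threshold, and the helper names (`trilaterate`, `ransac_uwb_position`) are illustrative assumptions.

```python
# Hypothetical sketch only (not the paper's D-RANSAC): RANSAC-style outlier
# rejection over UWB anchor ranges, followed by least-squares trilateration.
# Anchor layout, the inlier threshold, and the function names are assumptions.
import numpy as np


def trilaterate(anchors, ranges):
    """Least-squares position from anchor coordinates (N x 3) and ranges (N,)."""
    # Subtracting the first sphere equation from the others linearizes
    # ||x - a_i||^2 = r_i^2 into A x = b.
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0 ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


def ransac_uwb_position(anchors, ranges, n_iter=200, inlier_thresh=0.3, seed=None):
    """Estimate a 3D position while rejecting NLOS-corrupted ranges as outliers."""
    anchors, ranges = np.asarray(anchors, float), np.asarray(ranges, float)
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        sample = rng.choice(len(ranges), size=4, replace=False)  # minimal set
        candidate = trilaterate(anchors[sample], ranges[sample])
        residuals = np.abs(np.linalg.norm(anchors - candidate, axis=1) - ranges)
        inliers = residuals < inlier_thresh  # metres
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on all inliers for the final estimate.
    return trilaterate(anchors[best_inliers], ranges[best_inliers]), best_inliers
```

How the threshold is adapted dynamically in D-RANSAC, and how the refined UWB positions are then fused with the VIO estimate as global constraints, are specific to the article and not reproduced here.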

    Published In

    ACM Transactions on Sensor Networks, Volume 20, Issue 4
    July 2024
    603 pages
    EISSN: 1550-4867
    DOI: 10.1145/3618082
    Editor: Wen Hu

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 08 July 2024
    Online AM: 27 May 2024
    Accepted: 29 April 2024
    Revised: 28 March 2024
    Received: 27 August 2023
    Published in TOSN Volume 20, Issue 4

    Author Tags

    1. Indoor localization
    2. VIO
    3. UWB
    4. dynamic RANSAC
    5. hybrid coupling
    6. multi-sensor network

    Qualifiers

    • Research-article

    Funding Sources

    • National Science Foundation of China
    • National Key Research and Development Program of China
