Title: Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization
Authors: Nguyen, Thien Hoang
Nguyen, Thien-Minh
Xie, Lihua
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2021
Source: Nguyen, T. H., Nguyen, T.-M., & Xie, L. (2021). Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization. IEEE Robotics and Automation Letters, 7(2), 928-935.
Journal: IEEE Robotics and Automation Letters
Abstract: In multi-robot systems, two important research problems are relative localization between the robots and global localization of all robots in a common frame. Traditional methods rely on detecting inter- and intra-robot loop closures, which can be operationally restrictive since the robots must traverse loops. Ultra-wideband (UWB) sensors, which provide direct distance measurements and robot IDs, can replace loop closures in many applications. However, existing research on UWB-aided multi-robot state estimation often ignores odometry drift, which leads to inaccurate global position estimates in the long run. In this work, we present a UWB-aided multi-robot localization system that does not rely on loop closures (flexible) and only requires odometry data from neighbors (resource-efficient). We propose a two-stage approach: 1) over a long sliding window, the relative transformation is refined based on range and odometry data; 2) onboard visual-inertial-range data are tightly fused in a short-term sliding window to provide more accurate local and global estimates. Simulation and real-life experiments with two quadrotors show that the system as a whole outperforms previous approaches as well as its individual parts.
ISSN: 2377-3766
DOI: 10.1109/LRA.2021.3136286
Rights: © 2021 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.