Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

Aaronkumar Ehambram, Raphael Voges, Claus Brenner, Bernardo Wagner

Details

Original language: English
Title of host publication: 2022 IEEE International Conference on Robotics and Automation, ICRA 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 7589-7596
Number of pages: 8
ISBN (electronic): 9781728196817
Publication status: Published - 2022
Event: 39th IEEE International Conference on Robotics and Automation, ICRA 2022 - Philadelphia, United States
Duration: 23 May 2022 - 27 May 2022

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Abstract

We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that solely assumes sensor errors to be bounded and propagates the error from the input sources to the estimated map and trajectory using interval analysis. The method allows us to restrict the solution set of the robot poses and the position of the landmarks to the set that is consistent with the measurements. If the error limits are not violated, it is guaranteed that the estimated set contains the true solution. The accumulation of the uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. In this work, we experimentally show that the marginal covariances computed by the classical SLAM graph optimization are overconfident and underestimate the uncertainty of the poses. While the 99.9 %-ellipsoids derived from the marginal covariances of the poses only enclose less than 64 % of the ground truth in the worst case, our method provides interval bounds for the pose parameters that enclose the ground truth for more than 96 % of all frames.
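The core idea of the abstract, namely that bounded sensor errors propagated with interval arithmetic yield a set guaranteed to contain the true pose, and that intersecting with an anchoring pose stabilizes the growing uncertainty, can be illustrated with a minimal 1-D sketch. This is purely illustrative: the `Interval` class, the 1-D odometry model, and the error bounds are assumptions for exposition, not the paper's implementation.

```python
# Minimal 1-D sketch of bounded-error (interval) propagation and anchoring.
# Assumption: each odometry step measures displacement with an error in
# [-0.05, 0.05]; the anchor models a GNSS/INS pose with bounded error.
from dataclasses import dataclass


@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Interval addition: the sum of any two members lies in the result.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __and__(self, other):
        # Intersection: contraction of the pose set by an anchoring pose.
        return Interval(max(self.lo, other.lo), min(self.hi, other.hi))

    def contains(self, x):
        return self.lo <= x <= self.hi

    def width(self):
        return self.hi - self.lo


pose = Interval(0.0, 0.0)
measurements = [1.02, 0.98, 1.01]  # measured steps; true step is 1.0 each
for z in measurements:
    # Enclose the true displacement by inflating the measurement by its bound.
    pose = pose + Interval(z - 0.05, z + 0.05)

# Uncertainty accumulates: the interval width grew to 0.30 after three steps,
# yet the true pose (3.0) is guaranteed to be enclosed.
print(pose, pose.contains(3.0))

# Anchoring: intersecting with a bounded GNSS/INS pose shrinks the set.
anchor = Interval(2.95, 3.05)
pose = pose & anchor
print(pose, pose.width())
```

Unlike a Gaussian covariance, which can become overconfident, the interval estimate remains a guaranteed enclosure as long as the error bounds hold, which is exactly the property the experiments in the paper quantify.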


Cite this

Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses. / Ehambram, Aaronkumar; Voges, Raphael; Brenner, Claus et al.
2022 IEEE International Conference on Robotics and Automation, ICRA 2022. Institute of Electrical and Electronics Engineers Inc., 2022. p. 7589-7596 (Proceedings - IEEE International Conference on Robotics and Automation).


Ehambram, A, Voges, R, Brenner, C & Wagner, B 2022, Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses. in 2022 IEEE International Conference on Robotics and Automation, ICRA 2022. Proceedings - IEEE International Conference on Robotics and Automation, Institute of Electrical and Electronics Engineers Inc., pp. 7589-7596, 39th IEEE International Conference on Robotics and Automation, ICRA 2022, Philadelphia, United States, 23 May 2022. https://doi.org/10.1109/ICRA46639.2022.9812425
Ehambram, A., Voges, R., Brenner, C., & Wagner, B. (2022). Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses. In 2022 IEEE International Conference on Robotics and Automation, ICRA 2022 (pp. 7589-7596). (Proceedings - IEEE International Conference on Robotics and Automation). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICRA46639.2022.9812425
Ehambram A, Voges R, Brenner C, Wagner B. Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses. In 2022 IEEE International Conference on Robotics and Automation, ICRA 2022. Institute of Electrical and Electronics Engineers Inc. 2022. p. 7589-7596. (Proceedings - IEEE International Conference on Robotics and Automation). doi: 10.1109/ICRA46639.2022.9812425
Ehambram, Aaronkumar ; Voges, Raphael ; Brenner, Claus et al. / Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses. 2022 IEEE International Conference on Robotics and Automation, ICRA 2022. Institute of Electrical and Electronics Engineers Inc., 2022. pp. 7589-7596 (Proceedings - IEEE International Conference on Robotics and Automation).
@inproceedings{f96e1fc685844c6fba09e7e460fcae6a,
title = "Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses",
abstract = "We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that solely assumes sensor errors to be bounded and propagates the error from the input sources to the estimated map and trajectory using interval analysis. The method allows us to restrict the solution set of the robot poses and the position of the landmarks to the set that is consistent with the measurements. If the error limits are not violated, it is guaranteed that the estimated set contains the true solution. The accumulation of the uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. In this work, we experimentally show that the marginal covariances computed by the classical SLAM graph optimization are overconfident and underestimate the uncertainty of the poses. While the 99.9 %-ellipsoids derived from the marginal covariances of the poses only enclose less than 64 % of the ground truth in the worst case, our method provides interval bounds for the pose parameters that enclose the ground truth for more than 96 % of all frames.",
author = "Aaronkumar Ehambram and Raphael Voges and Claus Brenner and Bernardo Wagner",
note = "Funding Information: This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159].; 39th IEEE International Conference on Robotics and Automation, ICRA 2022 ; Conference date: 23-05-2022 Through 27-05-2022",
year = "2022",
doi = "10.1109/ICRA46639.2022.9812425",
language = "English",
series = "Proceedings - IEEE International Conference on Robotics and Automation",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "7589--7596",
booktitle = "2022 IEEE International Conference on Robotics and Automation, ICRA 2022",
address = "United States",
isbn = "9781728196817",
}


TY - GEN

T1 - Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses

AU - Ehambram, Aaronkumar

AU - Voges, Raphael

AU - Brenner, Claus

AU - Wagner, Bernardo

N1 - Funding Information: This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159].

PY - 2022

Y1 - 2022

N2 - We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that solely assumes sensor errors to be bounded and propagates the error from the input sources to the estimated map and trajectory using interval analysis. The method allows us to restrict the solution set of the robot poses and the position of the landmarks to the set that is consistent with the measurements. If the error limits are not violated, it is guaranteed that the estimated set contains the true solution. The accumulation of the uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. In this work, we experimentally show that the marginal covariances computed by the classical SLAM graph optimization are overconfident and underestimate the uncertainty of the poses. While the 99.9 %-ellipsoids derived from the marginal covariances of the poses only enclose less than 64 % of the ground truth in the worst case, our method provides interval bounds for the pose parameters that enclose the ground truth for more than 96 % of all frames.

AB - We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that solely assumes sensor errors to be bounded and propagates the error from the input sources to the estimated map and trajectory using interval analysis. The method allows us to restrict the solution set of the robot poses and the position of the landmarks to the set that is consistent with the measurements. If the error limits are not violated, it is guaranteed that the estimated set contains the true solution. The accumulation of the uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. In this work, we experimentally show that the marginal covariances computed by the classical SLAM graph optimization are overconfident and underestimate the uncertainty of the poses. While the 99.9 %-ellipsoids derived from the marginal covariances of the poses only enclose less than 64 % of the ground truth in the worst case, our method provides interval bounds for the pose parameters that enclose the ground truth for more than 96 % of all frames.

UR - http://www.scopus.com/inward/record.url?scp=85132962430&partnerID=8YFLogxK

U2 - 10.1109/ICRA46639.2022.9812425

DO - 10.1109/ICRA46639.2022.9812425

M3 - Conference contribution

AN - SCOPUS:85132962430

T3 - Proceedings - IEEE International Conference on Robotics and Automation

SP - 7589

EP - 7596

BT - 2022 IEEE International Conference on Robotics and Automation, ICRA 2022

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 39th IEEE International Conference on Robotics and Automation, ICRA 2022

Y2 - 23 May 2022 through 27 May 2022

ER -
