Details
| Original language | English |
| --- | --- |
| Title of host publication | 2022 IEEE International Conference on Robotics and Automation, ICRA 2022 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 7589-7596 |
| Number of pages | 8 |
| ISBN (electronic) | 9781728196817 |
| Publication status | Published - 2022 |
| Event | 39th IEEE International Conference on Robotics and Automation, ICRA 2022 - Philadelphia, United States. Duration: 23 May 2022 → 27 May 2022 |
Publication series
| Name | Proceedings - IEEE International Conference on Robotics and Automation |
| --- | --- |
| ISSN (Print) | 1050-4729 |
Abstract
We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that assumes only that sensor errors are bounded and propagates these errors from the input sources to the estimated map and trajectory using interval analysis. The method restricts the solution set of the robot poses and landmark positions to the set that is consistent with the measurements. As long as the error bounds are not violated, the estimated set is guaranteed to contain the true solution. The accumulation of uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time, we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. We show experimentally that the marginal covariances computed by classical SLAM graph optimization are overconfident and underestimate the pose uncertainty: while the 99.9% ellipsoids derived from the marginal covariances enclose less than 64% of the ground truth poses in the worst case, our method provides interval bounds on the pose parameters that enclose the ground truth for more than 96% of all frames.
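To make the bounded-error idea concrete, here is a minimal, self-contained sketch of interval propagation with an anchoring constraint. It is purely illustrative and is not the paper's i-VIL SLAM pipeline: the 1-D dead-reckoning setup, the error bounds, and all numbers are invented for demonstration. The sketch shows the two properties the abstract relies on: if every measurement error stays within its stated bound, the propagated interval is guaranteed to contain the true position, and intersecting it with a bounded anchor (standing in here for a GNSS/INS-derived pose) contracts the accumulated uncertainty.

```python
# Illustrative sketch only (not the paper's i-VIL SLAM): 1-D interval
# dead reckoning with a bounded "anchor" measurement. All numbers are
# invented. Intervals are (lower, upper) tuples.

def i_add(a, b):
    """Interval addition [a] + [b]."""
    return (a[0] + b[0], a[1] + b[1])

def i_intersect(a, b):
    """Intersection of [a] and [b]; raises if the bounds are inconsistent."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: an error bound was violated")
    return (lo, hi)

# Start position known up to +/- 0.05 m.
x = (-0.05, 0.05)

# Odometry increments: measured value +/- a bounded error of 0.1 m each.
odometry = [1.0, 0.9, 1.1, 1.0]
for dx in odometry:
    x = i_add(x, (dx - 0.1, dx + 0.1))   # the interval grows at every step

print("after dead reckoning:", x)        # (3.55, 4.45): width has grown to 0.9 m

# A bounded anchor constrains the same position to 4.02 m +/- 0.2 m.
anchor = (4.02 - 0.2, 4.02 + 0.2)
x = i_intersect(x, anchor)               # contraction stabilizes the growth

print("after anchoring:", x)             # (3.82, 4.22): width reduced to 0.4 m
```

In the full problem the same guarantee presumably carries over to the 6-DoF poses and landmark positions via interval constraint propagation rather than plain addition; this toy example only illustrates why bounded errors yield guaranteed enclosures and why anchoring poses keep the enclosures from growing without bound.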
ASJC Scopus subject areas
- Computer Science (all)
  - Software
  - Artificial Intelligence
- Engineering (all)
  - Control and Systems Engineering
  - Electrical and Electronic Engineering
Cite this
Ehambram, A., Voges, R., Brenner, C., & Wagner, B. (2022). Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses. In 2022 IEEE International Conference on Robotics and Automation, ICRA 2022 (pp. 7589-7596). Institute of Electrical and Electronics Engineers Inc. (Proceedings - IEEE International Conference on Robotics and Automation). https://doi.org/10.1109/ICRA46639.2022.9812425
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Interval-based Visual-Inertial LiDAR SLAM with Anchoring Poses
AU - Ehambram, Aaronkumar
AU - Voges, Raphael
AU - Brenner, Claus
AU - Wagner, Bernardo
N1 - Funding Information: This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159].
PY - 2022
Y1 - 2022
N2 - We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that assumes only that sensor errors are bounded and propagates these errors from the input sources to the estimated map and trajectory using interval analysis. The method restricts the solution set of the robot poses and landmark positions to the set that is consistent with the measurements. As long as the error bounds are not violated, the estimated set is guaranteed to contain the true solution. The accumulation of uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time, we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. We show experimentally that the marginal covariances computed by classical SLAM graph optimization are overconfident and underestimate the pose uncertainty: while the 99.9% ellipsoids derived from the marginal covariances enclose less than 64% of the ground truth poses in the worst case, our method provides interval bounds on the pose parameters that enclose the ground truth for more than 96% of all frames.
AB - We present a novel interval-based visual-inertial LiDAR SLAM (i-VIL SLAM) method that assumes only that sensor errors are bounded and propagates these errors from the input sources to the estimated map and trajectory using interval analysis. The method restricts the solution set of the robot poses and landmark positions to the set that is consistent with the measurements. As long as the error bounds are not violated, the estimated set is guaranteed to contain the true solution. The accumulation of uncertainty is stabilized by anchoring poses derived from GNSS/INS data. Furthermore, for the first time, we compare confidence ellipses determined by a classical SLAM graph optimization approach with the interval estimates of the robot poses provided by our method. We show experimentally that the marginal covariances computed by classical SLAM graph optimization are overconfident and underestimate the pose uncertainty: while the 99.9% ellipsoids derived from the marginal covariances enclose less than 64% of the ground truth poses in the worst case, our method provides interval bounds on the pose parameters that enclose the ground truth for more than 96% of all frames.
UR - http://www.scopus.com/inward/record.url?scp=85132962430&partnerID=8YFLogxK
U2 - 10.1109/ICRA46639.2022.9812425
DO - 10.1109/ICRA46639.2022.9812425
M3 - Conference contribution
AN - SCOPUS:85132962430
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 7589
EP - 7596
BT - 2022 IEEE International Conference on Robotics and Automation, ICRA 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 39th IEEE International Conference on Robotics and Automation, ICRA 2022
Y2 - 23 May 2022 through 27 May 2022
ER -