Interval-Based Visual-LiDAR Sensor Fusion

Publication: Contribution to journal › Article › Research › Peer review

Authors

Raphael Voges, Bernardo Wagner

Organizational units

Real Time Systems Group (RTS), Institute of Systems Engineering, Leibniz Universität Hannover


Details

Original language: English
Article number: 9349119
Pages (from-to): 1304-1311
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 6
Issue number: 2
Early online date: 5 Feb 2021
Publication status: Published - April 2021

Abstract

Since cameras and Light Detection and Ranging (LiDAR) sensors provide complementary information about the environment, it is beneficial for mobile robot localization to fuse their information by assigning distances measured by the LiDAR to visual features detected in the image. However, existing approaches neglect the uncertainty of the fused information or model it in an optimistic way (e.g., without taking extrinsic calibration errors into account). Since the actual distribution of errors during sensor fusion is often unknown, we assume that we know only bounds (or intervals) enclosing the errors. Consequently, we propose to use interval analysis to propagate the error from the input sources to the fused information in a straightforward way. To show the applicability of our approach, we use the fused information for dead reckoning. Since interval analysis is used, the results of our approach are intervals that are guaranteed to enclose the robot's true pose. An evaluation using real data shows that we are indeed able to localize the robot in a guaranteed way. This enables us to detect faults of an established approach, which neglects the uncertainty of the fused information, in three out of ten cases.
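To illustrate the core idea, the following minimal Python sketch (not the authors' implementation; all error bounds and numbers are illustrative assumptions) shows how interval arithmetic propagates a bounded LiDAR range error and bounded extrinsic calibration errors to the depth assigned to a visual feature, so that the resulting interval is guaranteed to contain the true value:

class Interval:
    """Closed interval [lo, hi] with the arithmetic needed below."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # [a, b] * [c, d] encloses all products of the endpoints.
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo:.3f}, {self.hi:.3f}]"

# LiDAR range of 10 m with an assumed +/- 2 cm sensor error bound.
lidar_range = Interval(9.98, 10.02)

# Cosine of the angle between the LiDAR beam and the camera ray,
# widened to account for a bounded rotational calibration error.
cos_angle = Interval(0.998, 1.0)

# Translational extrinsic offset between LiDAR and camera along the
# ray, known only up to +/- 1 cm (the error an optimistic model
# would neglect).
extrinsic_offset = Interval(0.04, 0.06)

# Depth assigned to the visual feature: every bounded error
# propagates, so the result is guaranteed to contain the true depth.
feature_depth = lidar_range * cos_angle + extrinsic_offset
print(feature_depth)  # [10.000, 10.080]

Applying the same propagation to each pose increment during dead reckoning yields interval boxes that, by construction, enclose the robot's true pose.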

ASJC Scopus subject areas

Cite

Interval-Based Visual-LiDAR Sensor Fusion. / Voges, Raphael; Wagner, Bernardo.
In: IEEE Robotics and Automation Letters, Vol. 6, No. 2, 9349119, 04.2021, pp. 1304-1311.


Voges, R & Wagner, B 2021, 'Interval-Based Visual-LiDAR Sensor Fusion', IEEE Robotics and Automation Letters, vol. 6, no. 2, 9349119, pp. 1304-1311. https://doi.org/10.1109/lra.2021.3057572
Voges, R., & Wagner, B. (2021). Interval-Based Visual-LiDAR Sensor Fusion. IEEE Robotics and Automation Letters, 6(2), 1304-1311. Article 9349119. https://doi.org/10.1109/lra.2021.3057572
Voges R, Wagner B. Interval-Based Visual-LiDAR Sensor Fusion. IEEE Robotics and Automation Letters. 2021 Apr;6(2):1304-1311. 9349119. Epub 2021 Feb 5. doi: 10.1109/lra.2021.3057572
Voges, Raphael; Wagner, Bernardo. / Interval-Based Visual-LiDAR Sensor Fusion. In: IEEE Robotics and Automation Letters. 2021; Vol. 6, No. 2. pp. 1304-1311.
BibTeX
@article{d53d00e9ed644d1cbe876c447e0813a8,
title = "Interval-Based Visual-LiDAR Sensor Fusion",
abstract = "Since cameras and Light Detection and Ranging (LiDAR) sensors provide complementary information about the environment, it is beneficial for mobile robot localization to fuse their information by assigning distances measured by the LiDAR to visual features detected in the image. However, existing approaches neglect the uncertainty of the fused information or model it in an optimistic way (e.g., without taking extrinsic calibration errors into account). Since the actual distribution of errors during sensor fusion is often unknown, we assume that we know only bounds (or intervals) enclosing the errors. Consequently, we propose to use interval analysis to propagate the error from the input sources to the fused information in a straightforward way. To show the applicability of our approach, we use the fused information for dead reckoning. Since interval analysis is used, the results of our approach are intervals that are guaranteed to enclose the robot's true pose. An evaluation using real data shows that we are indeed able to localize the robot in a guaranteed way. This enables us to detect faults of an established approach, which neglects the uncertainty of the fused information, in three out of ten cases.",
keywords = "Sensor fusion, formal methods in robotics and automation, interval analysis, localization",
author = "Raphael Voges and Bernardo Wagner",
note = "Funding Information: Manuscript received October 14, 2020; accepted January 21, 2021. Date of publication February 5, 2021; date of current version February 22, 2021. This letter was recommended for publication by Associate Editor P. Vasseur and Editor E. Marchand upon evaluation of the reviewers{\textquoteright} comments. This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159]. (Corresponding author: Raphael Voges.) The authors are with the Real Time Systems Group (RTS), Institute of Systems Engineering, Leibniz Universit{\"a}t Hannover, D-30167 Hannover, Germany (e-mail: voges@rts.uni-hannover.de; wagner@rts.uni-hannover.de).",
year = "2021",
month = apr,
doi = "10.1109/lra.2021.3057572",
language = "English",
volume = "6",
pages = "1304--1311",
number = "2",
journal = "IEEE Robotics and Automation Letters",
}

RIS

TY - JOUR

T1 - Interval-Based Visual-LiDAR Sensor Fusion

AU - Voges, Raphael

AU - Wagner, Bernardo

N1 - Funding Information: Manuscript received October 14, 2020; accepted January 21, 2021. Date of publication February 5, 2021; date of current version February 22, 2021. This letter was recommended for publication by Associate Editor P. Vasseur and Editor E. Marchand upon evaluation of the reviewers’ comments. This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159]. (Corresponding author: Raphael Voges.) The authors are with the Real Time Systems Group (RTS), Institute of Systems Engineering, Leibniz Universität Hannover, D-30167 Hannover, Germany (e-mail: voges@rts.uni-hannover.de; wagner@rts.uni-hannover.de).

PY - 2021/4

Y1 - 2021/4

N2 - Since cameras and Light Detection and Ranging (LiDAR) sensors provide complementary information about the environment, it is beneficial for mobile robot localization to fuse their information by assigning distances measured by the LiDAR to visual features detected in the image. However, existing approaches neglect the uncertainty of the fused information or model it in an optimistic way (e.g., without taking extrinsic calibration errors into account). Since the actual distribution of errors during sensor fusion is often unknown, we assume that we know only bounds (or intervals) enclosing the errors. Consequently, we propose to use interval analysis to propagate the error from the input sources to the fused information in a straightforward way. To show the applicability of our approach, we use the fused information for dead reckoning. Since interval analysis is used, the results of our approach are intervals that are guaranteed to enclose the robot's true pose. An evaluation using real data shows that we are indeed able to localize the robot in a guaranteed way. This enables us to detect faults of an established approach, which neglects the uncertainty of the fused information, in three out of ten cases.

AB - Since cameras and Light Detection and Ranging (LiDAR) sensors provide complementary information about the environment, it is beneficial for mobile robot localization to fuse their information by assigning distances measured by the LiDAR to visual features detected in the image. However, existing approaches neglect the uncertainty of the fused information or model it in an optimistic way (e.g., without taking extrinsic calibration errors into account). Since the actual distribution of errors during sensor fusion is often unknown, we assume that we know only bounds (or intervals) enclosing the errors. Consequently, we propose to use interval analysis to propagate the error from the input sources to the fused information in a straightforward way. To show the applicability of our approach, we use the fused information for dead reckoning. Since interval analysis is used, the results of our approach are intervals that are guaranteed to enclose the robot's true pose. An evaluation using real data shows that we are indeed able to localize the robot in a guaranteed way. This enables us to detect faults of an established approach, which neglects the uncertainty of the fused information, in three out of ten cases.

KW - Sensor fusion

KW - formal methods in robotics and automation

KW - interval analysis

KW - localization

UR - http://www.scopus.com/inward/record.url?scp=85100848343&partnerID=8YFLogxK

U2 - 10.1109/lra.2021.3057572

DO - 10.1109/lra.2021.3057572

M3 - Article

VL - 6

SP - 1304

EP - 1311

JO - IEEE Robotics and Automation Letters

JF - IEEE Robotics and Automation Letters

SN - 2377-3766

IS - 2

M1 - 9349119

ER -
