Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

Aaronkumar Ehambram, Raphael Voges, Bernardo Wagner


Details

Original language: English
Title of host publication: 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)
Pages: 1132-1139
Number of pages: 8
ISBN (electronic): 9781665418737
Publication status: Published - 2021
Event: 17th International Conference on Automation Science and Engineering (CASE) - Lyon, France
Duration: 23 Aug 2021 to 27 Aug 2021
Conference number: 17

Publication series

Name: IEEE International Conference on Automation Science and Engineering
Volume: 2021-August
ISSN (print): 2161-8070
ISSN (electronic): 2161-8089

Abstract

Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.
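The abstract does not give implementation details, but the core idea of set-membership fusion can be illustrated with plain interval arithmetic: if both sensors' intervals are guaranteed to contain the true value, their intersection is a tighter guaranteed enclosure, and bounded relative motion propagates through interval addition. The following sketch is purely illustrative; the interval bounds, sensor names, and one-dimensional setup are assumptions, not taken from the paper.

```python
# Illustrative set-membership fusion of two interval measurements.
# All numeric bounds here are made up for demonstration.

def intersect(a, b):
    """Fuse two guaranteed enclosures [lo, hi] by intersection.
    If both intervals contain the true value, so does the result."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("inconsistent measurements: empty intersection")
    return (lo, hi)

def add(a, b):
    """Propagate bounds through a sum (interval addition)."""
    return (a[0] + b[0], a[1] + b[1])

# LiDAR: accurate in depth -> tight interval (metres).
lidar_depth = (10.02, 10.08)
# Stereo reconstruction: depth error grows with range -> wide interval.
stereo_depth = (9.6, 10.4)

fused = intersect(lidar_depth, stereo_depth)   # (10.02, 10.08)

# Dead reckoning: add a bounded relative motion between frames,
# yielding guaranteed bounds on the new depth.
motion = (0.95, 1.05)
pose_depth = add(fused, motion)
print(fused, pose_depth)
```

Here the tight LiDAR interval dominates the fused result, mirroring the complementary error characteristics the abstract describes; in the paper this principle is applied to full calibrated 3-D measurements rather than a scalar depth.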

ASJC Scopus subject areas

Cite

Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. / Ehambram, Aaronkumar; Voges, Raphael; Wagner, Bernardo.
2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). 2021. pp. 1132-1139 (IEEE International Conference on Automation Science and Engineering; Vol. 2021-August).


Ehambram, A, Voges, R & Wagner, B 2021, Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. in 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). IEEE International Conference on Automation Science and Engineering, vol. 2021-August, pp. 1132-1139, 17th International Conference on Automation Science and Engineering (CASE), Lyon, France, 23 Aug. 2021. https://doi.org/10.1109/case49439.2021.9551516
Ehambram, A., Voges, R., & Wagner, B. (2021). Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE) (pp. 1132-1139). (IEEE International Conference on Automation Science and Engineering; Vol. 2021-August). https://doi.org/10.1109/case49439.2021.9551516
Ehambram A, Voges R, Wagner B. Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). 2021. p. 1132-1139. (IEEE International Conference on Automation Science and Engineering). doi: 10.1109/case49439.2021.9551516
Ehambram, Aaronkumar ; Voges, Raphael ; Wagner, Bernardo. / Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). 2021. pp. 1132-1139 (IEEE International Conference on Automation Science and Engineering).
BibTeX
@inproceedings{d3bf616f558c4c4e8e31262c2cc42b4a,
title = "Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods",
abstract = "Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.",
author = "Aaronkumar Ehambram and Raphael Voges and Bernardo Wagner",
note = "Funding Information: This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159].; 17th International Conference on Automation Science and Engineering (CASE) ; Conference date: 23-08-2021 Through 27-08-2021",
year = "2021",
doi = "10.1109/case49439.2021.9551516",
language = "English",
isbn = "978-1-6654-4809-3",
series = "IEEE International Conference on Automation Science and Engineering",
pages = "1132--1139",
booktitle = "2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)",

}

RIS

TY - GEN

T1 - Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods

AU - Ehambram, Aaronkumar

AU - Voges, Raphael

AU - Wagner, Bernardo

N1 - Conference code: 17

PY - 2021

Y1 - 2021

N2 - Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.

AB - Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.

UR - http://www.scopus.com/inward/record.url?scp=85117044055&partnerID=8YFLogxK

U2 - 10.1109/case49439.2021.9551516

DO - 10.1109/case49439.2021.9551516

M3 - Conference contribution

SN - 978-1-6654-4809-3

T3 - IEEE International Conference on Automation Science and Engineering

SP - 1132

EP - 1139

BT - 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)

T2 - 17th International Conference on Automation Science and Engineering (CASE)

Y2 - 23 August 2021 through 27 August 2021

ER -
