Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors: Aaronkumar Ehambram, Raphael Voges, Bernardo Wagner

Details

Original language: English
Title of host publication: 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)
Pages: 1132-1139
Number of pages: 8
ISBN (electronic): 9781665418737
Publication status: Published - 2021
Event: 17th International Conference on Automation Science and Engineering (CASE) - Lyon, France
Duration: 23 Aug 2021 – 27 Aug 2021
Conference number: 17

Publication series

Name: IEEE International Conference on Automation Science and Engineering
Volume: 2021-August
ISSN (Print): 2161-8070
ISSN (electronic): 2161-8089

Abstract

Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.
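The core idea of the abstract — propagating bounded (interval) measurement and calibration errors to derived quantities so that the true value is guaranteed to be enclosed — can be illustrated with a minimal interval-arithmetic sketch. This is not the authors' implementation; all numeric values, and the use of the pinhole stereo depth formula z = f·b/d as the propagated quantity, are illustrative assumptions.

```python
# Minimal sketch of set-membership (interval) uncertainty propagation.
# NOT the authors' code; the stereo depth formula z = f * b / d and all
# numbers below are illustrative assumptions only.

class Interval:
    """A closed real interval [lo, hi] with basic interval arithmetic."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __mul__(self, other):
        # Enclosure of the product: extremes occur at endpoint combinations.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other):
        assert other.lo > 0 or other.hi < 0, "divisor must not contain 0"
        return self * Interval(1.0 / other.hi, 1.0 / other.lo)

    def contains(self, x):
        return self.lo <= x <= self.hi

    def __repr__(self):
        return f"[{self.lo:.3f}, {self.hi:.3f}]"


# Hypothetical bounded-error calibration parameters and measurement
# (focal length f in px, baseline b in m, disparity d in px).
f = Interval(699.0, 701.0)   # calibration uncertainty on focal length
b = Interval(0.119, 0.121)   # calibration uncertainty on baseline
d = Interval(34.5, 35.5)     # disparity measurement with +/- 0.5 px error

z = f * b / d                # guaranteed enclosure of the true depth
print("depth enclosure:", z)
```

As long as the true values of f, b, and d lie in their stated intervals, the true depth is guaranteed to lie in `z`; this containment property is what allows the paper to report guaranteed bounds on the robot's relative motion.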


Cite this

Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. / Ehambram, Aaronkumar; Voges, Raphael; Wagner, Bernardo.
2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). 2021. p. 1132-1139 (IEEE International Conference on Automation Science and Engineering; Vol. 2021-August).


Ehambram, A, Voges, R & Wagner, B 2021, Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. in 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). IEEE International Conference on Automation Science and Engineering, vol. 2021-August, pp. 1132-1139, 17th International Conference on Automation Science and Engineering (CASE), Lyon, France, 23 Aug 2021. https://doi.org/10.1109/case49439.2021.9551516
Ehambram, A., Voges, R., & Wagner, B. (2021). Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE) (pp. 1132-1139). (IEEE International Conference on Automation Science and Engineering; Vol. 2021-August). https://doi.org/10.1109/case49439.2021.9551516
Ehambram A, Voges R, Wagner B. Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). 2021. p. 1132-1139. (IEEE International Conference on Automation Science and Engineering). doi: 10.1109/case49439.2021.9551516
Ehambram, Aaronkumar ; Voges, Raphael ; Wagner, Bernardo. / Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods. 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE). 2021. pp. 1132-1139 (IEEE International Conference on Automation Science and Engineering).
@inproceedings{d3bf616f558c4c4e8e31262c2cc42b4a,
title = "Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods",
abstract = "Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.",
author = "Aaronkumar Ehambram and Raphael Voges and Bernardo Wagner",
note = "Funding Information: This work was supported by the German Research Foundation (DFG) as part of the Research Training Group i.c.sens [RTG 2159].; 17th International Conference on Automation Science and Engineering (CASE) ; Conference date: 23-08-2021 Through 27-08-2021",
year = "2021",
doi = "10.1109/case49439.2021.9551516",
language = "English",
isbn = "978-1-6654-4809-3",
series = "IEEE International Conference on Automation Science and Engineering",
pages = "1132--1139",
booktitle = "2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)",

}


TY - GEN

T1 - Stereo-Visual-LiDAR Sensor Fusion Using Set-Membership Methods

AU - Ehambram, Aaronkumar

AU - Voges, Raphael

AU - Wagner, Bernardo

N1 - Conference code: 17

PY - 2021

Y1 - 2021

N2 - Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.

AB - Taking advantage of the complementary error characteristics of Light Detection and Ranging (LiDAR) and stereo camera reconstruction, we propose a set-membership-based method for fusing LiDAR information with dense stereo data under consideration of interval uncertainty of all measurements and calibration parameters. Employing interval analysis, we can propagate the uncertainties to the extraction of distinct features in a straightforward manner. To show the applicability of our approach, we use the fused information for dead reckoning. In contrast to other works, we can consistently propagate the sensor uncertainties to the localization of the robot. Further, we can provide guaranteed bounds for the relative motion between consecutive frames. Using real data we validate that our approach is indeed able to always enclose the true pose of the robot.

UR - http://www.scopus.com/inward/record.url?scp=85117044055&partnerID=8YFLogxK

U2 - 10.1109/case49439.2021.9551516

DO - 10.1109/case49439.2021.9551516

M3 - Conference contribution

SN - 978-1-6654-4809-3

T3 - IEEE International Conference on Automation Science and Engineering

SP - 1132

EP - 1139

BT - 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE)

T2 - 17th International Conference on Automation Science and Engineering (CASE)

Y2 - 23 August 2021 through 27 August 2021

ER -
