
Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility.

Publication: Contribution to conference › Paper › Research › Peer review

Authors

Organisational units

Research metrics
  • Citation Indexes: 15
  • Readers: 53

Details

Original language: English
Pages: 2685-2690
Number of pages: 6
Publication status: Published - 13 Dec. 2017
Event: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 - Vancouver, Canada
Duration: 24 Sept. 2017 - 28 Sept. 2017

Conference

Conference: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017
Country/Territory: Canada
City: Vancouver
Period: 24 Sept. 2017 - 28 Sept. 2017

Abstract

LiDAR scanners are commonly used for mapping and localization with mobile robots, but they cannot see through the occlusions that occur in harsh environments containing smoke, fog, or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution and cannot represent an environment in the same quality. In this article, we present the integration of fused LiDAR and radar data into a SLAM cycle, continuing our work from [1], where we presented first results for a feature-based and a scan-matching-based approach to SLAM in environments with changing visibility using LiDAR and radar sensors. New in this article, the data fusion takes place on scan level as well as on map level and aims at the best possible map quality with respect to the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we model the aerosol concentration from the fused LiDAR and radar data in parallel to the mapping process, using a finite difference model and without involving a smoke or gas sensor. Overall, our method allows modeling the structure of an environment including the dynamic distribution of aerosol concentration.
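The abstract names two computational ingredients: a visibility-dependent fusion of LiDAR and radar scans, and a finite difference model of aerosol concentration that runs alongside mapping without a gas sensor. Since the paper itself is not reproduced on this page, the short Python sketch below only illustrates what such ingredients could look like; the per-beam fusion rule, the diffusion-style update, and all function names, parameters, and thresholds (fuse_scan, diffuse_step, gap_threshold, D, dt, dx) are assumptions of this record, not the authors' implementation.

# Illustrative sketch (not the authors' method): per-beam LiDAR/radar fusion
# and an explicit finite-difference diffusion step on a 2D aerosol grid.
# All names, parameters, and thresholds are assumptions.
import numpy as np

def fuse_scan(lidar_ranges, radar_ranges, gap_threshold=1.0):
    """Per-beam fusion: if the LiDAR return is much shorter than the radar
    return, the LiDAR beam likely hit aerosol, so the radar range is kept
    for the structural map and the LiDAR hit is flagged as aerosol."""
    lidar = np.asarray(lidar_ranges, dtype=float)
    radar = np.asarray(radar_ranges, dtype=float)
    aerosol_hit = (radar - lidar) > gap_threshold
    fused = np.where(aerosol_hit, radar, lidar)
    return fused, aerosol_hit

def diffuse_step(c, source, D=0.1, dt=0.1, dx=0.5):
    """One explicit finite-difference step of dc/dt = D * laplacian(c) + source
    on a 2D grid, with zero-gradient (edge-padded) boundaries."""
    assert D * dt / dx**2 <= 0.25, "explicit scheme unstable for these parameters"
    padded = np.pad(c, 1, mode="edge")
    lap = (padded[2:, 1:-1] + padded[:-2, 1:-1]
           + padded[1:-1, 2:] + padded[1:-1, :-2] - 4.0 * c) / dx**2
    return c + dt * (D * lap + source)

if __name__ == "__main__":
    # Tiny usage example: one cell flagged as an aerosol source feeds the grid.
    grid = np.zeros((20, 20))
    src = np.zeros_like(grid)
    src[10, 10] = 1.0
    for _ in range(200):
        grid = diffuse_step(grid, src)
    fused, hits = fuse_scan([2.0, 5.0, 4.9], [5.1, 5.0, 5.0])
    print("aerosol flagged per beam:", hits, "peak concentration:", round(grid.max(), 3))

The assert keeps the explicit scheme inside its stability limit D*dt/dx^2 <= 1/4; an implicit scheme would remove that constraint at the cost of a linear solve per step.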

ASJC Scopus subject areas

Cite

Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility. / Fritsche, Paul; Wagner, Bernardo.
2017. pp. 2685-2690. Paper presented at 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017, Vancouver, Canada.

Publication: Contribution to conference › Paper › Research › Peer review

Fritsche, P & Wagner, B 2017, 'Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility', paper presented at 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017, Vancouver, Canada, 24 Sept. 2017 - 28 Sept. 2017, pp. 2685-2690. https://doi.org/10.1109/iros.2017.8206093
Fritsche, P., & Wagner, B. (2017). Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility. 2685-2690. Paper presented at 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017, Vancouver, Canada. https://doi.org/10.1109/iros.2017.8206093
Fritsche P, Wagner B. Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility. 2017. Paper presented at 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017, Vancouver, Canada. doi: 10.1109/iros.2017.8206093
Fritsche, Paul ; Wagner, Bernardo. / Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility. Paper presented at 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017, Vancouver, Canada. 6 p.
BibTeX
@conference{1463355034c042ef8c2bef69cd067d1a,
title = "Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility.",
abstract = "LiDAR scanners are commonly used for mapping and localization with mobile robots. But, they cannot see through occlusions, as it occurs in harsh environments, containing smoke, fog or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution, and cannot represent an environment in the same quality. In the following article, we present the integration of fused LiDAR and radar data into a SLAM cycle and continue our work from [1], where we presented first results regarding a feature based and a scan matching-based approach for SLAM in environments with changing visibility using LiDAR and radar sensors. New content in this article, the data fusion takes place on scan level as well as on map level and aims to result in an optimum map quality considering the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model aerosol concentration with fused LiDAR and Radar data in parallel to the mapping process with a finite difference model without involving a smoke or gas sensor. Overall, our method allows the modeling of the structure of an environment including dynamic distribution of aerosol concentration.",
author = "Paul Fritsche and Bernardo Wagner",
note = "DBLP's bibliographic metadata records provided through http://dblp.org/search/publ/api are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions. Funding Information: This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot) 1Paul Fritsche and Bernardo Wagner are with Faculty of Electrical Engineering and Computer Science, Leibniz Universit{\"a}t Hannover, 30167 Hannover, Germany fritsche|wagner@rts.uni-hannover.de Funding Information: ACKNOWLEDGMENT This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot); 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 ; Conference date: 24-09-2017 Through 28-09-2017",
year = "2017",
month = dec,
day = "13",
doi = "10.1109/iros.2017.8206093",
language = "English",
pages = "2685--2690",

}

RIS

TY - CONF

T1 - Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility.

AU - Fritsche, Paul

AU - Wagner, Bernardo

N1 - DBLP's bibliographic metadata records provided through http://dblp.org/search/publ/api are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions. Funding Information: This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot) 1Paul Fritsche and Bernardo Wagner are with Faculty of Electrical Engineering and Computer Science, Leibniz Universität Hannover, 30167 Hannover, Germany fritsche|wagner@rts.uni-hannover.de Funding Information: ACKNOWLEDGMENT This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot)

PY - 2017/12/13

Y1 - 2017/12/13

N2 - LiDAR scanners are commonly used for mapping and localization with mobile robots. But, they cannot see through occlusions, as it occurs in harsh environments, containing smoke, fog or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution, and cannot represent an environment in the same quality. In the following article, we present the integration of fused LiDAR and radar data into a SLAM cycle and continue our work from [1], where we presented first results regarding a feature based and a scan matching-based approach for SLAM in environments with changing visibility using LiDAR and radar sensors. New content in this article, the data fusion takes place on scan level as well as on map level and aims to result in an optimum map quality considering the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model aerosol concentration with fused LiDAR and Radar data in parallel to the mapping process with a finite difference model without involving a smoke or gas sensor. Overall, our method allows the modeling of the structure of an environment including dynamic distribution of aerosol concentration.

AB - LiDAR scanners are commonly used for mapping and localization with mobile robots. But, they cannot see through occlusions, as it occurs in harsh environments, containing smoke, fog or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution, and cannot represent an environment in the same quality. In the following article, we present the integration of fused LiDAR and radar data into a SLAM cycle and continue our work from [1], where we presented first results regarding a feature based and a scan matching-based approach for SLAM in environments with changing visibility using LiDAR and radar sensors. New content in this article, the data fusion takes place on scan level as well as on map level and aims to result in an optimum map quality considering the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model aerosol concentration with fused LiDAR and Radar data in parallel to the mapping process with a finite difference model without involving a smoke or gas sensor. Overall, our method allows the modeling of the structure of an environment including dynamic distribution of aerosol concentration.

UR - http://www.scopus.com/inward/record.url?scp=85040232618&partnerID=8YFLogxK

U2 - 10.1109/iros.2017.8206093

DO - 10.1109/iros.2017.8206093

M3 - Paper

SP - 2685

EP - 2690

T2 - 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017

Y2 - 24 September 2017 through 28 September 2017

ER -

By the same authors