Details
Original language | English |
---|---|
Pages | 2685-2690 |
Number of pages | 6 |
Publication status | Published - 13 Dec 2017 |
Event | 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 - Vancouver, Canada Duration: 24 Sept 2017 → 28 Sept 2017 |
Conference
Conference | 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 |
---|---|
Country/Territory | Canada |
City | Vancouver |
Period | 24 Sept 2017 → 28 Sept 2017 |
Abstract
LiDAR scanners are commonly used for mapping and localization with mobile robots, but they cannot see through the occlusions that occur in harsh environments containing smoke, fog, or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution and cannot represent an environment with the same quality. In this article, we present the integration of fused LiDAR and radar data into a SLAM cycle, continuing our work from [1], where we presented first results on a feature-based and a scan-matching-based approach to SLAM in environments with changing visibility using LiDAR and radar sensors. As new content in this article, the data fusion takes place on the scan level as well as on the map level and aims at an optimal map quality with respect to the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model the aerosol concentration from fused LiDAR and radar data in parallel to the mapping process, using a finite difference model and without involving a smoke or gas sensor. Overall, our method allows modeling the structure of an environment, including the dynamic distribution of aerosol concentration.
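The abstract's finite-difference aerosol model can be illustrated with a minimal sketch: an explicit diffusion update of a 2D concentration grid, which is the core of such a scheme. This is not the authors' implementation; the function name, parameters, and boundary handling below are assumptions chosen for illustration.

```python
import numpy as np

def diffuse(c, D=0.1, dt=0.1, dx=1.0, steps=10):
    """Explicit finite-difference diffusion update of a 2D concentration grid.

    c     : 2D array of aerosol concentration values (illustrative units)
    D     : diffusion coefficient; dt, dx : time step and cell size
    Stability of this explicit scheme requires dt <= dx**2 / (4 * D).
    """
    c = c.astype(float).copy()
    for _ in range(steps):
        # Ghost cells copied from the edge give a zero-flux boundary,
        # so total concentration inside the grid is conserved.
        p = np.pad(c, 1, mode="edge")
        # 5-point Laplacian stencil
        lap = (p[:-2, 1:-1] + p[2:, 1:-1]
               + p[1:-1, :-2] + p[1:-1, 2:] - 4 * c) / dx**2
        c += dt * D * lap
    return c
```

In the paper's setting, the source term for such a grid would come from cells where the radar reports a return but the LiDAR does not (i.e. LiDAR beams absorbed or scattered by aerosol); the sketch above only shows the diffusion step itself.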
ASJC Scopus subject areas
- Engineering (all)
- Control and Systems Engineering
- Computer Science (all)
- Software
- Computer Science (all)
- Computer Vision and Pattern Recognition
- Computer Science (all)
- Computer Science Applications
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
2017. 2685-2690 Paper presented at 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017, Vancouver, Canada.
Publication: Conference contribution › Paper › Research › Peer review
TY - CONF
T1 - Modeling structure and aerosol concentration with fused radar and LiDAR data in environments with changing visibility.
AU - Fritsche, Paul
AU - Wagner, Bernardo
N1 - DBLP's bibliographic metadata records provided through http://dblp.org/search/publ/api are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions. Funding Information: This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot).
PY - 2017/12/13
Y1 - 2017/12/13
N2 - LiDAR scanners are commonly used for mapping and localization with mobile robots, but they cannot see through the occlusions that occur in harsh environments containing smoke, fog, or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution and cannot represent an environment with the same quality. In this article, we present the integration of fused LiDAR and radar data into a SLAM cycle, continuing our work from [1], where we presented first results on a feature-based and a scan-matching-based approach to SLAM in environments with changing visibility using LiDAR and radar sensors. As new content in this article, the data fusion takes place on the scan level as well as on the map level and aims at an optimal map quality with respect to the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model the aerosol concentration from fused LiDAR and radar data in parallel to the mapping process, using a finite difference model and without involving a smoke or gas sensor. Overall, our method allows modeling the structure of an environment, including the dynamic distribution of aerosol concentration.
AB - LiDAR scanners are commonly used for mapping and localization with mobile robots, but they cannot see through the occlusions that occur in harsh environments containing smoke, fog, or dust. Radar scanners can overcome this problem, but they have lower range and angular resolution and cannot represent an environment with the same quality. In this article, we present the integration of fused LiDAR and radar data into a SLAM cycle, continuing our work from [1], where we presented first results on a feature-based and a scan-matching-based approach to SLAM in environments with changing visibility using LiDAR and radar sensors. As new content in this article, the data fusion takes place on the scan level as well as on the map level and aims at an optimal map quality with respect to the visibility situation. Additionally, we collected more data during an indoor experiment involving real fog (see Fig. 1). Besides the structure of the environment, we can model the aerosol concentration from fused LiDAR and radar data in parallel to the mapping process, using a finite difference model and without involving a smoke or gas sensor. Overall, our method allows modeling the structure of an environment, including the dynamic distribution of aerosol concentration.
UR - http://www.scopus.com/inward/record.url?scp=85040232618&partnerID=8YFLogxK
U2 - 10.1109/iros.2017.8206093
DO - 10.1109/iros.2017.8206093
M3 - Paper
SP - 2685
EP - 2690
T2 - 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017
Y2 - 24 September 2017 through 28 September 2017
ER -