Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments.

Publication: Conference contribution › Paper › Research › Peer review

Authorship

Organisational units


Details

Original language: English
Pages: 96-101
Number of pages: 6
Publication status: Published - 26 Oct 2017
Event: 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR) - Shanghai, China
Duration: 11 Oct 2017 - 13 Oct 2017

Conference

Conference: 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)
Abbreviated title: SSRR
Country/Territory: China
City: Shanghai
Period: 11 Oct 2017 - 13 Oct 2017

Abstract

Nowadays, mobile robots are widely used to support fire brigades in search and rescue missions. The utilization of those robots - especially under low visibility conditions due to smoke, fog or dust - is limited. Under these circumstances, environmental perception is still a huge challenge. In this work, we present an approach on using LiDAR, radar and thermal imaging in order to detect hazards that are potentially harmful to the robot or firefighters. We show the benefits of fusing LiDAR and radar before projecting temperatures recorded with a thermal imaging camera onto the range scans. Additionally, a hotspot detection method using the tempered range scans is presented. We demonstrate the functionality of our approach by teleoperating a robot through a smoky room.
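The pipeline described in the abstract (fuse LiDAR and radar range scans, project thermal-camera temperatures onto the fused scan, then flag hotspots) can be illustrated with a minimal 2D sketch. All function names, the fall-back fusion rule, the camera FOV, and the temperature threshold below are assumptions for illustration only, not the paper's actual implementation.

```python
import numpy as np

def fuse_scans(lidar, radar, lidar_max_range=30.0):
    """Per-beam fusion: prefer LiDAR, fall back to radar where the LiDAR
    return is missing (NaN) or saturated, e.g. absorbed by smoke.
    Illustrative rule only -- the paper's fusion may differ."""
    bad = np.isnan(lidar) | (lidar >= lidar_max_range)
    return np.where(bad, radar, lidar)

def project_temperatures(angles, ranges, thermal_row, fov=np.pi / 3, width=80):
    """Map each scan beam inside the thermal camera's horizontal FOV to a
    pixel column and attach that pixel's temperature. Beams outside the
    FOV get NaN. A 1D thermal row stands in for a full image here."""
    cols = ((angles + fov / 2) / fov * (width - 1)).round().astype(int)
    inside = (cols >= 0) & (cols < width)
    temps = np.full_like(ranges, np.nan)
    temps[inside] = thermal_row[cols[inside]]
    return temps

def detect_hotspots(temps, threshold=60.0):
    """Flag beam indices whose projected temperature exceeds a
    (hypothetical) hazard threshold in degrees Celsius."""
    return np.flatnonzero(temps > threshold)
```

A beam blinded by smoke (NaN in the LiDAR scan) is filled from the radar scan, so the subsequent thermal projection still has a range estimate for every direction, which is the benefit of fusing before projecting that the abstract points to.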

ASJC Scopus subject areas

Cite

Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments. / Fritsche, Paul; Zeise, Björn; Hemme, Patrick et al.
2017. 96-101. Paper presented at 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Shanghai, China.


Fritsche, P, Zeise, B, Hemme, P & Wagner, B 2017, 'Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments.', Paper presented at 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Shanghai, China, 11 Oct 2017 - 13 Oct 2017, pp. 96-101. https://doi.org/10.1109/ssrr.2017.8088146
Fritsche, P., Zeise, B., Hemme, P., & Wagner, B. (2017). Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments. 96-101. Paper presented at 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Shanghai, China. https://doi.org/10.1109/ssrr.2017.8088146
Fritsche P, Zeise B, Hemme P, Wagner B. Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments. 2017. Paper presented at 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Shanghai, China. doi: 10.1109/ssrr.2017.8088146
Fritsche, Paul ; Zeise, Björn ; Hemme, Patrick et al. / Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments. Paper presented at 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Shanghai, China. 6 p.
@conference{e5dc1069d1e84a97aba94670d3768741,
title = "Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments.",
abstract = "Nowadays, mobile robots are widely used to support fire brigades in search and rescue missions. The utilization of those robots - especially under low visibility conditions due to smoke, fog or dust - is limited. Under these circumstances, environmental perception is still a huge challenge. In this work, we present an approach on using LiDAR, radar and thermal imaging in order to detect hazards that are potentially harmful to the robot or firefighters. We show the benefits of fusing LiDAR and radar before projecting temperatures recorded with a thermal imaging camera onto the range scans. Additionally, a hotspot detection method using the tempered range scans is presented. We demonstrate the functionality of our approach by teleoperating a robot through a smoky room.",
author = "Paul Fritsche and Bj{\"o}rn Zeise and Patrick Hemme and Bernardo Wagner",
note = "DBLP's bibliographic metadata records provided through http://dblp.org/search/publ/api are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions. Funding Information: ACKNOWLEDGMENT This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot).; 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), SSRR ; Conference date: 11-10-2017 Through 13-10-2017",
year = "2017",
month = oct,
day = "26",
doi = "10.1109/ssrr.2017.8088146",
language = "English",
pages = "96--101",

}


TY - CONF

T1 - Fusion of radar, LiDAR and thermal information for hazard detection in low visibility environments.

AU - Fritsche, Paul

AU - Zeise, Björn

AU - Hemme, Patrick

AU - Wagner, Bernardo

N1 - DBLP's bibliographic metadata records provided through http://dblp.org/search/publ/api are distributed under a Creative Commons CC0 1.0 Universal Public Domain Dedication. Although the bibliographic metadata records are provided consistent with CC0 1.0 Dedication, the content described by the metadata records is not. Content may be subject to copyright, rights of privacy, rights of publicity and other restrictions. Funding Information: ACKNOWLEDGMENT This work has partly been supported within H2020-ICT by the European Commission under grant agreement number 645101 (SmokeBot).

PY - 2017/10/26

Y1 - 2017/10/26

N2 - Nowadays, mobile robots are widely used to support fire brigades in search and rescue missions. The utilization of those robots - especially under low visibility conditions due to smoke, fog or dust - is limited. Under these circumstances, environmental perception is still a huge challenge. In this work, we present an approach on using LiDAR, radar and thermal imaging in order to detect hazards that are potentially harmful to the robot or firefighters. We show the benefits of fusing LiDAR and radar before projecting temperatures recorded with a thermal imaging camera onto the range scans. Additionally, a hotspot detection method using the tempered range scans is presented. We demonstrate the functionality of our approach by teleoperating a robot through a smoky room.

AB - Nowadays, mobile robots are widely used to support fire brigades in search and rescue missions. The utilization of those robots - especially under low visibility conditions due to smoke, fog or dust - is limited. Under these circumstances, environmental perception is still a huge challenge. In this work, we present an approach on using LiDAR, radar and thermal imaging in order to detect hazards that are potentially harmful to the robot or firefighters. We show the benefits of fusing LiDAR and radar before projecting temperatures recorded with a thermal imaging camera onto the range scans. Additionally, a hotspot detection method using the tempered range scans is presented. We demonstrate the functionality of our approach by teleoperating a robot through a smoky room.

UR - http://www.scopus.com/inward/record.url?scp=85040222542&partnerID=8YFLogxK

U2 - 10.1109/ssrr.2017.8088146

DO - 10.1109/ssrr.2017.8088146

M3 - Paper

SP - 96

EP - 101

T2 - 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR)

Y2 - 11 October 2017 through 13 October 2017

ER -
