GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

Organisational units


Details

Original language: English
Title of host publication: Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics
Editors: Oleg Gusikhin, Dimitri Peaucelle, Kurosh Madani
Pages: 19-29
Number of pages: 11
ISBN (electronic): 9789897581984
Publication status: Published - 2016

Abstract

In this paper, a new virtual reality (VR) control concept for operating robots in search and rescue (SAR) scenarios is introduced. The presented approach intuitively provides different sensor signals such as RGB, thermal and active infrared images by projecting them onto 3D structures generated by a Time-of-Flight (ToF)-based depth camera. The multichannel 3D data are displayed using an Oculus Rift head-mounted display, which provides additional head-tracking information. The use of 3D structures can improve the perception of scale and depth by providing stereoscopic images, which cannot be generated from stand-alone 2D images. Besides the described operating concept, the main contributions of this paper are the introduction of a hybrid calibration pattern for multi-sensor calibration and a high-performance 2D-to-3D mapping procedure. To ensure low latencies, all steps of the algorithm are performed in parallel on a graphics processing unit (GPU), which reduces the traditional processing time on a central processing unit (CPU) by 80.03%. Furthermore, different input images are merged according to their importance for the operator to create a multi-sensor point cloud.
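The 2D-to-3D mapping the abstract describes — projecting calibrated 2D sensor images onto the ToF point cloud — can be sketched as a standard pinhole projection. The sketch below is a minimal CPU/NumPy illustration under assumed names (`project_points`, `colour_point_cloud`, intrinsics `K`, extrinsics `R`, `t`); it is not the authors' GPU implementation and omits their importance-based merging of multiple sensor channels.

```python
# Illustrative 2D-to-3D mapping: project ToF points into a calibrated 2D
# camera (RGB/thermal/IR) and sample its pixels to colour the point cloud.
# All names and parameters are assumptions for the sketch, not the paper's API.
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 points (ToF frame) into pixel coordinates of a camera
    with intrinsic matrix K and extrinsics (R, t) from a prior calibration."""
    cam = points_3d @ R.T + t                    # transform into camera frame
    uv = cam @ K.T                               # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3], cam[:, 2]     # normalise; also return depth

def colour_point_cloud(points_3d, image, K, R, t):
    """Sample the image at each projected pixel; points behind the camera or
    outside the image keep a zero colour and are flagged invalid."""
    h, w = image.shape[:2]
    uv, z = project_points(points_3d, K, R, t)
    px = np.round(uv).astype(int)
    ok = (z > 0) & (px[:, 0] >= 0) & (px[:, 0] < w) \
         & (px[:, 1] >= 0) & (px[:, 1] < h)
    colours = np.zeros((len(points_3d), image.shape[2]), image.dtype)
    colours[ok] = image[px[ok, 1], px[ok, 0]]    # note row = v, column = u
    return colours, ok
```

In the paper this per-point lookup is the kind of embarrassingly parallel operation that maps naturally onto one GPU thread per point, which is where the reported speed-up over a sequential CPU loop comes from.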

ASJC Scopus subject areas

Cite

GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality. / Kleinschmidt, Sebastian P.; Wagner, Bernardo.
Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics. Ed. / Oleg Gusikhin; Dimitri Peaucelle; Kurosh Madani. 2016. pp. 19-29.


Kleinschmidt, SP & Wagner, B 2016, GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality. in O Gusikhin, D Peaucelle & K Madani (eds), Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics. pp. 19-29. https://doi.org/10.5220/0005692200190029
Kleinschmidt, S. P., & Wagner, B. (2016). GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality. In O. Gusikhin, D. Peaucelle, & K. Madani (Eds.), Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics (pp. 19-29). https://doi.org/10.5220/0005692200190029
Kleinschmidt SP, Wagner B. GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality. In Gusikhin O, Peaucelle D, Madani K, editors, Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics. 2016. p. 19-29. doi: 10.5220/0005692200190029
Kleinschmidt, Sebastian P. ; Wagner, Bernardo. / GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality. Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics. Ed. / Oleg Gusikhin ; Dimitri Peaucelle ; Kurosh Madani. 2016. pp. 19-29
@inproceedings{ec76c0317fbe43c3b1bf9e0f68ff1431,
title = "GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality",
abstract = "In this paper, a new virtual reality (VR) control concept for operating robots in search and rescue (SAR) scenarios is introduced. The presented approach intuitively provides different sensor signals such as RGB, thermal and active infrared images by projecting them onto 3D structures generated by a Time-of-Flight (ToF)-based depth camera. The multichannel 3D data are displayed using an Oculus Rift head-mounted display, which provides additional head-tracking information. The use of 3D structures can improve the perception of scale and depth by providing stereoscopic images, which cannot be generated from stand-alone 2D images. Besides the described operating concept, the main contributions of this paper are the introduction of a hybrid calibration pattern for multi-sensor calibration and a high-performance 2D-to-3D mapping procedure. To ensure low latencies, all steps of the algorithm are performed in parallel on a graphics processing unit (GPU) which reduces the traditional processing time on a central processing unit (CPU) by 80.03%. Furthermore, different input images are merged according to their importance for the operator to create a multi-sensor point cloud.",
keywords = "Augmented reality, GPU-acceleration, Sensor fusion, Virtual environments",
author = "Kleinschmidt, {Sebastian P.} and Bernardo Wagner",
year = "2016",
doi = "10.5220/0005692200190029",
language = "English",
pages = "19--29",
editor = "Oleg Gusikhin and Dimitri Peaucelle and Kurosh Madani",
booktitle = "Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics",

}


TY - GEN

T1 - GPU-accelerated Multi-sensor 3D Mapping for Remote Control of Mobile Robots using Virtual Reality

AU - Kleinschmidt, Sebastian P.

AU - Wagner, Bernardo

PY - 2016

Y1 - 2016

N2 - In this paper, a new virtual reality (VR) control concept for operating robots in search and rescue (SAR) scenarios is introduced. The presented approach intuitively provides different sensor signals such as RGB, thermal and active infrared images by projecting them onto 3D structures generated by a Time-of-Flight (ToF)-based depth camera. The multichannel 3D data are displayed using an Oculus Rift head-mounted display, which provides additional head-tracking information. The use of 3D structures can improve the perception of scale and depth by providing stereoscopic images, which cannot be generated from stand-alone 2D images. Besides the described operating concept, the main contributions of this paper are the introduction of a hybrid calibration pattern for multi-sensor calibration and a high-performance 2D-to-3D mapping procedure. To ensure low latencies, all steps of the algorithm are performed in parallel on a graphics processing unit (GPU) which reduces the traditional processing time on a central processing unit (CPU) by 80.03%. Furthermore, different input images are merged according to their importance for the operator to create a multi-sensor point cloud.

AB - In this paper, a new virtual reality (VR) control concept for operating robots in search and rescue (SAR) scenarios is introduced. The presented approach intuitively provides different sensor signals such as RGB, thermal and active infrared images by projecting them onto 3D structures generated by a Time-of-Flight (ToF)-based depth camera. The multichannel 3D data are displayed using an Oculus Rift head-mounted display, which provides additional head-tracking information. The use of 3D structures can improve the perception of scale and depth by providing stereoscopic images, which cannot be generated from stand-alone 2D images. Besides the described operating concept, the main contributions of this paper are the introduction of a hybrid calibration pattern for multi-sensor calibration and a high-performance 2D-to-3D mapping procedure. To ensure low latencies, all steps of the algorithm are performed in parallel on a graphics processing unit (GPU) which reduces the traditional processing time on a central processing unit (CPU) by 80.03%. Furthermore, different input images are merged according to their importance for the operator to create a multi-sensor point cloud.

KW - Augmented reality

KW - GPU-acceleration

KW - Sensor fusion

KW - Virtual environments

UR - http://www.scopus.com/inward/record.url?scp=85013066178&partnerID=8YFLogxK

U2 - 10.5220/0005692200190029

DO - 10.5220/0005692200190029

M3 - Conference contribution

SP - 19

EP - 29

BT - Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics

A2 - Gusikhin, Oleg

A2 - Peaucelle, Dimitri

A2 - Madani, Kurosh

ER -
