Improving 3d pedestrian detection for wearable sensor data with 2d human pose

Research output: Contribution to journal › Conference article › Research › peer review

Details

Original language: English
Pages (from-to): 219-226
Number of pages: 8
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 5
Issue number: 4
Publication status: Published - 18 May 2022
Event: 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV - Nice, France
Duration: 6 Jun 2022 - 11 Jun 2022

Abstract

Collisions and safety are central concerns in urban designs such as shared spaces. Since pedestrians (especially elderly and disabled people) are more vulnerable to accidents, an intelligent wearable mobility aid that helps avoid collisions is a promising research direction for improving safety. Moreover, with improvements in visualisation technologies and their ability to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., the HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians; this can serve as a basis for predicting collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular, pedestrian detection), we propose an improvement to the 3D object detection framework Frustum PointNet that incorporates human pose, and apply it to data from an AR device. Using data captured with such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach improve the detection of oriented 3D pedestrian instances over Frustum PointNet.

Keywords

    3D pedestrian detection, augmented reality, human pose estimation, shared space, wearable sensor

Cite this

Improving 3d pedestrian detection for wearable sensor data with 2d human pose. / Kamalasanan, V.; Feng, Y.; Sester, M.
In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 5, No. 4, 18.05.2022, p. 219-226.

@article{ef63174c85664128aaa6a37e16b4f226,
title = "Improving 3d pedestrian detection for wearable sensor data with 2d human pose",
abstract = "Collisions and safety are important concepts when dealing with urban designs like shared spaces. As pedestrians (especially the elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid to avoid collisions is a direction of research that could improve safety using a wearable device. Also, with the improvements in technologies for visualisation and their capabilities to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., Hololens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can be used as basis to predict collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular: pedestrian detection), we propose an improvement to the 3D object detection framework Frustum Pointnet with human pose and apply it on the data from an AR device. Using the data from such a device in an indoor setting, we conducted a comparative study to investigate how high level 2D human pose features in our approach could help to improve the detection performance of orientated 3D pedestrian instances over Frustum Pointnet. ",
keywords = "3D pedestrian detection, augmented reality, human pose estimation, shared space, wearable sensor",
author = "V. Kamalasanan and Y. Feng and M. Sester",
year = "2022",
month = may,
day = "18",
doi = "10.5194/isprs-Annals-V-4-2022-219-2022",
language = "English",
journal = "ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences",
volume = "5",
pages = "219--226",
number = "4",
note = "2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV ; Conference date: 06-06-2022 Through 11-06-2022",
}

TY - JOUR

T1 - Improving 3d pedestrian detection for wearable sensor data with 2d human pose

AU - Kamalasanan, V.

AU - Feng, Y.

AU - Sester, M.

PY - 2022/5/18

Y1 - 2022/5/18

N2 - Collisions and safety are important concepts when dealing with urban designs like shared spaces. As pedestrians (especially the elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid to avoid collisions is a direction of research that could improve safety using a wearable device. Also, with the improvements in technologies for visualisation and their capabilities to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., Hololens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can be used as basis to predict collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular: pedestrian detection), we propose an improvement to the 3D object detection framework Frustum Pointnet with human pose and apply it on the data from an AR device. Using the data from such a device in an indoor setting, we conducted a comparative study to investigate how high level 2D human pose features in our approach could help to improve the detection performance of orientated 3D pedestrian instances over Frustum Pointnet.

AB - Collisions and safety are important concepts when dealing with urban designs like shared spaces. As pedestrians (especially the elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid to avoid collisions is a direction of research that could improve safety using a wearable device. Also, with the improvements in technologies for visualisation and their capabilities to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., Hololens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can be used as basis to predict collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular: pedestrian detection), we propose an improvement to the 3D object detection framework Frustum Pointnet with human pose and apply it on the data from an AR device. Using the data from such a device in an indoor setting, we conducted a comparative study to investigate how high level 2D human pose features in our approach could help to improve the detection performance of orientated 3D pedestrian instances over Frustum Pointnet.

KW - 3D pedestrian detection

KW - augmented reality

KW - human pose estimation

KW - shared space

KW - wearable sensor

UR - http://www.scopus.com/inward/record.url?scp=85132013934&partnerID=8YFLogxK

U2 - 10.5194/isprs-Annals-V-4-2022-219-2022

DO - 10.5194/isprs-Annals-V-4-2022-219-2022

M3 - Conference article

AN - SCOPUS:85132013934

VL - 5

SP - 219

EP - 226

JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

SN - 2194-9042

IS - 4

T2 - 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV

Y2 - 6 June 2022 through 11 June 2022

ER -
