Details
| Original language | English |
| --- | --- |
| Pages (from-to) | 219-226 |
| Number of pages | 8 |
| Journal | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
| Volume | 5 |
| Issue number | 4 |
| Publication status | Published - 18 May 2022 |
| Event | 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV - Nice, France. Duration: 6 Jun 2022 → 11 Jun 2022 |
Abstract
Collisions and safety are important considerations in urban designs such as shared spaces. As pedestrians (especially elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid that avoids collisions is a research direction that could improve safety with a wearable device. Also, with the improvements in visualisation technologies and their capability to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., HoloLens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can serve as a basis for predicting collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular, pedestrian detection), we propose an improvement to the 3D object detection framework Frustum PointNet with human pose and apply it to data from an AR device. Using data from such a device in an indoor setting, we conducted a comparative study to investigate how the high-level 2D human pose features in our approach help to improve the detection performance for oriented 3D pedestrian instances over Frustum PointNet.
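The abstract describes extending Frustum PointNet by fusing high-level 2D human pose features with the frustum point cloud extracted from the AR device's depth data. As a rough illustration of one way such a fusion could look, the sketch below is a minimal PyTorch example; the class name, feature dimensions, and the broadcast-and-concatenate fusion strategy are assumptions for illustration, not the authors' implementation. It encodes the 2D keypoints into a global pose vector and appends it to every point before a PointNet-style segmentation head.

```python
# Minimal sketch (not the authors' method): fusing 2D human-pose keypoints
# with a frustum point cloud before a PointNet-style segmentation head.
import torch
import torch.nn as nn

class PoseFusedPointNetSeg(nn.Module):
    def __init__(self, num_keypoints: int = 17, pose_feat_dim: int = 64):
        super().__init__()
        # Encode flattened 2D keypoints (u, v, confidence) into a global pose feature.
        self.pose_encoder = nn.Sequential(
            nn.Linear(num_keypoints * 3, pose_feat_dim),
            nn.ReLU(),
        )
        # Shared per-point MLP (PointNet-style), realised with 1x1 convolutions.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(3 + pose_feat_dim, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
        )
        # Per-point foreground/background logits (pedestrian vs. clutter).
        self.seg_head = nn.Conv1d(128, 2, 1)

    def forward(self, points: torch.Tensor, keypoints: torch.Tensor) -> torch.Tensor:
        # points:    (B, N, 3) frustum point cloud in camera coordinates
        # keypoints: (B, K, 3) 2D pose keypoints as (u, v, confidence)
        b, n, _ = points.shape
        pose_feat = self.pose_encoder(keypoints.flatten(1))        # (B, F)
        pose_feat = pose_feat.unsqueeze(2).expand(-1, -1, n)       # (B, F, N)
        x = torch.cat([points.transpose(1, 2), pose_feat], dim=1)  # (B, 3+F, N)
        return self.seg_head(self.point_mlp(x))                    # (B, 2, N)

# Random tensors stand in for HoloLens depth points and an off-the-shelf 2D pose estimate.
model = PoseFusedPointNetSeg()
logits = model(torch.randn(2, 1024, 3), torch.rand(2, 17, 3))
print(logits.shape)  # torch.Size([2, 2, 1024])
```

In this sketch the pose features act as an additional per-point cue for segmenting the pedestrian inside the frustum; the paper's actual fusion stage and network configuration may differ.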
Keywords
- 3D pedestrian detection, augmented reality, human pose estimation, shared space, wearable sensor
ASJC Scopus subject areas
- Physics and Astronomy (all)
- Instrumentation
- Environmental Science (all)
- Environmental Science (miscellaneous)
- Earth and Planetary Sciences (all)
- Earth and Planetary Sciences (miscellaneous)
Cite this
Kamalasanan, V., Feng, Y., & Sester, M. (2022). Improving 3D pedestrian detection for wearable sensor data with 2D human pose. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 5, No. 4, 18.05.2022, p. 219-226.
Research output: Contribution to journal › Conference article › Research › peer review
TY - JOUR
T1 - Improving 3D pedestrian detection for wearable sensor data with 2D human pose
AU - Kamalasanan, V.
AU - Feng, Y.
AU - Sester, M.
PY - 2022/5/18
Y1 - 2022/5/18
N2 - Collisions and safety are important concepts when dealing with urban designs like shared spaces. As pedestrians (especially the elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid to avoid collisions is a direction of research that could improve safety using a wearable device. Also, with the improvements in technologies for visualisation and their capabilities to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., Hololens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can be used as basis to predict collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular: pedestrian detection), we propose an improvement to the 3D object detection framework Frustum Pointnet with human pose and apply it on the data from an AR device. Using the data from such a device in an indoor setting, we conducted a comparative study to investigate how high level 2D human pose features in our approach could help to improve the detection performance of orientated 3D pedestrian instances over Frustum Pointnet.
AB - Collisions and safety are important concepts when dealing with urban designs like shared spaces. As pedestrians (especially the elderly and disabled people) are more vulnerable to accidents, realising an intelligent mobility aid to avoid collisions is a direction of research that could improve safety using a wearable device. Also, with the improvements in technologies for visualisation and their capabilities to render 3D virtual content, AR devices could be used to realise virtual infrastructure and virtual traffic systems. Such devices (e.g., Hololens) scan the environment using stereo and ToF (Time-of-Flight) sensors, which in principle can be used to detect surrounding objects, including dynamic agents such as pedestrians. This can be used as basis to predict collisions. To envision an AR device as a safety aid and demonstrate its 3D object detection capability (in particular: pedestrian detection), we propose an improvement to the 3D object detection framework Frustum Pointnet with human pose and apply it on the data from an AR device. Using the data from such a device in an indoor setting, we conducted a comparative study to investigate how high level 2D human pose features in our approach could help to improve the detection performance of orientated 3D pedestrian instances over Frustum Pointnet.
KW - 3D pedestrian detection
KW - augmented reality
KW - human pose estimation
KW - shared space
KW - wearable sensor
UR - http://www.scopus.com/inward/record.url?scp=85132013934&partnerID=8YFLogxK
U2 - 10.5194/isprs-Annals-V-4-2022-219-2022
DO - 10.5194/isprs-Annals-V-4-2022-219-2022
M3 - Conference article
AN - SCOPUS:85132013934
VL - 5
SP - 219
EP - 226
JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
SN - 2194-9042
IS - 4
T2 - 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission IV
Y2 - 6 June 2022 through 11 June 2022
ER -