Generating Evidential BEV Maps in Continuous Driving Space

Publication: Contribution to journal › Article › Research › Peer reviewed

Authors

Yunshuang Yuan, Hao Cheng, Michael Ying Yang, Monika Sester

External organisations

  • University of Twente

Details

Original language: English
Pages (from-to): 27-41
Number of pages: 15
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Volume: 204
Early online date: 8 Sept 2023
Publication status: Published - Oct 2023

Abstract

Safety is critical for autonomous driving, and one aspect of improving safety is to accurately capture the uncertainties of the perception system, especially knowing the unknown. Different from only providing deterministic or probabilistic results, e.g., probabilistic object detection, that only provide partial information for the perception scenario, we propose a complete probabilistic model named GevBEV. It interprets the 2D driving space as a probabilistic Bird's Eye View (BEV) map with point-based spatial Gaussian distributions, from which one can draw evidence as the parameters for the categorical Dirichlet distribution of any new sample point in the continuous driving space. The experimental results show that GevBEV not only provides more reliable uncertainty quantification but also outperforms the previous works on the benchmarks OPV2V and V2V4Real of BEV map interpretation for cooperative perception in simulated and real-world driving scenarios, respectively. A critical factor in cooperative perception is the data transmission size through the communication channels. GevBEV helps reduce communication overhead by selecting only the most important information to share from the learned uncertainty, reducing the average information communicated by 87% with only a slight performance drop. Our code is published at https://github.com/YuanYunshuang/GevBEV.
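
To make the core idea concrete, below is a minimal, hypothetical sketch (not the authors' released code; see the GitHub link above) of how evidence carried by spatial points and weighted by Gaussian kernels can parameterize a Dirichlet distribution at an arbitrary query location, following the standard evidential deep learning formulation (alpha = evidence + 1, vacuity = K / sum(alpha)). The kernel form, bandwidth, and all names are illustrative assumptions, not the paper's exact design.

import numpy as np

def dirichlet_at_query(query, points, evidence, sigma=0.5):
    """Accumulate class evidence at `query` from nearby evidence points.

    query    : (2,)  coordinates of a sample point in continuous BEV space
    points   : (N, 2) coordinates of evidence-carrying points
    evidence : (N, K) non-negative per-class evidence at each point
    sigma    : bandwidth of the isotropic spatial Gaussian kernel (assumed)
    """
    # Spatial Gaussian weight of each evidence point w.r.t. the query
    d2 = np.sum((points - query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))              # (N,)
    e = w @ evidence                                   # (K,) accumulated evidence
    alpha = e + 1.0                                    # Dirichlet concentration
    S = alpha.sum()
    probs = alpha / S                                  # expected class probabilities
    vacuity = evidence.shape[1] / S                    # uncertainty: K / sum(alpha)
    return probs, vacuity

# Toy usage: two classes, three evidence points
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
ev = np.array([[4.0, 0.5], [3.0, 1.0], [0.5, 4.0]])
p, u = dirichlet_at_query(np.array([0.2, 0.1]), pts, ev)
print(p, u)

Under this formulation, a query near strong evidence yields a peaked Dirichlet (low vacuity), while a query far from all evidence falls back toward the uniform prior with vacuity near 1, which is one way the abstract's "knowing the unknown" can be quantified.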

ASJC Scopus subject areas

Cite

Generating Evidential BEV Maps in Continuous Driving Space. / Yuan, Yunshuang; Cheng, Hao; Yang, Michael Ying et al.
In: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 204, 10.2023, p. 27-41.

Publication: Contribution to journal › Article › Research › Peer reviewed

Yuan Y, Cheng H, Yang MY, Sester M. Generating Evidential BEV Maps in Continuous Driving Space. ISPRS Journal of Photogrammetry and Remote Sensing. 2023 Oct;204:27-41. Epub 2023 Sep 8. doi: 10.48550/arXiv.2302.02928, 10.1016/j.isprsjprs.2023.08.013
Yuan, Yunshuang ; Cheng, Hao ; Yang, Michael Ying et al. / Generating Evidential BEV Maps in Continuous Driving Space. In: ISPRS Journal of Photogrammetry and Remote Sensing. 2023 ; Vol. 204. pp. 27-41.
Download (BibTeX)
@article{3d24b78d36fa4de2be5b35bf84f2d785,
title = "Generating Evidential BEV Maps in Continuous Driving Space",
abstract = "Safety is critical for autonomous driving, and one aspect of improving safety is to accurately capture the uncertainties of the perception system, especially knowing the unknown. Different from only providing deterministic or probabilistic results, e.g., probabilistic object detection, that only provide partial information for the perception scenario, we propose a complete probabilistic model named GevBEV. It interprets the 2D driving space as a probabilistic Bird's Eye View (BEV) map with point-based spatial Gaussian distributions, from which one can draw evidence as the parameters for the categorical Dirichlet distribution of any new sample point in the continuous driving space. The experimental results show that GevBEV not only provides more reliable uncertainty quantification but also outperforms the previous works on the benchmarks OPV2V and V2V4Real of BEV map interpretation for cooperative perception in simulated and real-world driving scenarios, respectively. A critical factor in cooperative perception is the data transmission size through the communication channels. GevBEV helps reduce communication overhead by selecting only the most important information to share from the learned uncertainty, reducing the average information communicated by 87% with only a slight performance drop. Our code is published at https://github.com/YuanYunshuang/GevBEV.",
keywords = "cs.CV, Semantic segmentation, Bird's eye view, Cooperative perception, Evidential deep learning",
author = "Yunshuang Yuan and Hao Cheng and Yang, {Michael Ying} and Monika Sester",
note = "This work is supported by Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – 227198829/GRK1931 and MSCA European Postdoctoral Fellowships under the 101062870 – VeVuSafety project.",
year = "2023",
month = oct,
doi = "10.48550/arXiv.2302.02928",
language = "English",
volume = "204",
pages = "27--41",
journal = "ISPRS Journal of Photogrammetry and Remote Sensing",
issn = "0924-2716",
publisher = "Elsevier",

}

Download (RIS)

TY - JOUR

T1 - Generating Evidential BEV Maps in Continuous Driving Space

AU - Yuan, Yunshuang

AU - Cheng, Hao

AU - Yang, Michael Ying

AU - Sester, Monika

N1 - This work is supported by Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – 227198829/GRK1931 and MSCA European Postdoctoral Fellowships under the 101062870 – VeVuSafety project.

PY - 2023/10

Y1 - 2023/10

N2 - Safety is critical for autonomous driving, and one aspect of improving safety is to accurately capture the uncertainties of the perception system, especially knowing the unknown. Different from only providing deterministic or probabilistic results, e.g., probabilistic object detection, that only provide partial information for the perception scenario, we propose a complete probabilistic model named GevBEV. It interprets the 2D driving space as a probabilistic Bird's Eye View (BEV) map with point-based spatial Gaussian distributions, from which one can draw evidence as the parameters for the categorical Dirichlet distribution of any new sample point in the continuous driving space. The experimental results show that GevBEV not only provides more reliable uncertainty quantification but also outperforms the previous works on the benchmarks OPV2V and V2V4Real of BEV map interpretation for cooperative perception in simulated and real-world driving scenarios, respectively. A critical factor in cooperative perception is the data transmission size through the communication channels. GevBEV helps reduce communication overhead by selecting only the most important information to share from the learned uncertainty, reducing the average information communicated by 87% with only a slight performance drop. Our code is published at https://github.com/YuanYunshuang/GevBEV.

AB - Safety is critical for autonomous driving, and one aspect of improving safety is to accurately capture the uncertainties of the perception system, especially knowing the unknown. Different from only providing deterministic or probabilistic results, e.g., probabilistic object detection, that only provide partial information for the perception scenario, we propose a complete probabilistic model named GevBEV. It interprets the 2D driving space as a probabilistic Bird's Eye View (BEV) map with point-based spatial Gaussian distributions, from which one can draw evidence as the parameters for the categorical Dirichlet distribution of any new sample point in the continuous driving space. The experimental results show that GevBEV not only provides more reliable uncertainty quantification but also outperforms the previous works on the benchmarks OPV2V and V2V4Real of BEV map interpretation for cooperative perception in simulated and real-world driving scenarios, respectively. A critical factor in cooperative perception is the data transmission size through the communication channels. GevBEV helps reduce communication overhead by selecting only the most important information to share from the learned uncertainty, reducing the average information communicated by 87% with only a slight performance drop. Our code is published at https://github.com/YuanYunshuang/GevBEV.

KW - cs.CV

KW - Semantic segmentation

KW - Bird's eye view

KW - Cooperative perception

KW - Evidential deep learning

UR - http://www.scopus.com/inward/record.url?scp=85170410578&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2302.02928

DO - 10.48550/arXiv.2302.02928

M3 - Article

VL - 204

SP - 27

EP - 41

JO - ISPRS Journal of Photogrammetry and Remote Sensing

JF - ISPRS Journal of Photogrammetry and Remote Sensing

SN - 0924-2716

ER -
