Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network

Research output: Contribution to journal › Conference article › Research › peer review

Authors

Yuan, Y.; Sester, M.

Details

Original language: English
Pages (from-to): 101-109
Number of pages: 9
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 5
Issue number: 1
Early online date: 17 May 2022
Publication status: Published - 2022
Event: 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission I - Nice, France
Duration: 6 Jun 2022 - 11 Jun 2022

Abstract

Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that only take static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for matching the two point sets. Experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, provided that enough vehicles and poles are correctly detected by both CAVs. Moreover, the proposed method is highly efficient in runtime and can be used in real-time scenarios of autonomous driving.
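The core step described in the abstract, estimating the rigid transform between the keypoint sets of two CAVs (pole positions plus shared vehicle centers) with RANSAC, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the 2D rigid motion model, the 0.2 m inlier threshold, and all function names are assumptions made for the sketch.

```python
import numpy as np

def estimate_rigid_2d(src, dst):
    """Least-squares 2D rigid transform (R, t) mapping src -> dst (Kabsch method)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # 2x2 cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # reject reflections, keep a proper rotation
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def ransac_rigid_2d(src, dst, n_iters=200, thresh=0.2, rng=None):
    """RANSAC over putative keypoint correspondences; thresh is in metres."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iters):
        idx = rng.choice(len(src), size=2, replace=False)  # 2 points fix a 2D rigid motion
        R, t = estimate_rigid_2d(src[idx], dst[idx])
        residuals = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inliers = residuals < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit the transform on the full consensus set
    R, t = estimate_rigid_2d(src[best_inliers], dst[best_inliers])
    return R, t, best_inliers
```

A minimal sample of two correspondences suffices here because a 2D rigid motion has only three degrees of freedom; mismatched vehicle detections then fall out naturally as RANSAC outliers rather than corrupting the correction.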

Keywords

    Collective Perception, Localization, Point Cloud, Registration, Sensor Fusion, Sensor Network


Cite this

Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network. / Yuan, Y.; Sester, M.
In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 5, No. 1, 2022, p. 101-109.


Yuan Y, Sester M. Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2022;5(1):101-109. Epub 2022 May 17. doi: 10.48550/arXiv.2205.09418, 10.5194/isprs-annals-V-1-2022-101-2022
Yuan, Y. ; Sester, M. / Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2022 ; Vol. 5, No. 1. pp. 101-109.
BibTeX
@article{ba782c4c85b74c68ab06d824fef693c8,
title = "Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network",
abstract = "Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that only take static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for matching the two point sets. Experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, provided that enough vehicles and poles are correctly detected by both CAVs. Moreover, the proposed method is highly efficient in runtime and can be used in real-time scenarios of autonomous driving.",
keywords = "Collective Perception, Localization, Point Cloud, Registration, Sensor Fusion, Sensor Network",
author = "Y. Yuan and M. Sester",
year = "2022",
doi = "10.48550/arXiv.2205.09418",
language = "English",
volume = "5",
journal = "ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences",
issn = "2194-9042",
pages = "101--109",
number = "1",
note = "2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission I ; Conference date: 06-06-2022 Through 11-06-2022",

}

RIS

TY - JOUR

T1 - Leveraging Dynamic Objects for Relative Localization Correction in a Connected Autonomous Vehicle Network

AU - Yuan, Y.

AU - Sester, M.

PY - 2022

Y1 - 2022

N2 - Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that only take static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for matching the two point sets. Experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, provided that enough vehicles and poles are correctly detected by both CAVs. Moreover, the proposed method is highly efficient in runtime and can be used in real-time scenarios of autonomous driving.

AB - Highly accurate localization is crucial for the safety and reliability of autonomous driving, especially for the information fusion of collective perception, which aims to further improve road safety by sharing information in a communication network of Connected Autonomous Vehicles (CAVs). In this scenario, small localization errors can impose additional difficulty on fusing the information from different CAVs. In this paper, we propose a RANSAC-based (RANdom SAmple Consensus) method to correct the relative localization errors between two CAVs in order to ease the information fusion among the CAVs. Unlike previous LiDAR-based localization algorithms that only take static environmental information into consideration, this method also leverages dynamic objects for localization, thanks to the real-time data sharing between CAVs. Specifically, in addition to static objects such as poles, fences, and facades, the object centers of the detected dynamic vehicles are also used as keypoints for matching the two point sets. Experiments on the synthetic dataset COMAP show that the proposed method can greatly decrease the relative localization error between two CAVs to less than 20 cm, provided that enough vehicles and poles are correctly detected by both CAVs. Moreover, the proposed method is highly efficient in runtime and can be used in real-time scenarios of autonomous driving.

KW - Collective Perception

KW - Localization

KW - Point Cloud

KW - Registration

KW - Sensor Fusion

KW - Sensor Network

UR - http://www.scopus.com/inward/record.url?scp=85132825218&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2205.09418

DO - 10.48550/arXiv.2205.09418

M3 - Conference article

VL - 5

SP - 101

EP - 109

JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

SN - 2194-9042

IS - 1

T2 - 2022 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Commission I

Y2 - 6 June 2022 through 11 June 2022

ER -
