Camera-LiDAR Calibration Using Total Station
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11023252/ |
| Summary: | With the rapid advancements in autonomous driving in recent years, the importance of the HD (High Definition) maps referenced during autonomous driving has also increased. The mainstream method for creating HD maps uses data collected by an MMS (Mobile Mapping System) equipped with multiple sensors, so accurate fusion of the data from those sensors is required. Typically, calibration between sensors is performed by having them simultaneously measure the same object. However, some sensors on an MMS have non-overlapping FoVs (Fields of View), making it difficult to accurately calibrate them and fuse their data. In this study, we focus on fusing data from a camera and a LiDAR with non-overlapping FoVs. We perform data fusion by estimating the relative pose between the camera and the LiDAR using two identical calibration targets linked by a total station. Total stations were originally developed for civil surveying, but we show that they can also be used for camera-LiDAR calibration in computer vision. Specifically, two boards of the same make are placed in the fields of view of the camera and the LiDAR, respectively, and the relative position of the two boards is measured by the total station, enabling highly accurate camera-LiDAR calibration even when the fields of view do not overlap. In simulation experiments, a translational error of 15.28 mm was achieved, approximately 4 mm smaller than that of the conventional target-based method. |
|---|---|
| ISSN: | 2169-3536 |
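The core idea in the abstract, chaining a camera-to-board pose, a surveyed board-to-board transform, and a LiDAR-to-board pose to recover the camera-LiDAR extrinsics without overlapping views, can be sketched as a composition of homogeneous transforms. The sketch below is illustrative only; the function and frame names are assumptions, not the paper's actual pipeline, and how each pose is measured (e.g. PnP for the camera board, plane fitting for the LiDAR board) is left abstract.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical measured poses (names are illustrative, not from the paper):
#   T_cam_b1   : pose of board 1 in the camera frame (e.g. from PnP on the board pattern)
#   T_b1_b2    : board-1 -> board-2 transform surveyed with the total station
#   T_lidar_b2 : pose of board 2 in the LiDAR frame (e.g. from plane/edge fitting)
def camera_to_lidar_extrinsics(T_cam_b1, T_b1_b2, T_lidar_b2):
    """Chain the three measured transforms to get the camera pose in the LiDAR
    frame, even though the two sensors never observe the same board."""
    # Pose of board 2 in the camera frame, via the surveyed board-to-board link.
    T_cam_b2 = T_cam_b1 @ T_b1_b2
    # T_lidar_cam = T_lidar_b2 * (T_cam_b2)^-1
    return T_lidar_b2 @ np.linalg.inv(T_cam_b2)
```

Because the total station supplies `T_b1_b2` independently of either sensor, the two boards never need to be visible to both the camera and the LiDAR at once, which is what removes the overlapping-FoV requirement.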