Abstract

Sensor calibration is usually a time-consuming yet important task. While classical approaches are sensor-specific and often require calibration targets as well as a widely overlapping field of view (FOV), in this work a cooperative intelligent vehicle is used as the calibration target. The vehicle is detected in the sensor frame and then matched with the information received from the cooperative awareness messages sent by the cooperative intelligent vehicle. The presented algorithm is fully automated as well as sensor-independent, relying only on a very common set of assumptions. Due to the direct registration in the world frame, no overlapping FOV is necessary. The algorithm is evaluated experimentally for four laser scanners as well as one pair of stereo cameras, showing a repetition error within the measurement uncertainty of the sensors. A plausibility check rules out systematic errors that might not have been covered by evaluating the repetition error.
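The abstract describes matching vehicle detections in the sensor frame against the world-frame positions reported in cooperative awareness messages, and registering the sensor directly to the world frame. The paper itself does not provide code; the following is a minimal, hypothetical sketch of such a registration step using a standard least-squares rigid alignment (Kabsch/Umeyama style) over matched position pairs. All function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def estimate_sensor_pose(detections_sensor, cam_positions_world):
    """Rigidly align vehicle detections (sensor frame) to the matched
    CAM-reported positions (world frame) via a least-squares fit.
    Both inputs are (N, 3) arrays of corresponding points, N >= 3.
    Returns (R, t) such that x_world ≈ R @ x_sensor + t."""
    P = np.asarray(detections_sensor, dtype=float)
    Q = np.asarray(cam_positions_world, dtype=float)

    # Center both point sets.
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    P_c, Q_c = P - p_mean, Q - q_mean

    # SVD of the cross-covariance gives the optimal rotation.
    U, _, Vt = np.linalg.svd(P_c.T @ Q_c)
    # Guard against a reflection in the solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T

    # Translation that maps the sensor origin into the world frame.
    t = q_mean - R @ p_mean
    return R, t

# Hypothetical usage with three matched detection/CAM position pairs:
# R, t = estimate_sensor_pose([[1, 0, 0], [2, 1, 0], [4, 0, 0]],
#                             [[10, 5, 0], [11, 6, 0], [13, 5, 0]])
```

Because the CAM positions are already expressed in the world frame, such an alignment yields the sensor's extrinsic pose directly, without requiring any FOV overlap between sensors.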

Comment: 6 pages, published at ITSC 2019


Original document

The different versions of the original document can be found at:

http://dx.doi.org/10.1109/itsc.2019.8917310
https://arxiv.org/abs/1911.01711
https://ui.adsabs.harvard.edu/abs/2019arXiv191101711M/abstract
http://arxiv.org/pdf/1911.01711.pdf,
https://academic.microsoft.com/#/detail/2990557209

Document information

Published on 01/01/2019

Volume 2019, 2019
DOI: 10.1109/itsc.2019.8917310
Licence: CC BY-NC-SA
