Sensor calibration is usually a time-consuming yet important task. While classical approaches are sensor-specific and often require calibration targets as well as widely overlapping fields of view (FOV), in this work a cooperative intelligent vehicle is used as the calibration target. The vehicle is detected in the sensor frame and then matched with the information received from the cooperative awareness messages sent by the cooperative intelligent vehicle. The presented algorithm is fully automated and sensor-independent, relying only on a very common set of assumptions. Due to the direct registration in the world frame, no overlapping FOV is necessary. The algorithm is evaluated experimentally with four laser scanners as well as one pair of stereo cameras, showing a repetition error within the measurement uncertainty of the sensors. A plausibility check rules out systematic errors that might not be covered by evaluating the repetition error.
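The abstract does not spell out the registration step, but the core idea of matching vehicle detections in the sensor frame to the world-frame positions reported in cooperative awareness messages amounts to estimating a rigid sensor-to-world transform. The following is a minimal, hypothetical sketch of that step, assuming matched point pairs are already available and using the standard Kabsch/Umeyama SVD solution; the function name and the synthetic data are illustrative and not taken from the paper.

```python
import numpy as np

def estimate_rigid_transform(sensor_pts, world_pts):
    """Least-squares rigid transform (R, t) mapping sensor-frame points
    onto matched world-frame points via the Kabsch/SVD method."""
    # Centre both point sets
    mu_s = sensor_pts.mean(axis=0)
    mu_w = world_pts.mean(axis=0)
    S = sensor_pts - mu_s
    W = world_pts - mu_w
    # Cross-covariance and its SVD
    H = S.T @ W
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (determinant +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_w - R @ mu_s
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic vehicle track observed in the sensor frame (hypothetical values)
    sensor_pts = rng.uniform(-10.0, 10.0, size=(20, 3))
    # Ground-truth transform used only to fabricate world-frame (CAM) positions
    a = np.deg2rad(30.0)
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    t_true = np.array([100.0, -50.0, 1.5])
    world_pts = sensor_pts @ R_true.T + t_true
    R, t = estimate_rigid_transform(sensor_pts, world_pts)
    print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

In practice the CAM positions would first be converted to a local metric frame (e.g. UTM) and the detections associated over time before solving for the transform.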
Comment: 6 pages, published at ITSC 2019
Published: 2019
DOI: 10.1109/itsc.2019.8917310
Licence: CC BY-NC-SA