Online Rotational Self-Calibration of LiDAR Sensors when Mounted on a Ground Vehicle. Meyer, S. August 2021. Accepted: 2021-08-06T16:05:02Z
Online Rotational Self-Calibration of LiDAR Sensors when Mounted on a Ground Vehicle [link]Paper  abstract   bibtex   
This thesis presents a method for online extrinsic rotational calibration of a LiDAR sensor with minimal manual intervention. The approach leverages the expected geometry of structured environments common to ground vehicle applications to correct for sensor-to-vehicle rotational misalignment without the need for calibration targets, additional sensors, or an accurate model of vehicle kinematics or dynamics. Once an extrinsic calibration transform is estimated by this approach, it can be applied to the raw LiDAR data to transform it into a frame with a known relation to other calibrated sensors and points of interest on the vehicle body, such as a control point or the vehicle frame origin. This approach may be used in applications in which the vehicle follows the trajectory of a distinct, straight section of a road or pathway, and where sensor collection begins with the vehicle at rest on an extended level surface. To estimate the yaw offset, the road trajectory is detected; for the roll and pitch offsets, the orientation of the ground plane in front of the vehicle is estimated. This thesis also proposes a novel road edge and lane line marking detection algorithm capable of detecting edges and markings at arbitrary orientations and locations, as required for estimating the LiDAR yaw offset, and details an approach for estimating the ground plane orientation. The approach estimates roll and pitch offsets of up to 90 degrees, and yaw offsets of up to 45 degrees, with respect to a level LiDAR whose x-axis is aligned with the forward x-axis of the vehicle on which it is mounted. In testing, the estimated calibration corrected dynamic roll and pitch estimates of the ground plane to root mean square errors within 1.20 and 2.11 degrees in roll and pitch respectively over all tested scenarios, and within 0.099 and 0.119 degrees when the ground orientation remained constant with respect to gravity. The road edge orientation detection detected road lines in 80% of tested frames, with a root mean square error of 0.47 degrees in the detected line orientation.
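
The abstract describes two geometric steps: roll and pitch are recovered from the orientation of the ground plane in front of the vehicle, and yaw from the direction of a detected straight road edge or lane line. The sketch below illustrates only those two ideas and is not the thesis's implementation; the inputs ground_pts (LiDAR points already segmented as ground) and road_line_dir (a vector along a detected road line) are hypothetical placeholders for the outputs of the ground segmentation and road line detection stages described in the thesis.

import numpy as np

def ground_normal(ground_pts):
    """Plane normal of the ground points via SVD (direction of least variance)."""
    centered = ground_pts - ground_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]
    return n if n[2] > 0 else -n  # orient the normal upward

def roll_pitch_from_normal(n):
    """Standard tilt angles of the sensor relative to a level ground plane."""
    roll = np.arctan2(n[1], n[2])                    # about the sensor x-axis
    pitch = np.arctan2(-n[0], np.hypot(n[1], n[2]))  # about the sensor y-axis
    return roll, pitch

def leveling_rotation(n):
    """Rotation mapping the measured ground normal onto +z (Rodrigues' formula)."""
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(n, z)
    s, c = np.linalg.norm(v), float(np.dot(n, z))
    if s < 1e-9:
        return np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)

def yaw_from_road_line(road_line_dir, R_level):
    """Heading of the leveled road direction relative to the sensor x-axis."""
    d = R_level @ road_line_dir
    return np.arctan2(d[1], d[0])  # sign depends on the chosen extrinsic convention

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic level ground patch, just to exercise the math end to end.
    ground_pts = np.c_[rng.uniform(-5, 5, 200), rng.uniform(2, 12, 200), np.zeros(200)]
    n = ground_normal(ground_pts)
    roll, pitch = roll_pitch_from_normal(n)
    yaw = yaw_from_road_line(np.array([1.0, 0.2, 0.0]), leveling_rotation(n))
    print("roll/pitch/yaw offsets (deg):", np.degrees([roll, pitch, yaw]))

Fitting the plane by SVD and leveling with Rodrigues' formula sidesteps Euler-angle sign ambiguities when projecting the road direction; in actual use, the thesis's ground-orientation and road-line detection methods would supply the inputs that are synthetic here.
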
@unpublished{meyer_online_2021,
	title = {Online {Rotational} {Self}-{Calibration} of {LiDAR} {Sensors} when {Mounted} on a {Ground} {Vehicle}},
	url = {https://etd.auburn.edu//handle/10415/7938},
	abstract = {This thesis presents a method for online extrinsic rotational calibration of a LiDAR sensor
with minimal manual intervention. The approach leverages the expected geometry of structured
environments common to ground vehicle applications to correct for sensor-to-vehicle rotational
misalignment without the need for calibration targets, additional sensors, or an accurate model of
vehicle kinematics or dynamics. Once an extrinsic calibration transform is estimated by this
approach, it can be applied to the raw LiDAR data to transform it into a frame with a known relation
to other calibrated sensors and points of interest on the vehicle body, such as a control point or
the vehicle frame origin. This approach may be used in applications in which the vehicle follows
the trajectory of a distinct, straight section of a road or pathway, and where sensor collection
begins with the vehicle at rest on an extended level surface. To estimate the yaw offset, the road
trajectory is detected; for the roll and pitch offsets, the orientation of the ground plane in front
of the vehicle is estimated. This thesis also proposes a novel road edge and lane line marking
detection algorithm capable of detecting edges and markings at arbitrary orientations and
locations, as required for estimating the LiDAR yaw offset, and details an approach for
estimating the ground plane orientation. The approach estimates roll and pitch offsets of up to 90 degrees,
and yaw offsets of up to 45 degrees, with respect to a level LiDAR whose x-axis is aligned with the
forward x-axis of the vehicle on which it is mounted. In testing, the estimated calibration
corrected dynamic roll and pitch estimates of the ground plane to root mean square errors
within 1.20 and 2.11 degrees in roll and pitch respectively over all tested scenarios, and within
0.099 and 0.119 degrees when the ground orientation remained constant with
respect to gravity. The road edge orientation detection detected road lines in 80\% of
tested frames, with a root mean square error of 0.47 degrees in the detected line orientation.},
	language = {en},
	urldate = {2024-06-25},
	author = {Meyer, Stephanie},
	month = aug,
	year = {2021},
	note = {Accepted: 2021-08-06T16:05:02Z},
}
