Online Rotational Self-Calibration of LiDAR Sensors when Mounted on a Ground Vehicle. Meyer, S. August 2021. Accepted: 2021-08-06T16:05:02Z
Abstract: This thesis presents a method for online extrinsic rotational calibration of a LiDAR sensor with minimal manual intervention. The approach leverages the expected geometry of structured environments common to ground vehicle applications to correct for sensor-to-vehicle rotational misalignment without the need for calibration targets, additional sensors, or an accurate model of vehicle kinematics or dynamics. Once an extrinsic calibration transform is estimated by this approach, it can be applied to the raw LiDAR data to transform it into a frame with a known relation to other calibrated sensors and to points of interest on the vehicle body, such as a control point or the vehicle frame origin. The approach may be used in applications where the vehicle follows the trajectory of a distinct, straight section of a road or pathway, and where sensor data collection is initiated with the vehicle at rest on an extended level surface. To estimate the yaw offset, the road trajectory is detected; for the roll and pitch offsets, the orientation of the ground plane in front of the vehicle is estimated. This thesis also proposes a novel road edge and lane line marking detection algorithm capable of detecting edges and markings at arbitrary orientations and locations, as required for estimating the LiDAR yaw offset, and details an approach for estimating the ground orientation. The approach estimates roll and pitch offsets of up to 90 degrees and yaw offsets of up to 45 degrees with respect to a level LiDAR whose x-axis is aligned with the forward x-axis of the vehicle on which it is mounted. In testing, the estimated calibration corrected dynamic roll and pitch estimates of the ground plane to within a root mean square error of 1.20 degrees in roll and 2.11 degrees in pitch over all tested scenarios, and to within 0.099 degrees in roll and 0.119 degrees in pitch when the ground orientation remained constant with respect to gravity. The road edge orientation detection detected road lines in 80% of tested frames, with a root mean square error of 0.47 degrees in the detected line orientation.
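The abstract describes two geometric cues: the orientation of the ground plane ahead of the vehicle (for roll and pitch) and the direction of a straight road edge or lane line (for yaw). As a minimal sketch of that general idea, and not the thesis's actual algorithm, the Python snippet below assumes those two quantities have already been estimated in the LiDAR frame (for example, by a RANSAC plane fit and a line detector) and shows how roll, pitch, and yaw offsets could be recovered from them; the function names, angle conventions, and sign choices are illustrative assumptions.

    # Minimal sketch (not the thesis's implementation): recover roll and pitch from a
    # ground-plane normal and yaw from a straight road-line direction, both already
    # estimated in the (miscalibrated) LiDAR frame. The Rx(roll) @ Ry(pitch) angle
    # convention and all sign choices are assumptions made for this illustration.
    import numpy as np

    def roll_pitch_from_ground_normal(normal):
        """Roll and pitch (radians) of the sensor relative to level, given the upward
        ground-plane normal expressed in the LiDAR frame (assumed convention:
        sensor orientation = Rx(roll) @ Ry(pitch) relative to the level frame)."""
        n = np.asarray(normal, dtype=float)
        n = n / np.linalg.norm(n)
        if n[2] < 0.0:                      # make the normal point upward
            n = -n
        roll = np.arctan2(n[1], np.hypot(n[0], n[2]))
        pitch = np.arctan2(-n[0], n[2])
        return roll, pitch

    def yaw_from_road_line(direction):
        """Angle (radians) between the LiDAR x-axis and a detected straight road edge
        or lane line assumed parallel to the vehicle's forward x-axis. A line has no
        preferred direction, so the result is wrapped into (-pi/2, pi/2]; the sign of
        the yaw correction depends on the chosen convention."""
        d = np.asarray(direction, dtype=float)
        angle = np.arctan2(d[1], d[0])
        if angle > np.pi / 2:
            angle -= np.pi
        elif angle <= -np.pi / 2:
            angle += np.pi
        return angle

    if __name__ == "__main__":
        # Ground normal as seen by a sensor rolled about -5 deg and pitched about
        # +3 deg (under the convention above), and a road line at ~20 deg in the
        # LiDAR xy-plane.
        roll, pitch = roll_pitch_from_ground_normal([-0.0521, -0.0872, 0.9948])
        yaw = yaw_from_road_line([0.9397, 0.3420, 0.0])
        print(np.degrees([roll, pitch, yaw]))   # ~[-5.0, 3.0, 20.0]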
@unpublished{meyer_online_2021,
title = {Online {Rotational} {Self}-{Calibration} of {LiDAR} {Sensors} when {Mounted} on a {Ground} {Vehicle}},
url = {https://etd.auburn.edu//handle/10415/7938},
abstract = {This thesis presents a method for online extrinsic rotational calibration of a LiDAR sensor
with minimal manual intervention. The approach leverages the expected geometry of structured
environments common to ground vehicle applications to correct for sensor-to-vehicle rotational
misalignment without the need of calibration targets, additional sensors, or an accurate model of
vehicle kinematics or dynamics. Once an extrinsic calibration transform is estimated by this
approach, it can be applied to the raw LiDAR data to transform it into a frame with known relation
to other calibrated sensors and points of interest on the vehicle body, such as a control point or
vehicle frame origin. This approach may be used in applications that involve the vehicle following
the trajectory of a distinct, straight section of a road or pathway, and where the sensor collection
initiated with the vehicle at rest on an extended level surface. To estimate the yaw offset, the road
trajectory is detected, and for the roll and pitch offsets, the orientation of the ground plane in front
of the vehicle is estimated. This thesis also proposes a novel road edge and lane line marking
detection algorithm capable of detecting the edges and markings at arbitrary orientations and
locations as required for estimating the LiDAR yaw offset, as well as details an approach for
estimating the ground orientations. The approach estimates roll and pitch offsets up to 90 degrees,
and yaw offsets of up to 45 degrees with respect to a level LiDAR with x-axis aligned with the
forward x-axis of the vehicle it is mounted to. In testing, the estimated calibration was able to
correct for dynamic roll and pitch estimations of the ground plane to a root mean square error of
within 1.20 and 2.11 degrees in roll and pitch respectively over all tested scenarios, and was within
0.099 and 0.119 degrees in roll and pitch when the ground orientation remained constant with
respect to gravity. The road edge orientation detection was able to detect road lines in 80\% of
tested frames with a root mean square error of 0.47 degrees in detected line orientation.},
language = {en},
urldate = {2024-06-25},
author = {Meyer, Stephanie},
month = aug,
year = {2021},
note = {Accepted: 2021-08-06T16:05:02Z},
}
{"_id":"gMwjmQepDW3mQvQcb","bibbaseid":"meyer-onlinerotationalselfcalibrationoflidarsensorswhenmountedonagroundvehicle-2021","author_short":["Meyer, S."],"bibdata":{"bibtype":"unpublished","type":"unpublished","title":"Online Rotational Self-Calibration of LiDAR Sensors when Mounted on a Ground Vehicle","url":"https://etd.auburn.edu//handle/10415/7938","abstract":"This thesis presents a method for online extrinsic rotational calibration of a LiDAR sensor with minimal manual intervention. The approach leverages the expected geometry of structured environments common to ground vehicle applications to correct for sensor-to-vehicle rotational misalignment without the need of calibration targets, additional sensors, or an accurate model of vehicle kinematics or dynamics. Once an extrinsic calibration transform is estimated by this approach, it can be applied to the raw LiDAR data to transform it into a frame with known relation to other calibrated sensors and points of interest on the vehicle body, such as a control point or vehicle frame origin. This approach may be used in applications that involve the vehicle following the trajectory of a distinct, straight section of a road or pathway, and where the sensor collection initiated with the vehicle at rest on an extended level surface. To estimate the yaw offset, the road trajectory is detected, and for the roll and pitch offsets, the orientation of the ground plane in front of the vehicle is estimated. This thesis also proposes a novel road edge and lane line marking detection algorithm capable of detecting the edges and markings at arbitrary orientations and locations as required for estimating the LiDAR yaw offset, as well as details an approach for estimating the ground orientations. The approach estimates roll and pitch offsets up to 90 degrees, and yaw offsets of up to 45 degrees with respect to a level LiDAR with x-axis aligned with the forward x-axis of the vehicle it is mounted to. In testing, the estimated calibration was able to correct for dynamic roll and pitch estimations of the ground plane to a root mean square error of within 1.20 and 2.11 degrees in roll and pitch respectively over all tested scenarios, and was within 0.099 and 0.119 degrees in roll and pitch when the ground orientation remained constant with respect to gravity. The road edge orientation detection was able to detect road lines in 80% of tested frames with a root mean square error of 0.47 degrees in detected line orientation.","language":"en","urldate":"2024-06-25","author":[{"propositions":[],"lastnames":["Meyer"],"firstnames":["Stephanie"],"suffixes":[]}],"month":"August","year":"2021","note":"Accepted: 2021-08-06T16:05:02Z","bibtex":"@unpublished{meyer_online_2021,\n\ttitle = {Online {Rotational} {Self}-{Calibration} of {LiDAR} {Sensors} when {Mounted} on a {Ground} {Vehicle}},\n\turl = {https://etd.auburn.edu//handle/10415/7938},\n\tabstract = {This thesis presents a method for online extrinsic rotational calibration of a LiDAR sensor \nwith minimal manual intervention. The approach leverages the expected geometry of structured \nenvironments common to ground vehicle applications to correct for sensor-to-vehicle rotational \nmisalignment without the need of calibration targets, additional sensors, or an accurate model of \nvehicle kinematics or dynamics. 
Once an extrinsic calibration transform is estimated by this \napproach, it can be applied to the raw LiDAR data to transform it into a frame with known relation \nto other calibrated sensors and points of interest on the vehicle body, such as a control point or \nvehicle frame origin. This approach may be used in applications that involve the vehicle following \nthe trajectory of a distinct, straight section of a road or pathway, and where the sensor collection \ninitiated with the vehicle at rest on an extended level surface. To estimate the yaw offset, the road \ntrajectory is detected, and for the roll and pitch offsets, the orientation of the ground plane in front \nof the vehicle is estimated. This thesis also proposes a novel road edge and lane line marking \ndetection algorithm capable of detecting the edges and markings at arbitrary orientations and \nlocations as required for estimating the LiDAR yaw offset, as well as details an approach for \nestimating the ground orientations. The approach estimates roll and pitch offsets up to 90 degrees, \nand yaw offsets of up to 45 degrees with respect to a level LiDAR with x-axis aligned with the \nforward x-axis of the vehicle it is mounted to. In testing, the estimated calibration was able to \ncorrect for dynamic roll and pitch estimations of the ground plane to a root mean square error of \nwithin 1.20 and 2.11 degrees in roll and pitch respectively over all tested scenarios, and was within \n0.099 and 0.119 degrees in roll and pitch when the ground orientation remained constant with \nrespect to gravity. The road edge orientation detection was able to detect road lines in 80\\% of \ntested frames with a root mean square error of 0.47 degrees in detected line orientation.},\n\tlanguage = {en},\n\turldate = {2024-06-25},\n\tauthor = {Meyer, Stephanie},\n\tmonth = aug,\n\tyear = {2021},\n\tnote = {Accepted: 2021-08-06T16:05:02Z},\n}\n\n\n\n","author_short":["Meyer, S."],"key":"meyer_online_2021","id":"meyer_online_2021","bibbaseid":"meyer-onlinerotationalselfcalibrationoflidarsensorswhenmountedonagroundvehicle-2021","role":"author","urls":{"Paper":"https://etd.auburn.edu//handle/10415/7938"},"metadata":{"authorlinks":{}}},"bibtype":"unpublished","biburl":"https://bibbase.org/zotero-group/keb0115/5574615","dataSources":["kDK6fZ4EDThxNKDCP"],"keywords":[],"search_terms":["online","rotational","self","calibration","lidar","sensors","mounted","ground","vehicle","meyer"],"title":"Online Rotational Self-Calibration of LiDAR Sensors when Mounted on a Ground Vehicle","year":2021}