Calibration of a Rotating or Revolving Platform with a LiDAR Sensor. Claer, M., Ferrein, A., & Schiffer, S. Applied Sciences, 9(11):2238, January, 2019.
Perceiving its environment in 3D is an important ability for a modern robot. Today, this is often done using LiDARs which come with a strongly limited field of view (FOV), however. To extend their FOV, the sensors are mounted on driving vehicles in several different ways. This allows 3D perception even with 2D LiDARs if a corresponding localization system or technique is available. Another popular way to gain most information of the scanners is to mount them on a rotating carrier platform. In this way, their measurements in different directions can be collected and transformed into a common frame, in order to achieve a nearly full spherical perception. However, this is only possible if the kinetic chains of the platforms are known exactly, that is, if the LiDAR pose w.r.t. its rotation center is well known. The manual measurement of these chains is often very cumbersome or sometimes even impossible to do with the necessary precision. Our paper proposes a method to calibrate the extrinsic LiDAR parameters by decoupling the rotation from the full six degrees of freedom transform and optimizing both separately. Thus, one error measure for the orientation and one for the translation with known orientation are minimized subsequently with a combination of a consecutive grid search and a gradient descent. Both error measures are inferred from spherical calibration targets. Our experiments with the method suggest that the main influences on the calibration results come from the distance to the calibration targets, the accuracy of their center point estimation and the search grid resolution. However, our proposed calibration method improves the extrinsic parameters even with unfavourable configurations and from inaccurate initial pose guesses.
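The decouple-then-solve idea summarized in the abstract can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: it assumes matched sphere-center coordinates in the sensor and platform frames are already available, and it uses only a coarse grid search over Euler angles in place of the paper's combined grid search and gradient-descent refinement.

```python
import itertools
import numpy as np

def euler_to_matrix(rx, ry, rz):
    """Rotation matrix from X-Y-Z Euler angles (radians)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def calibrate_decoupled(p_sensor, p_world, angle_grid):
    """Estimate (R, t) such that p_world ≈ R @ p_sensor + t.

    Stage 1: grid search over Euler angles, scored by a
    translation-invariant error on the *centered* point sets, so only
    the orientation matters.
    Stage 2: with the orientation fixed, the translation minimizing the
    squared residual follows in closed form from the centroids.
    """
    src = p_sensor - p_sensor.mean(axis=0)
    dst = p_world - p_world.mean(axis=0)
    best_R, best_err = None, np.inf
    for rx, ry, rz in itertools.product(angle_grid, repeat=3):
        R = euler_to_matrix(rx, ry, rz)
        err = np.linalg.norm((R @ src.T).T - dst)
        if err < best_err:
            best_R, best_err = R, err
    # Translation with known orientation: difference of rotated centroids.
    t = p_world.mean(axis=0) - best_R @ p_sensor.mean(axis=0)
    return best_R, t
```

In the paper, the error measures come from estimated centers of spherical calibration targets and the coarse search is refined by gradient descent; here, exact synthetic correspondences and a pure grid search stand in for both.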
@article{Claer:Ferrein:Schiffer_ApplSci2019_Calibration,
  author       = {Claer, Mario and Ferrein, Alexander and Schiffer, Stefan},
  title        = {Calibration of a {{Rotating}} or {{Revolving Platform}} with a {{LiDAR Sensor}}},
  journal      = {Applied Sciences},
  year         = {2019},
  month        = jan,
  volume       = {9},
  number       = {11},
  pages        = {2238},
  article-number = {2238},
  doi          = {10.3390/app9112238},
  url          = {https://www.mdpi.com/2076-3417/9/11/2238},
  issn         = {2076-3417},
  language     = {en},
  copyright    = {http://creativecommons.org/licenses/by/3.0/},
  keywords     = {UPNS4D,ARTUS,calibration,extrinsic parameter,LiDAR,LRF},
  abstract     = {Perceiving its environment in 3D is an important
                  ability for a modern robot. Today, this is often
                  done using LiDARs which come with a strongly limited
                  field of view (FOV), however. To extend their FOV,
                  the sensors are mounted on driving vehicles in
                  several different ways. This allows 3D perception
                  even with 2D LiDARs if a corresponding localization
                  system or technique is available. Another popular
                  way to gain most information of the scanners is to
                  mount them on a rotating carrier platform. In this
                  way, their measurements in different directions can
                  be collected and transformed into a common frame, in
                  order to achieve a nearly full spherical
                  perception. However, this is only possible if the
                  kinetic chains of the platforms are known exactly,
                  that is, if the LiDAR pose w.r.t. its rotation
                  center is well known. The manual measurement of
                  these chains is often very cumbersome or sometimes
                  even impossible to do with the necessary
                  precision. Our paper proposes a method to calibrate
                  the extrinsic LiDAR parameters by decoupling the
                  rotation from the full six degrees of freedom
                  transform and optimizing both separately. Thus, one
                  error measure for the orientation and one for the
                  translation with known orientation are minimized
                  subsequently with a combination of a consecutive
                  grid search and a gradient descent. Both error
                  measures are inferred from spherical calibration
                  targets. Our experiments with the method suggest
                  that the main influences on the calibration results
                  come from the distance to the calibration
                  targets, the accuracy of their center point
                  estimation and the search grid resolution. However,
                  our proposed calibration method improves the
                  extrinsic parameters even with unfavourable
                  configurations and from inaccurate initial pose
                  guesses.},
}
