Caruso, D., Engel, J., & Cremers, D. Large-scale direct SLAM for omnidirectional cameras. IEEE International Conference on Intelligent Robots and Systems, 2015-Decem:141-148, 2015. doi: 10.1109/IROS.2015.7353366. Paper: https://bibbase.org/service/mendeley/bfbbf840-4c42-3914-a463-19024f50b30c/file/689254de-107f-11c5-5520-0be8153da602/caruso2015_omni_lsdslam.pdf.pdf

Abstract: We propose a real-time, direct monocular SLAM method for omnidirectional or wide field-of-view fisheye cameras. Both tracking (direct image alignment) and mapping (pixel-wise distance filtering) are formulated directly for the unified omnidirectional model, which can represent central imaging devices with a field of view above 180°. This is in contrast to existing direct mono-SLAM approaches such as DTAM or LSD-SLAM, which operate on rectified images, in practice limiting the field of view to around 130° diagonally. Not only does this allow us to observe, and reconstruct, a larger portion of the surrounding environment, it also makes the system more robust to degenerate (rotation-only) movement. The two main contributions are (1) the formulation of direct image alignment for the unified omnidirectional model, and (2) a fast yet accurate approach to incremental stereo directly on distorted images. We evaluate our framework on real-world sequences taken with a 185° fisheye lens and compare it to a rectified and a piecewise-rectified approach.
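The unified omnidirectional model the abstract refers to projects a 3D point first onto the unit sphere and then through a pinhole whose center is shifted by a mirror parameter ξ along the optical axis, which is what lets it represent fields of view beyond 180°. A minimal sketch of that projection (function and parameter names are illustrative, not taken from the paper's code):

```python
import numpy as np

def unified_project(X, fx, fy, cx, cy, xi):
    """Project a 3D point with the unified omnidirectional camera model.

    The point is normalized onto the unit sphere, the projection center
    is shifted by xi along the optical axis, and a standard pinhole
    projection with intrinsics (fx, fy, cx, cy) follows.
    """
    X = np.asarray(X, dtype=float)
    d = np.linalg.norm(X)        # distance to the projection center
    denom = X[2] + xi * d        # depth after the xi-shift along z
    u = fx * X[0] / denom + cx
    v = fy * X[1] / denom + cy
    return np.array([u, v])
```

For ξ = 0 this reduces to the ordinary pinhole model; larger ξ compresses peripheral rays toward the image center, which is how a single smooth model covers fisheye lenses such as the 185° one used in the paper's evaluation.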
@article{caruso2015largescale,
  title = {Large-scale direct SLAM for omnidirectional cameras},
  author = {Caruso, David and Engel, Jakob and Cremers, Daniel},
  journal = {IEEE International Conference on Intelligent Robots and Systems},
  year = {2015},
  volume = {2015-Decem},
  pages = {141-148},
  doi = {10.1109/IROS.2015.7353366},
  keywords = {Cameras, Computational modeling, Lenses, Nonlinear distortion, Simultaneous localization and mapping, Three-dimensional displays},
  abstract = {We propose a real-time, direct monocular SLAM method for omnidirectional or wide field-of-view fisheye cameras. Both tracking (direct image alignment) and mapping (pixel-wise distance filtering) are formulated directly for the unified omnidirectional model, which can represent central imaging devices with a field of view above 180°. This is in contrast to existing direct mono-SLAM approaches such as DTAM or LSD-SLAM, which operate on rectified images, in practice limiting the field of view to around 130° diagonally. Not only does this allow us to observe, and reconstruct, a larger portion of the surrounding environment, it also makes the system more robust to degenerate (rotation-only) movement. The two main contributions are (1) the formulation of direct image alignment for the unified omnidirectional model, and (2) a fast yet accurate approach to incremental stereo directly on distorted images. We evaluate our framework on real-world sequences taken with a 185° fisheye lens and compare it to a rectified and a piecewise-rectified approach.}
}