2021 (3)
Whole-Body MPC and Online Gait Sequence Generation for Wheeled-Legged Robots.
Bjelonic, M.; Grandia, R.; Harley, O.; Galliard, C.; Zimmermann, S.; and Hutter, M.
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2021.
\n\n\n\n \n \n \"Whole-Body pdf\n  \n \n \n \"Whole-Body video\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 72 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{bjelonic2021wholebody,\n  author    = {Bjelonic, Marko and\n               Grandia, Ruben and\n               Harley, Oliver and\n               Galliard, Cla and\n               Zimmermann, Samuel and\n               Hutter, Marco},\n  title     = {Whole-Body MPC and Online Gait Sequence Generation for\n               Wheeled-Legged Robots},\n  journal   = {IEEE/RSJ International Conference on Intelligent Robots and\n               Systems (IROS)},\n  year      = {2021},\n  abstract  = {Our paper proposes a model predictive controller as a single-task\n               formulation that simultaneously optimizes wheel and torso motions.\n               This online joint velocity and ground reaction force optimization\n               integrates a kinodynamic model of a wheeled quadrupedal robot.\n               It defines the single rigid body dynamics along with the robot's\n               kinematics while treating the wheels as moving ground contacts.\n               With this approach, we can accurately capture the robot's rolling\n               constraint and dynamics, enabling automatic discovery of hybrid\n               maneuvers without needless motion heuristics. The formulation's\n               generality through the simultaneous optimization over the robot's\n               whole-body variables allows for a single set of parameters and \n               makes online gait sequence adaptation possible. Aperiodic gait\n               sequences are automatically found through kinematic leg utilities\n               without the need for predefined contact and lift-off timings,\n               reducing the cost of transport by up to 85%. Our experiments\n               demonstrate dynamic motions on a quadrupedal robot with non-steerable\n               wheels in challenging indoor and outdoor environments. The paper's\n               findings contribute to evaluating a decomposed, i.e., sequential\n               optimization of wheel and torso motion, and single-task motion planner\n               with a novel quantity, the prediction error, which describes how well a\n               receding horizon planner can predict the robot’s future state. To this end,\n               we report an improvement of up to 71% using our proposed\n               single-task approach, making fast locomotion feasible\n               and revealing wheeled-legged robots' full potential.},\n  keywords  = {legged robots, wheeled robots,\n               motion and path planning, optimization and optimal control},\n  url_pdf   = {files/2021_iros_bjelonic.pdf},\n  url_video = {https://youtu.be/_rPvKlvyw2w}\n}\n\n
\n
\n\n\n

Our paper proposes a model predictive controller as a single-task formulation that simultaneously optimizes wheel and torso motions. This online joint velocity and ground reaction force optimization integrates a kinodynamic model of a wheeled quadrupedal robot. It defines the single rigid body dynamics along with the robot's kinematics while treating the wheels as moving ground contacts. With this approach, we can accurately capture the robot's rolling constraint and dynamics, enabling automatic discovery of hybrid maneuvers without needless motion heuristics. The formulation's generality through the simultaneous optimization over the robot's whole-body variables allows for a single set of parameters and makes online gait sequence adaptation possible. Aperiodic gait sequences are automatically found through kinematic leg utilities without the need for predefined contact and lift-off timings, reducing the cost of transport by up to 85%. Our experiments demonstrate dynamic motions on a quadrupedal robot with non-steerable wheels in challenging indoor and outdoor environments. The paper's findings contribute to evaluating a decomposed, i.e., sequential optimization of wheel and torso motion, and single-task motion planner with a novel quantity, the prediction error, which describes how well a receding horizon planner can predict the robot's future state. To this end, we report an improvement of up to 71% using our proposed single-task approach, making fast locomotion feasible and revealing wheeled-legged robots' full potential.
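
Two quantities in the abstract above benefit from a definition. The cost of transport (the basis of the 85% figure) is the standard dimensionless efficiency measure used in legged robotics, and the prediction error compares the state predicted by the receding-horizon planner with the state actually reached one horizon later. In generic notation (a hedged reading, not the paper's exact definitions):

\mathrm{CoT} = \frac{E_{\mathrm{consumed}}}{m\, g\, d},
\qquad
e_{\mathrm{pred}}(t) = \bigl\| \mathbf{x}_{\mathrm{meas}}(t + T_h) - \hat{\mathbf{x}}_{t}(t + T_h) \bigr\|,

where E_consumed is the energy spent to travel the distance d, m the robot mass, g the gravitational acceleration, T_h the prediction horizon, and \hat{\mathbf{x}}_t the state trajectory predicted at time t.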

Collision-Free MPC for Legged Robots in Static and Dynamic Scenes.
Gaertner, M.; Bjelonic, M.; Farshidian, F.; and Hutter, M.
IEEE International Conference on Robotics and Automation (ICRA). 2021.
\n\n\n\n \n \n \"Collision-Free pdf\n  \n \n \n \"Collision-Free video\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 17 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{gaertner2021collision,\n  author    = {Gaertner, Magnus and\n               Bjelonic, Marko and\n               Farshidian, Farbod and\n               Hutter, Marco},\n  title     = {Collision-Free MPC for Legged Robots in Static and Dynamic Scenes},\n  journal   = {IEEE International Conference on Robotics and Automation (ICRA)},\n  year      = {2021},\n  abstract  = {We present a model predictive controller (MPC) that automatically\n               discovers collision-free locomotion while simultaneously taking into account\n               the system dynamics, friction constraints, and kinematic limitations.\n               A relaxed barrier function is added to the optimization's cost function,\n               leading to collision avoidance behavior without increasing the problem's\n               computational complexity. Our holistic approach does not require any\n               heuristics and enables legged robots to find whole-body motions in the\n               presence of static and dynamic obstacles. We use a dynamically generated\n               euclidean signed distance field for static collision checking. Collision\n               checking for dynamic obstacles is modeled with moving cylinders, increasing\n               the responsiveness to fast-moving agents. Furthermore, we include a Kalman\n               filter motion prediction for moving obstacles into our receding horizon\n               planning, enabling the robot to anticipate possible future collisions. Our\n               experiments demonstrate collision-free motions on a quadrupedal robot in\n               challenging indoor environments. The robot handles complex scenes like\n               overhanging obstacles and dynamic agents by exploring motions at the robot's\n               dynamic and kinematic limits.},\n  keywords  = {legged robots, MPC, collision-free planing, mapping},\n  url_pdf   = {https://arxiv.org/pdf/2103.13987.pdf},\n  url_video = {https://youtu.be/_wkqCVz3gdg},\n}\n\n
\n
\n\n\n

We present a model predictive controller (MPC) that automatically discovers collision-free locomotion while simultaneously taking into account the system dynamics, friction constraints, and kinematic limitations. A relaxed barrier function is added to the optimization's cost function, leading to collision avoidance behavior without increasing the problem's computational complexity. Our holistic approach does not require any heuristics and enables legged robots to find whole-body motions in the presence of static and dynamic obstacles. We use a dynamically generated Euclidean signed distance field for static collision checking. Collision checking for dynamic obstacles is modeled with moving cylinders, increasing the responsiveness to fast-moving agents. Furthermore, we include a Kalman filter motion prediction for moving obstacles into our receding horizon planning, enabling the robot to anticipate possible future collisions. Our experiments demonstrate collision-free motions on a quadrupedal robot in challenging indoor environments. The robot handles complex scenes like overhanging obstacles and dynamic agents by exploring motions at the robot's dynamic and kinematic limits.
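
The relaxed barrier function mentioned above is, in the MPC literature, commonly a logarithmic barrier with a quadratic extension below a small margin, so the cost stays finite and smooth even when a constraint is slightly violated. Below is a minimal numerical sketch of that construction, my own illustration rather than the authors' implementation; delta (relaxation margin) and mu (weight) are assumed parameters.

import numpy as np

# Relaxed logarithmic barrier for a constraint h(x) >= 0, e.g. h = signed
# distance to the nearest obstacle: exact -log(h) for h >= delta, quadratic
# extension below delta so the penalty remains finite and C1-smooth.
def relaxed_log_barrier(h, delta=0.1, mu=1.0):
    h = np.asarray(h, dtype=float)
    log_part = -np.log(np.maximum(h, delta))   # exact barrier branch (h >= delta)
    quad_part = 0.5 * (((h - 2.0 * delta) / delta) ** 2 - 1.0) - np.log(delta)
    return mu * np.where(h >= delta, log_part, quad_part)

# The penalty grows smoothly as the clearance shrinks and keeps growing
# quadratically once the margin is violated.
for clearance in (1.0, 0.2, 0.05, -0.05):
    print(clearance, relaxed_log_barrier(clearance))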

CERBERUS: Autonomous Legged and Aerial Robotic Exploration in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge.
Tranzatto, M.; Mascarich, F.; Bernreiter, L.; Godinho, C.; Camurri, M.; Khattak, S. M. K.; Dang, T.; Reijgwart, V.; Loeje, J.; Wisth, D.; Zimmermann, S.; Nguyen, H.; Fehr, M.; Solanka, L.; Buchanan, R.; Bjelonic, M.; Khedekar, N.; Valceschini, M.; Jenelten, F.; Dharmadhikari, M.; Homberger, T.; De Petris, P.; Wellhausen, L.; Kulkarni, M.; Miki, T.; Hirsch, S.; Montenegro, M.; Papachristos, C.; Tresoldi, F.; Carius, J.; Valsecchi, G.; Lee, J.; Meyer, K.; Wu, X.; Nieto, J.; Smith, A.; Hutter, M.; Siegwart, R. Y.; Mueller, M.; Fallon, M.; and Alexis, K.
Journal of Field Robotics. 2021.
\n\n\n\n \n \n \"CERBERUS: pdf\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 5 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n\n\n\n
\n
@article{tranzatto2021cerberus,\n  author    = {Marco Tranzatto, Frank Mascarich, Lukas Bernreiter,\n               Carolina Godinho, Marco Camurri, Shehryar Masaud Khan Khattak,\n               Tung Dang, Victor Reijgwart, Johannes Loeje, David Wisth,\n               Samuel Zimmermann, Huan Nguyen, Marius Fehr, Lukas Solanka,\n               Russell Buchanan, Marko Bjelonic, Nikhil Khedekar, Mathieu Valceschini,\n               Fabian Jenelten, Mihir Dharmadhikari, Timon Homberger, Paolo De Petris,\n               Lorenz Wellhausen, Mihir Kulkarni, Takahiro Miki, Satchel Hirsch,\n               Markus Montenegro, Christos Papachristos, Fabian Tresoldi, Jan Carius,\n               Giorgio Valsecchi, Joonho Lee, Konrad Meyer, Xiangyu Wu, Juan Nieto,\n               Andy Smith, Marco Hutter, Roland Y Siegwart, Mark Mueller,\n               Maurice Fallon, Kostas Alexis},\n  title     = {CERBERUS: Autonomous Legged and Aerial Robotic Exploration in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge},\n  journal   = {Journal of Field Robotics},\n  year      = {2021},\n  abstract  = {Autonomous exploration of subterranean environments constitutes a\n               major frontier for robotic systems as underground settings present key\n               challenges that can render robot autonomy hard to achieve. This has\n               motivated the DARPA Subterranean Challenge, where teams of robots search\n               for objects of interest in various underground environments. In response,\n               the CERBERUS system-of-systems is presented as a unified strategy towards\n               subterranean exploration using legged and flying robots.  As primary robots,\n               ANYmal quadruped systems are deployed considering their endurance and potential\n               to traverse challenging terrain. For aerial robots, both conventional and\n               collision-tolerant multirotors are utilized to explore spaces too narrow or\n               otherwise unreachable by ground systems. Anticipating degraded sensing\n               conditions, a complementary multi-modal sensor fusion approach utilizing\n               camera, LiDAR, and inertial data for resilient robot pose estimation is\n               proposed. Individual robot pose estimates are refined by a centralized\n               multi-robot map optimization approach to improve the reported location\n               accuracy of detected objects of interest in the DARPA-defined coordinate\n               frame. Furthermore, a unified exploration path planning policy is presented\n               to facilitate the autonomous operation of both legged and aerial robots in\n               complex underground networks. Finally, to enable communication between the\n               robots and the base station, CERBERUS utilizes a ground rover with a high-gain\n               antenna and an optical fiber connection to the base station, alongside breadcrumbing\n               of wireless nodes by our legged robots. We report results from the CERBERUS\n               system-of-systems deployment at the DARPA Subterranean Challenge Tunnel and Urban Circuits,\n               along with the current limitations and the lessons learned for the benefit of the community.},\n  url_pdf   = {https://www.research-collection.ethz.ch/handle/20.500.11850/489726},\n}\n
\n
\n\n\n

Autonomous exploration of subterranean environments constitutes a major frontier for robotic systems as underground settings present key challenges that can render robot autonomy hard to achieve. This has motivated the DARPA Subterranean Challenge, where teams of robots search for objects of interest in various underground environments. In response, the CERBERUS system-of-systems is presented as a unified strategy towards subterranean exploration using legged and flying robots. As primary robots, ANYmal quadruped systems are deployed considering their endurance and potential to traverse challenging terrain. For aerial robots, both conventional and collision-tolerant multirotors are utilized to explore spaces too narrow or otherwise unreachable by ground systems. Anticipating degraded sensing conditions, a complementary multi-modal sensor fusion approach utilizing camera, LiDAR, and inertial data for resilient robot pose estimation is proposed. Individual robot pose estimates are refined by a centralized multi-robot map optimization approach to improve the reported location accuracy of detected objects of interest in the DARPA-defined coordinate frame. Furthermore, a unified exploration path planning policy is presented to facilitate the autonomous operation of both legged and aerial robots in complex underground networks. Finally, to enable communication between the robots and the base station, CERBERUS utilizes a ground rover with a high-gain antenna and an optical fiber connection to the base station, alongside breadcrumbing of wireless nodes by our legged robots. We report results from the CERBERUS system-of-systems deployment at the DARPA Subterranean Challenge Tunnel and Urban Circuits, along with the current limitations and the lessons learned for the benefit of the community.

2020 (4)

Rolling in the Deep - Hybrid Locomotion for Wheeled-Legged Robots using Online Trajectory Optimization.
Bjelonic, M.; Sankar, P. K.; Bellicoso, C. D.; Vallery, H.; and Hutter, M.
IEEE Robotics and Automation Letters, 5(2): 3626-3633. 2020.
\n\n\n\n \n \n \"Rolling pdf\n  \n \n \n \"Rolling video\n  \n \n \n \"Rolling link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 28 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{bjelonic2020rolling,\n  author    = {Bjelonic, Marko and\n               Sankar, Prajish K. and\n               Bellicoso, Carmine D and\n               Vallery, Heike and\n               Hutter, Marco},\n  title     = {Rolling in the Deep - Hybrid Locomotion for Wheeled-Legged\n               Robots using Online Trajectory Optimization},\n  journal   = {IEEE Robotics and Automation Letters},\n  volume    = {5},\n  number    = {2},\n  pages     = {3626-3633},\n  doi       = {10.1109/LRA.2020.2979661},\n  year      = {2020},\n  abstract  = {Wheeled-legged robots have the potential for highly agile and\n               versatile locomotion. The combination of legs and wheels might\n               be a solution for any real-world application requiring rapid,\n               and long-distance mobility skills on challenging terrain. In\n               this paper, we present an online trajectory optimization\n               framework for wheeled quadrupedal robots capable of executing\n               hybrid walking-driving locomotion strategies. By breaking down\n               the optimization problem into a wheel and base trajectory\n               planning, locomotion planning for high dimensional\n               wheeled-legged robots becomes more tractable, can be solved in\n               real-time on-board in a model predictive control fashion, and\n               becomes robust against unpredicted disturbances. The reference\n               motions are tracked by a hierarchical whole-body controller that\n               sends torque commands to the robot. Our approach is verified on\n               a quadrupedal robot that is fully torque-controlled, including\n               the non-steerable wheels attached to its legs. The robot\n               performs hybrid locomotion with different gait sequences on flat\n               and rough terrain. In addition, we validated the robotic\n               platform at the Defense Advanced Research Projects Agency\n               (DARPA) Subterranean Challenge, where the robot rapidly maps,\n               navigates, and explores dynamic underground environments.},\n  keywords  = {legged robots, wheeled robots,\n               motion and path planning, optimization and optimal control},\n  url_pdf   = {files/2020_ral_bjelonic.pdf},\n  url_video = {https://youtu.be/ukY0vyM-yfY},\n  url_link  = {https://ieeexplore.ieee.org/document/9028228},\n  url_link  = {https://youtu.be/tf_twcbF4P4},\n}\n\n
\n
\n\n\n

Wheeled-legged robots have the potential for highly agile and versatile locomotion. The combination of legs and wheels might be a solution for any real-world application requiring rapid and long-distance mobility skills on challenging terrain. In this paper, we present an online trajectory optimization framework for wheeled quadrupedal robots capable of executing hybrid walking-driving locomotion strategies. By breaking down the optimization problem into a wheel and base trajectory planning, locomotion planning for high dimensional wheeled-legged robots becomes more tractable, can be solved in real-time on-board in a model predictive control fashion, and becomes robust against unpredicted disturbances. The reference motions are tracked by a hierarchical whole-body controller that sends torque commands to the robot. Our approach is verified on a quadrupedal robot that is fully torque-controlled, including the non-steerable wheels attached to its legs. The robot performs hybrid locomotion with different gait sequences on flat and rough terrain. In addition, we validated the robotic platform at the Defense Advanced Research Projects Agency (DARPA) Subterranean Challenge, where the robot rapidly maps, navigates, and explores dynamic underground environments.

Trajectory Optimization for Wheeled-Legged Quadrupedal Robots Driving in Challenging Terrain.
Medeiros, S. V.; Jelavic, E.; Bjelonic, M.; Siegwart, R.; Meggiolaro, A. M.; and Hutter, M.
IEEE Robotics and Automation Letters. 2020.
\n\n\n\n \n \n \"Trajectory pdf\n  \n \n \n \"Trajectory video\n  \n \n \n \"Trajectory link\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 18 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{medeiros2020trajectory,\n  author    = {Medeiros, S. Vivian and\n               Jelavic, Edo and\n               Bjelonic, Marko and\n               Siegwart, Roland and\n               Meggiolaro, A. Marco and\n               Hutter, Marco},\n  title     = {Trajectory Optimization for Wheeled-Legged Quadrupedal Robots\n               Driving in Challenging Terrain},\n  journal   = {IEEE Robotics and Automation Letters},\n  year      = {2020},\n  abstract  = {Wheeled-legged robots are an attractive solution for versatile\n               locomotion in challenging terrain. They combine the speed and\n               efficiency of wheels with the ability of legs to traverse\n               challenging terrain. In this paper, we present a trajectory\n               optimization formulation for wheeled-legged robots that\n               optimizes over the base and wheels' positions and forces and\n               takes into account the terrain information while computing the\n               plans. This enables us to find optimal driving motions over\n               challenging terrain. The robot is modeled as a single\n               rigid-body, which allows us to plan complex motions and still\n               keep a low computational complexity to solve the optimization\n               quickly. The terrain map, together with the use of a stability\n               constraint, allows the optimizer to generate feasible motions\n               that cannot be discovered without taking the terrain information\n               into account. The optimization is formulated as a Nonlinear\n               Programming (NLP) problem and the reference motions are tracked\n               by a hierarchical whole-body controller that computes the torque\n               actuation commands for the robot. The trajectories have been\n               experimentally verified on quadrupedal robot ANYmal equipped\n               with non-steerable torque-controlled wheels. Our trajectory\n               optimization framework enables wheeled quadrupedal robots to\n               drive over challenging terrain, e.g., steps, slopes, stairs,\n               while negotiating these obstacles with dynamic motions.},\n  keywords  = {legged robots, wheeled robots, motion planning,\n               optimization and optimal control},\n  url_pdf   = {files/2020_ral_medeiros.pdf},\n  url_video = {https://youtu.be/DlJGFhGS3HM},\n  url_link  = {https://ieeexplore.ieee.org/document/9079567},\n}\n\n
\n
\n\n\n

Wheeled-legged robots are an attractive solution for versatile locomotion in challenging terrain. They combine the speed and efficiency of wheels with the ability of legs to traverse challenging terrain. In this paper, we present a trajectory optimization formulation for wheeled-legged robots that optimizes over the base and wheels' positions and forces and takes into account the terrain information while computing the plans. This enables us to find optimal driving motions over challenging terrain. The robot is modeled as a single rigid-body, which allows us to plan complex motions and still keep a low computational complexity to solve the optimization quickly. The terrain map, together with the use of a stability constraint, allows the optimizer to generate feasible motions that cannot be discovered without taking the terrain information into account. The optimization is formulated as a Nonlinear Programming (NLP) problem and the reference motions are tracked by a hierarchical whole-body controller that computes the torque actuation commands for the robot. The trajectories have been experimentally verified on quadrupedal robot ANYmal equipped with non-steerable torque-controlled wheels. Our trajectory optimization framework enables wheeled quadrupedal robots to drive over challenging terrain, e.g., steps, slopes, stairs, while negotiating these obstacles with dynamic motions.
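
The single rigid-body model mentioned in this abstract is a standard simplification in which the whole robot is lumped into one body driven by gravity and the contact (wheel) forces. In its generic form (the paper's exact formulation and its stability constraint may differ in detail):

m\, \ddot{\mathbf{r}} = m\, \mathbf{g} + \sum_{i=1}^{n_c} \mathbf{f}_i,
\qquad
\frac{\mathrm{d}}{\mathrm{d}t}\bigl( \mathbf{I}\, \boldsymbol{\omega} \bigr) = \sum_{i=1}^{n_c} \bigl( \mathbf{p}_i - \mathbf{r} \bigr) \times \mathbf{f}_i,

where \mathbf{r} is the center-of-mass position, \boldsymbol{\omega} the body angular velocity, \mathbf{I} the rotational inertia, and \mathbf{p}_i, \mathbf{f}_i the position and force of contact i.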

Perceptive Whole Body Planning for Multi-legged Robots in Confined Spaces.
Buchanan, R.; Wellhausen, L.; Bjelonic, M.; Bandyopadhyay, T.; Kottege, N.; and Hutter, M.
Journal of Field Robotics. 2020.
\n\n\n\n \n \n \"Perceptive pdf\n  \n \n \n \"Perceptive video\n  \n \n \n \"Perceptive link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 9 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n\n\n\n
\n
@article{buchanan2020perceptive,\n  author    = {Buchanan, Russel and\n               Wellhausen, Lorenz and\n               Bjelonic, Marko and\n               Bandyopadhyay, Tirthankar and\n               Kottege, Navinda and\n               Hutter, Marco},\n  title     = {Perceptive Whole Body Planning for\n               Multi-legged Robots in Confined Spaces},\n  journal   = {Journal of Field Robotics},\n  doi       = {10.1002/rob.21974},\n  year      = {2020},\n  abstract  = {Legged robots are exceedingly versatile and have the potential to navigate complex, confined spaces due\n               to their many degrees of freedom. As a result of the computational complexity , there exist no online\n               planners for perceptive whole body locomotion of robots in tight spaces. In this paper, we present a\n               new method for perceptive planning for multi-legged robots, which generates body poses, footholds, and\n               swing trajectories for collision avoidance. Measurements from an onboard depth camera are used to create\n               a 3D map of the terrain around the robot. We randomly sample body poses then smooth the resulting\n               trajectory while satisfying several constraints such as robot kinematics and collision avoidance.\n               Footholds and swing trajectories are computed based on the terrain, and the robot body pose is optimized\n               to ensure stable locomotion while not colliding with the environment. Our method is designed to run\n               online on a real robot and generate trajectories several meters long. We first tested our algorithm in\n               several simulations with varied confined spaces using the quadrupedal robot ANYmal. We also simulated\n               experiments with the hexapod robot Weaver to demonstrate applicability to different legged robot\n               configurations. Then, we demonstrated our whole body planner in several online experiments both indoors\n               and in realistic scenarios at an emergency rescue training facility. ANYmal, which has a nominal\n               standing height of 80 cm and width of 59 cm, navigated through several representative disaster areas\n               with openings as small as 60 cm. 3 m trajectories were re-planned with 500 ms update times.},\n  url_pdf   = {https://www.researchgate.net/publication/342078558_Perceptive_Whole_Body_Planning_for_Multi-legged_Robots_in_Confined_Spaces},\n  url_video = {https://youtu.be/C2e0JTdwid0},\n  url_link  = {https://www.researchgate.net/publication/342078558_Perceptive_Whole_Body_Planning_for_Multi-legged_Robots_in_Confined_Spaces}\n}\n\n
\n
\n\n\n

Legged robots are exceedingly versatile and have the potential to navigate complex, confined spaces due to their many degrees of freedom. As a result of the computational complexity, there exist no online planners for perceptive whole body locomotion of robots in tight spaces. In this paper, we present a new method for perceptive planning for multi-legged robots, which generates body poses, footholds, and swing trajectories for collision avoidance. Measurements from an onboard depth camera are used to create a 3D map of the terrain around the robot. We randomly sample body poses, then smooth the resulting trajectory while satisfying several constraints such as robot kinematics and collision avoidance. Footholds and swing trajectories are computed based on the terrain, and the robot body pose is optimized to ensure stable locomotion while not colliding with the environment. Our method is designed to run online on a real robot and generate trajectories several meters long. We first tested our algorithm in several simulations with varied confined spaces using the quadrupedal robot ANYmal. We also simulated experiments with the hexapod robot Weaver to demonstrate applicability to different legged robot configurations. Then, we demonstrated our whole body planner in several online experiments both indoors and in realistic scenarios at an emergency rescue training facility. ANYmal, which has a nominal standing height of 80 cm and width of 59 cm, navigated through several representative disaster areas with openings as small as 60 cm. 3 m trajectories were re-planned with 500 ms update times.
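
One generic way to read the sample-then-smooth step described above is as a constrained smoothing problem over the sampled base poses, with clearance checked against the 3D map built from the depth camera. The following formulation is illustrative only and not the paper's objective:

\min_{p_1,\dots,p_K} \; \sum_{k=2}^{K-1} \bigl\| p_{k+1} - 2\, p_k + p_{k-1} \bigr\|^2
\quad \text{s.t.} \quad d_{\mathrm{map}}(p_k) \ge r_{\mathrm{body}}, \quad p_k \in \mathcal{P}_{\mathrm{kin}},

where d_map(p_k) is the clearance of pose p_k in the map, r_body a conservative body radius, and \mathcal{P}_{\mathrm{kin}} the set of kinematically admissible poses.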

Perceptive Locomotion in Rough Terrain – Online Foothold Optimization.
Jenelten, F.; Miki, T.; Vijayan, A. E.; Bjelonic, M.; and Hutter, M.
IEEE Robotics and Automation Letters, 5(4): 5370-5376. 2020.
\n\n\n\n \n \n \"Perceptive pdf\n  \n \n \n \"Perceptive video\n  \n \n \n \"Perceptive link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 5 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n\n\n\n
\n
@article{jenelten2020perceptive,\n  author    = {Jenelten, Fabian and\n               Miki, Takahiro and\n               Vijayan, Aravind E. and\n               Bjelonic, Marko and\n               Hutter, Marco},\n  title     = {Perceptive Locomotion in Rough Terrain – Online Foothold\n               Optimization},\n  journal   = {IEEE Robotics and Automation Letters},\n  volume    = {5},\n  number    = {4},\n  pages     = {5370-5376},\n  doi       = {10.1109/LRA.2020.3007427},\n  year      = {2020},\n  abstract  = {Compared to wheeled vehicles, legged systems have a vast\n               potential to traverse challenging terrain. To exploit the full\n               potential, it is crucial to tightly integrate terrain\n               perception for foothold planning. We present a hierarchical\n               locomotion planner together with a foothold optimizer that\n               finds locally optimal footholds within an elevation map. The\n               map is generated in real-time from on-board depth sensors. We\n               further propose a terrain-aware contact schedule to deal with\n               actuator velocity limits. We validate the combined locomotion\n               pipeline on our quadrupedal robot ANYmal with a variety of\n               simulated and real-world experiments. We show that our method\n               can cope with stairs and obstacles of heights up to 33% of the\n               robot's leg length.},\n  url_pdf   = {https://www.research-collection.ethz.ch/bitstream/handle/20.500.11850/425596/RAL20___Perceptive_Locomotion_In_Rough_Terrain.pdf?sequence=1&isAllowed=y},\n  url_video = {https://youtu.be/ViecsBmjusI},\n  url_link  = {https://ieeexplore.ieee.org/document/9134750}\n}\n\n
\n
\n\n\n

Compared to wheeled vehicles, legged systems have a vast potential to traverse challenging terrain. To exploit the full potential, it is crucial to tightly integrate terrain perception for foothold planning. We present a hierarchical locomotion planner together with a foothold optimizer that finds locally optimal footholds within an elevation map. The map is generated in real-time from on-board depth sensors. We further propose a terrain-aware contact schedule to deal with actuator velocity limits. We validate the combined locomotion pipeline on our quadrupedal robot ANYmal with a variety of simulated and real-world experiments. We show that our method can cope with stairs and obstacles of heights up to 33% of the robot's leg length.
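
As an illustration of foothold selection on a grid elevation map of the kind described above, the sketch below scores cells in a window around a nominal foothold and keeps the cheapest one. The cost terms (distance to the nominal foothold and local roughness) and all parameters are illustrative assumptions, not the paper's objective or code.

import numpy as np

# Score candidate cells around a nominal foothold on an elevation map and
# return the cheapest one. Roughness of a 3x3 patch stands in for terrain
# quality; the distance term keeps the foothold close to the nominal one.
def select_foothold(elevation, resolution, nominal_idx, window=5,
                    w_dist=1.0, w_rough=5.0):
    rows, cols = elevation.shape
    r0, c0 = nominal_idx
    best_cost, best_idx = np.inf, nominal_idx
    for r in range(max(r0 - window, 1), min(r0 + window, rows - 1)):
        for c in range(max(c0 - window, 1), min(c0 + window, cols - 1)):
            patch = elevation[r - 1:r + 2, c - 1:c + 2]
            roughness = float(np.std(patch))
            dist = resolution * np.hypot(r - r0, c - c0)
            cost = w_dist * dist + w_rough * roughness
            if cost < best_cost:
                best_cost, best_idx = cost, (r, c)
    return best_idx, best_cost

# Synthetic 1 m x 1 m map at 2 cm resolution with a small step next to the
# nominal foothold: the selected foothold moves off the step edge.
emap = np.zeros((50, 50))
emap[23:28, 23:28] = 0.15
print(select_foothold(emap, 0.02, (22, 22)))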

2019 (5)

Trajectory Optimization for Wheeled-Legged Quadrupedal Robots using Linearized ZMP Constraints.
de Viragh, Y.; Bjelonic, M.; Bellicoso, C. D.; Jenelten, F.; and Hutter, M.
IEEE Robotics and Automation Letters, 4(2): 1633-1640. 2019.
\n\n\n\n \n \n \"Trajectory pdf\n  \n \n \n \"Trajectory video\n  \n \n \n \"Trajectory link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 19 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{deviragh2019trajectory,\n  author    = {de Viragh, Yvain and\n               Bjelonic, Marko and\n               Bellicoso, Carmine D and\n               Jenelten, Fabian and\n               Hutter, Marco},\n  title     = {Trajectory Optimization for Wheeled-Legged Quadrupedal Robots\n               using Linearized ZMP Constraints},\n  journal   = {IEEE Robotics and Automation Letters},\n  volume    = {4},\n  number    = {2},\n  pages     = {1633-1640},\n  year      = {2019},\n  doi       = {10.1109/LRA.2019.2896721},\n  abstract  = {We present a trajectory optimizer for quadrupedal\n               robots with actuated wheels. By solving for angular, vertical,\n               and planar components of the base and feet trajectories in a\n               cascaded fashion and by introducing a novel linear formulation of\n               the zero-moment point (ZMP) balance criterion, we rely on\n               quadratic programming only, thereby eliminating the need for\n               nonlinear optimization routines. Yet, even for gaits containing\n               full flight phases, we are able to generate trajectories for\n               executing complex motions that involve simultaneous driving,\n               walking, and turning. We verified our approach in simulations of\n               the quadrupedal robot ANYmal equipped with wheels, where we are\n               able to run the proposed trajectory optimizer at 50 Hz. To the\n               best of our knowledge, this is the first time that such dynamic\n               motions are demonstrated for wheeled-legged quadrupedal robots\n               using an online motion planner.},\n  keywords  = {legged robots, wheeled robots,\n               motion and path planning, optimization and optimal control},\n  url_pdf   = {files/2019_ral_de_viragh.pdf},\n  url_video = {https://youtu.be/I1aTCTc0J4U},\n  url_link  = {https://ieeexplore.ieee.org/document/8630448}\n}\n\n
\n
\n\n\n

We present a trajectory optimizer for quadrupedal robots with actuated wheels. By solving for angular, vertical, and planar components of the base and feet trajectories in a cascaded fashion and by introducing a novel linear formulation of the zero-moment point (ZMP) balance criterion, we rely on quadratic programming only, thereby eliminating the need for nonlinear optimization routines. Yet, even for gaits containing full flight phases, we are able to generate trajectories for executing complex motions that involve simultaneous driving, walking, and turning. We verified our approach in simulations of the quadrupedal robot ANYmal equipped with wheels, where we are able to run the proposed trajectory optimizer at 50 Hz. To the best of our knowledge, this is the first time that such dynamic motions are demonstrated for wheeled-legged quadrupedal robots using an online motion planner.
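
As background for the linearization discussed above: for a point mass kept at an approximately constant height z_c, the zero-moment point is an affine function of the horizontal center-of-mass acceleration, and the balance criterion requires it to stay inside the support polygon, which can be written as time-varying linear inequalities. This is the generic textbook form; the paper's wheeled-legged formulation differs in its details:

\mathbf{p}_{\mathrm{zmp}} = \mathbf{r}_{xy} - \frac{z_c}{g}\, \ddot{\mathbf{r}}_{xy},
\qquad
\mathbf{A}(t)\, \mathbf{p}_{\mathrm{zmp}}(t) \le \mathbf{b}(t),

where \mathbf{r}_{xy} is the horizontal center-of-mass position and \mathbf{A}(t), \mathbf{b}(t) encode the convex support polygon of the contacts closed at time t.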

Walking Posture Adaptation for Legged Robot Navigation in Confined Spaces.
Buchanan, R.; Bandyopadhyay, T.; Bjelonic, M.; Wellhausen, L.; Hutter, M.; and Kottege, N.
IEEE Robotics and Automation Letters, 4(2): 2148-2155. 2019.
\n\n\n\n \n \n \"Walking pdf\n  \n \n \n \"Walking link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 6 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n\n\n\n
\n
@article{buchanan2019walking,\n  author    = {Buchanan, Russel and\n               Bandyopadhyay, Tirthankar and\n               Bjelonic, Marko and\n               Wellhausen, Lorenz and\n               Hutter, Marco and\n               Kottege, Navinda},\n  title     = {Walking Posture Adaptation for Legged Robot Navigation in\n               Confined Spaces},\n  journal   = {IEEE Robotics and Automation Letters},\n  volume    = {4},\n  number    = {2},\n  pages     = {2148-2155},\n  doi       = {10.1109/LRA.2019.2899664},\n  year      = {2019},\n  abstract  = {Legged robots have the ability to adapt their walking posture to\n               navigate confined spaces due to their high degrees of freedom.\n               However, this has not been exploited in most common multilegged\n               platforms. This paper presents a deformable bounding box\n               abstraction of the robot model, with accompanying mapping and\n               planning strategies, that enable a legged robot to autonomously\n               change its body shape to navigate confined spaces. The mapping\n               is achieved using robot-centric multi-elevation maps generated\n               with distance sensors carried by the robot. The path planning is\n               based on the trajectory optimisation algorithm CHOMP which\n               creates smooth trajectories while avoiding obstacles. The\n               proposed method has been tested in simulation and implemented on\n               the hexapod robot Weaver, which is 33 cm tall and 82 cm wide\n               when walking normally. We demonstrate navigating under 25 cm\n               overhanging obstacles, through 70 cm wide gaps and over 22 cm\n               high obstacles in both artificial testing spaces and realistic\n               environments, including a subterranean mining tunnel.},\n  keywords  = {legged robots, motion control},\n  url_pdf   = {files/2019_ral_buchanan.pdf},\n  url_link  = {https://ieeexplore.ieee.org/document/8642939}\n}\n\n
\n
\n\n\n

Legged robots have the ability to adapt their walking posture to navigate confined spaces due to their high degrees of freedom. However, this has not been exploited in most common multilegged platforms. This paper presents a deformable bounding box abstraction of the robot model, with accompanying mapping and planning strategies, that enable a legged robot to autonomously change its body shape to navigate confined spaces. The mapping is achieved using robot-centric multi-elevation maps generated with distance sensors carried by the robot. The path planning is based on the trajectory optimisation algorithm CHOMP which creates smooth trajectories while avoiding obstacles. The proposed method has been tested in simulation and implemented on the hexapod robot Weaver, which is 33 cm tall and 82 cm wide when walking normally. We demonstrate navigating under 25 cm overhanging obstacles, through 70 cm wide gaps and over 22 cm high obstacles in both artificial testing spaces and realistic environments, including a subterranean mining tunnel.
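
For context, CHOMP (referenced above) minimizes a weighted sum of an obstacle cost and a smoothness cost over a trajectory \xi. Its usual continuous-time form, quoted here only as background rather than as this paper's exact objective, is

U(\xi) = F_{\mathrm{obs}}(\xi) + \lambda\, F_{\mathrm{smooth}}(\xi),
\qquad
F_{\mathrm{smooth}}(\xi) = \tfrac{1}{2} \int_0^1 \Bigl\| \tfrac{\mathrm{d}}{\mathrm{d}t}\, \xi(t) \Bigr\|^2 \mathrm{d}t,

where F_obs accumulates a workspace obstacle cost along points on the robot body; in the setting above that cost would presumably be derived from the robot-centric multi-elevation maps.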

Keep Rollin' - Whole-Body Motion Control and Planning for Wheeled Quadrupedal Robots.
Bjelonic, M.; Bellicoso, C. D.; de Viragh, Y.; Sako, D.; Tresoldi, F. D.; Jenelten, F.; and Hutter, M.
IEEE Robotics and Automation Letters, 4(2): 2116-2123. 2019.
\n\n\n\n \n \n \"Keep pdf\n  \n \n \n \"Keep video\n  \n \n \n \"Keep link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 26 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{bjelonic2019keep,\n  author    = {Bjelonic, Marko and\n               Bellicoso, Carmine D and\n               de Viragh, Yvain and\n               Sako, Dhionis and\n               Tresoldi, F Dante and\n               Jenelten, Fabian and\n               Hutter, Marco},\n  title     = {Keep Rollin'- Whole-Body Motion Control and Planning for Wheeled\n               Quadrupedal Robots},\n  journal   = {IEEE Robotics and Automation Letters},\n  volume    = {4},\n  number    = {2},\n  pages     = {2116-2123},\n  doi       = {10.1109/LRA.2019.2899750},\n  year      = {2019},\n  abstract  = {We show dynamic locomotion strategies for wheeled quadrupedal\n               robots, which combine the advantages of both walking and\n               driving. The developed optimization framework tightly integrates\n               the additional degrees of freedom introduced by the wheels. Our\n               approach relies on a zero-moment point based motion optimization\n               which continuously updates reference trajectories. The reference\n               motions are tracked by a hierarchical whole-body controller\n               which computes optimal generalized accelerations and contact\n               forces by solving a sequence of prioritized tasks including the\n               nonholonomic rolling constraints. Our approach has been tested\n               on ANYmal, a quadrupedal robot that is fully torque-controlled\n               including the non-steerable wheels attached to its legs. We\n               conducted experiments on flat and inclined terrains as well as\n               over steps, whereby we show that integrating the wheels into the\n               motion control and planning framework results in intuitive\n               motion trajectories, which enable more robust and dynamic\n               locomotion compared to other wheeled-legged robots. Moreover,\n               with a speed of 4 m/s and a reduction of the cost of transport\n               by 83 % we prove the superiority of wheeled-legged robots\n               compared to their legged counterparts.},\n  keywords  = {legged robots, wheeled robots, motion control,\n               motion and path planning, optimization and optimal control},\n  url_pdf   = {files/2019_ral_bjelonic.pdf},\n  url_video = {https://youtu.be/nGLUsyx9Vvc},\n  url_link  = {https://ieeexplore.ieee.org/document/8642912},\n}\n\n
\n
\n\n\n

We show dynamic locomotion strategies for wheeled quadrupedal robots, which combine the advantages of both walking and driving. The developed optimization framework tightly integrates the additional degrees of freedom introduced by the wheels. Our approach relies on a zero-moment point based motion optimization which continuously updates reference trajectories. The reference motions are tracked by a hierarchical whole-body controller which computes optimal generalized accelerations and contact forces by solving a sequence of prioritized tasks including the nonholonomic rolling constraints. Our approach has been tested on ANYmal, a quadrupedal robot that is fully torque-controlled including the non-steerable wheels attached to its legs. We conducted experiments on flat and inclined terrains as well as over steps, whereby we show that integrating the wheels into the motion control and planning framework results in intuitive motion trajectories, which enable more robust and dynamic locomotion compared to other wheeled-legged robots. Moreover, with a speed of 4 m/s and a reduction of the cost of transport by 83%, we prove the superiority of wheeled-legged robots compared to their legged counterparts.
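
The "sequence of prioritized tasks" solved by the whole-body controller above is commonly formalized as a stack of tasks resolved in strict priority order. The sketch below shows the textbook equality-only variant based on null-space projection; it is a simplified stand-in, since the controller described above also handles constraints such as the nonholonomic rolling constraint, which an equality-only scheme does not capture.

import numpy as np

# Strictly prioritized resolution of linear tasks A x = b over the stacked
# decision variables (e.g. generalized accelerations and contact forces):
# each task is solved in the null space of all higher-priority tasks.
def solve_priorities(tasks, n):
    x = np.zeros(n)
    N = np.eye(n)                       # null-space projector of the tasks solved so far
    for A, b in tasks:                  # ordered from highest to lowest priority
        AN = A @ N
        AN_pinv = np.linalg.pinv(AN)
        x = x + N @ AN_pinv @ (b - A @ x)
        N = N @ (np.eye(n) - AN_pinv @ AN)
    return x

# Toy example with two 1D tasks on x in R^2: the first is met exactly and
# the second is resolved within the null space left by the first.
tasks = [(np.array([[1.0, 0.0]]), np.array([1.0])),
         (np.array([[1.0, 1.0]]), np.array([0.0]))]
print(solve_priorities(tasks, 2))       # -> [ 1. -1.]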

ALMA-Articulated Locomotion and Manipulation for a Torque-Controllable Robot.
Bellicoso, C. D.; Kramer, K.; Stäuble, M.; Sako, D.; Jenelten, F.; Bjelonic, M.; and Hutter, M.
IEEE International Conference on Robotics and Automation (ICRA). 2019.
\n\n\n\n \n \n \"ALMA-Articulated pdf\n  \n \n \n \"ALMA-Articulated video\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 4 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{bellicoso2019alma,\n  author    = {Bellicoso, Carmine D and\n               Kramer, Koen and\n               St{\\"a}uble, Markus and\n               Sako, Dhionis and\n               Jenelten, Fabian and\n               Bjelonic, Marko and\n               Hutter, Marco},\n  title     = {ALMA-Articulated Locomotion and Manipulation for a\n               Torque-Controllable Robot},\n  journal   = {IEEE International Conference on Robotics and Automation (ICRA)},\n  year      = {2019},\n  abstract  = {The task of robotic mobile manipulation poses several scientific\n               challenges that need to be addressed to execute complex\n               manipulation tasks in unstructured environments, in which\n               collaboration with humans might be required. Therefore, we\n               present ALMA, a motion planning and control framework for a\n               torque-controlled quadrupedal robot equipped with a six degrees\n               of freedom robotic arm capable of performing dynamic locomotion\n               while executing manipulation tasks. The online motion planning\n               framework, together with a whole-body controller based on a\n               hierarchical optimization algorithm, enables the system to walk,\n               trot and pace while executing operational space end-effector\n               control, reactive human-robot collaboration and torso posture\n               optimization to increase the arm's workspace. The torque control\n               of the whole system enables the implementation of compliant\n               behavior, allowing a user to safely interact with the robot.\n               We verify our framework on the real robot by performing tasks\n               such as opening a door and carrying a payload together with a\n               human.},\n  keywords  = {legged robots, mobile manipulation, optimization and\n               optimal control},\n  url_pdf   = {files/2019_icra_bellicoso.pdf},\n  url_video = {https://youtu.be/XrcLXX4AEWE},\n}\n\n
\n
\n\n\n

The task of robotic mobile manipulation poses several scientific challenges that need to be addressed to execute complex manipulation tasks in unstructured environments, in which collaboration with humans might be required. Therefore, we present ALMA, a motion planning and control framework for a torque-controlled quadrupedal robot equipped with a six degrees of freedom robotic arm capable of performing dynamic locomotion while executing manipulation tasks. The online motion planning framework, together with a whole-body controller based on a hierarchical optimization algorithm, enables the system to walk, trot and pace while executing operational space end-effector control, reactive human-robot collaboration and torso posture optimization to increase the arm's workspace. The torque control of the whole system enables the implementation of compliant behavior, allowing a user to safely interact with the robot. We verify our framework on the real robot by performing tasks such as opening a door and carrying a payload together with a human.

Trajectory Optimization for Wheeled Quadrupedal Robots Driving in Challenging Terrain.
Medeiros, S. V.; Bjelonic, M.; Jelavic, E.; Siegwart, R.; Meggiolaro, A. M.; and Hutter, M.
International Symposium on Adaptive Motion of Animals and Machines (AMAM). 2019.
\n\n\n\n \n \n \"Trajectory pdf\n  \n \n \n \"Trajectory video\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 8 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{medeiros2019trajectory,\n  author    = {Medeiros, S. Vivian and\n               Bjelonic, Marko and\n               Jelavic, Edo and\n               Siegwart, Roland and\n               Meggiolaro, A. Marco and\n               Hutter, Marco},\n  title     = {Trajectory Optimization for Wheeled Quadrupedal Robots Driving\n               in Challenging Terrain},\n  journal   = {International Symposium on Adaptive Motion of Animals and Machines (AMAM)},\n  year      = {2019},\n  abstract  = {We present a trajectory optimizer for driving motions for\n               wheeled-legged quadrupedal robots with actuated wheels.\n               A simplified two-dimensional Single Rigid Body model is\n               used, which allows for fast solutions even for trajectories\n               with a long time horizon. Since the planner has knowledge\n               of the terrain and performs the optimization over the wheels’\n               contact forces as well as its positions, the robot is able to tra-\n               verse challenging terrain, including driving up a step, which\n               to the best of our knowledge was never shown before.},\n  keywords  = {legged robots, wheeled robots, trajectory optimization},\n  url_pdf   = {files/2019_amam_medeiros.pdf},\n  url_video = {https://youtu.be/lELr4stekhQ},\n}\n\n
\n
\n\n\n

We present a trajectory optimizer for driving motions for wheeled-legged quadrupedal robots with actuated wheels. A simplified two-dimensional Single Rigid Body model is used, which allows for fast solutions even for trajectories with a long time horizon. Since the planner has knowledge of the terrain and performs the optimization over the wheels' contact forces as well as their positions, the robot is able to traverse challenging terrain, including driving up a step, which, to the best of our knowledge, was never shown before.

2018 (6)

Towards a generic Solution for Inspection of Industrial Sites.
Hutter, M.; Diethelm, R.; Bachmann, S.; Fankhauser, P.; Gehring, C.; Tsounis, V.; Lauber, A.; Guenther, F.; Bjelonic, M.; Isler, L.; and others.
In Field and Service Robotics, pages 575–589, 2018.
\n\n\n\n \n \n \"Towards video\n  \n \n \n \"Towards link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 1 download\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{hutter2018towards,\n  author    = {Hutter, Marco and\n               Diethelm, Remo and\n               Bachmann, Samuel and\n               Fankhauser, Peter and\n               Gehring, Christian and\n               Tsounis, Vassilios and\n               Lauber, Andreas and\n               Guenther, Fabian and\n               Bjelonic, Marko and\n               Isler, Linus and\n               others},\n  title     = {Towards a generic Solution for Inspection of Industrial Sites},\n  booktitle = {Field and Service Robotics},\n  year      = {2018},\n  pages     = {575--589},\n  doi       = {10.3929/ethz-b-000183150},\n  abstract  = {Autonomous robotic inspection of industrial sites offers a huge\n               potential with respect to increasing human safety and operational\n               efficiency. The present paper provides an insight into the\n               approach taken by team LIO during the ARGOS Challenge. In this\n               international competition, the legged robot ANYmal was equipped\n               with a sensor head to perform visual, acoustic, and thermal\n               inspection on an oil and gas site. The robot was able to\n               autonomously navigate on the outdoor industrial facilty using\n               rotating line-LIDAR sensors for localization and terrain mapping.\n               Thanks to the superior mobility of legged robots, ANYmal can\n               omni-directionally move with statically and dynamically stable\n               gaits while overcoming large obstacles and stairs. Moreover, the\n               versatile machine can adapt its posture for inspection. The paper\n               additionally provides insight into the methods applied for visual\n               inspection of pressure gauges and concludes with some insight\n               into the general learnings from the ARGOS Challenge.},\n  keywords  = {legged robot, quadruped robot, field robotics,\n               series elastic actuation, autonomous navigation},\n  url_video = {https://youtu.be/2RQDp0Q2vSo},\n  url_link  = {https://doi.org/10.3929/ethz-b-000183150},\n}\n\n
\n
\n\n\n

Autonomous robotic inspection of industrial sites offers a huge potential with respect to increasing human safety and operational efficiency. The present paper provides an insight into the approach taken by team LIO during the ARGOS Challenge. In this international competition, the legged robot ANYmal was equipped with a sensor head to perform visual, acoustic, and thermal inspection on an oil and gas site. The robot was able to autonomously navigate on the outdoor industrial facility using rotating line-LIDAR sensors for localization and terrain mapping. Thanks to the superior mobility of legged robots, ANYmal can omni-directionally move with statically and dynamically stable gaits while overcoming large obstacles and stairs. Moreover, the versatile machine can adapt its posture for inspection. The paper additionally provides insight into the methods applied for visual inspection of pressure gauges and concludes with some insight into the general learnings from the ARGOS Challenge.

Robust Rough-Terrain Locomotion with a Quadrupedal Robot.
Fankhauser, P.; Bjelonic, M.; Bellicoso, C. D.; Miki, T.; and Hutter, M.
In IEEE International Conference on Robotics and Automation (ICRA), 2018.
\n\n\n\n \n \n \"Robust video\n  \n \n \n \"Robust link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 4 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{fankhauser2018robust,\n  author    = {Fankhauser, P{\\'e}ter and\n               Bjelonic, Marko and\n               Bellicoso, Carmine D and\n               Miki, Takahiro and\n               Hutter, Marco},\n  title     = {Robust Rough-Terrain Locomotion with a Quadrupedal Robot},\n  booktitle = {IEEE International Conference on Robotics and Automation\n              (ICRA)},\n  year      = {2018},\n  doi       = {10.1109/ICRA.2018.8460731},\n  abstract  = {Robots working in natural, urban, and industrial settings need to\n               be able to navigate challenging environments. In this paper, we\n               present a motion planner for the perceptive rough-terrain\n               locomotion with quadrupedal robots. The planner finds safe\n               footholds along with collision-free swing-leg motions by\n               leveraging an acquired terrain map. To this end, we present a\n               novel pose optimization approach that enables the robot to climb\n               over significant obstacles. We experimentally validate our\n               approach with the quadrupedal robot ANYmal by autonomously\n               traversing obstacles such steps, inclines, and stairs. The\n               locomotion planner re-plans the motion at every step to cope with\n               disturbances and dynamic environments. The robot has no prior\n               knowledge of the scene, and all mapping, state estimation,\n               control, and planning is performed in real-time onboard the\n               robot.},\n  keywords  = {legged locomotion, robot sensing systems, planning,\n               collision avoidance},\n  url_video = {https://youtu.be/CpzQu25iLa0},\n  url_link  = {https://ieeexplore.ieee.org/document/8460731},\n}\n\n
\n
\n\n\n
\n Robots working in natural, urban, and industrial settings need to be able to navigate challenging environments. In this paper, we present a motion planner for perceptive rough-terrain locomotion with quadrupedal robots. The planner finds safe footholds along with collision-free swing-leg motions by leveraging an acquired terrain map. To this end, we present a novel pose optimization approach that enables the robot to climb over significant obstacles. We experimentally validate our approach with the quadrupedal robot ANYmal by autonomously traversing obstacles such as steps, inclines, and stairs. The locomotion planner re-plans the motion at every step to cope with disturbances and dynamic environments. The robot has no prior knowledge of the scene, and all mapping, state estimation, control, and planning are performed in real-time onboard the robot.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n Weaver: Hexapod Robot for Autonomous Navigation on Unstructured Terrain.\n \n \n \n \n\n\n \n Bjelonic, M.; Kottege, N.; Homberger, T.; Borges, P.; Beckerle, P.; and Chli, M.\n\n\n \n\n\n\n Journal of Field Robotics,1063–1079. 2018.\n \n\n\n\n
\n\n\n\n \n \n \"Weaver: pdf\n  \n \n \n \"Weaver: video\n  \n \n \n \"Weaver: link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 3 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{bjelonic2018weaver,\n  author    = {Bjelonic, Marko and\n               Kottege, Navinda and\n               Homberger, Timon and\n               Borges, Paulo and\n               Beckerle, Philipp and\n               Chli, Margarita},\n  title     = {Weaver: Hexapod Robot for Autonomous Navigation on Unstructured\n               Terrain},\n  journal   = {Journal of Field Robotics},\n  year      = {2018},\n  pages     = {1063--1079},\n  doi       = {10.1002/rob.21795},\n  abstract  = {Legged robots are an efficient alternative for navigation in\n               challenging terrain. In this paper we describe Weaver, a\n               six‐legged robot that is designed to perform autonomous\n               navigation in unstructured terrain. It uses stereo vision and\n               proprioceptive sensing based terrain perception for adaptive\n               control while using visual‐inertial odometry for autonomous\n               waypoint‐based navigation. Terrain perception generates a minimal\n               representation of the traversed environment in terms of roughness\n               and step height. This reduces the complexity of the terrain\n               model significantly, enabling the robot to feed back information\n               about the environment into its controller. Furthermore, we\n               combine exteroceptive and proprioceptive sensing to enhance the\n               terrain perception capabilities, especially in situations in\n               which the stereo camera is not able to generate an accurate\n               representation of the environment. The adaptation approach\n               described also exploits the unique properties of legged robots\n               by adapting the virtual stiffness, stride frequency, and stride\n               height. Weaver's unique leg design with five joints per leg\n               improves locomotion on high gradient slopes, and this novel\n               configuration is further analyzed. Using these approaches, we\n               present an experimental evaluation of this fully self‐contained\n               hexapod performing autonomous navigation on a multiterrain\n               testbed and in outdoor terrain.},\n  keywords  = {legged robot, hexapedal robot, rough terrain, terrain perception,\n               control adaptation},\n  url_pdf   = {files/2018_jfr_bjelonic.pdf},\n  url_video = {https://youtu.be/eLMUiX96En0},\n  url_link  = {https://doi.org/10.1002/rob.21795},\n}\n\n
\n
\n\n\n
\n Legged robots are an efficient alternative for navigation in challenging terrain. In this paper we describe Weaver, a six‐legged robot that is designed to perform autonomous navigation in unstructured terrain. It uses stereo vision and proprioceptive sensing based terrain perception for adaptive control while using visual‐inertial odometry for autonomous waypoint‐based navigation. Terrain perception generates a minimal representation of the traversed environment in terms of roughness and step height. This reduces the complexity of the terrain model significantly, enabling the robot to feed back information about the environment into its controller. Furthermore, we combine exteroceptive and proprioceptive sensing to enhance the terrain perception capabilities, especially in situations in which the stereo camera is not able to generate an accurate representation of the environment. The adaptation approach described also exploits the unique properties of legged robots by adapting the virtual stiffness, stride frequency, and stride height. Weaver's unique leg design with five joints per leg improves locomotion on high gradient slopes, and this novel configuration is further analyzed. Using these approaches, we present an experimental evaluation of this fully self‐contained hexapod performing autonomous navigation on a multiterrain testbed and in outdoor terrain.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n Skating with a Force Controlled Quadrupedal Robot.\n \n \n \n \n\n\n \n Bjelonic, M.; Bellicoso, C. D; Tiryaki, M E.; and Hutter, M.\n\n\n \n\n\n\n In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 7555–7561, 2018. \n \n\n\n\n
\n\n\n\n \n \n \"Skating pdf\n  \n \n \n \"Skating video\n  \n \n \n \"Skating link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 6 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{bjelonic2018skating,\n  author    = {Bjelonic, Marko and\n               Bellicoso, Carmine D and\n               Tiryaki, M Efe and\n               Hutter, Marco},\n  title     = {Skating with a Force Controlled Quadrupedal Robot},\n  booktitle = {IEEE/RSJ International Conference on Intelligent Robots and\n               Systems (IROS)},\n  year      = {2018},\n  pages     = {7555--7561},\n  doi       = {10.1109/IROS.2018.8594504},\n  abstract  = {Traditional legged robots are capable of traversing challenging\n               terrain, but lack energy efficiency when compared to wheeled\n               systems operating on flat environments. The combination of both\n               locomotion domains overcomes the trade-off between mobility and\n               efficiency. Therefore, this paper presents a novel motion planner\n               and controller which together enable a legged robot equipped with\n               skates to perform skating maneuvers. These are achieved by an\n               appropriate combination of planned reaction forces and gliding\n               motions. Our novel motion controller formulates a Virtual Model\n               Controller and an optimal contact force distribution which takes\n               into account the nonholonomic constraints introduced by the\n               skates. This approach has been tested on the torque-controllable\n               robot ANYmal equipped with passive wheels and ice skates as\n               end-effectors. We conducted experiments on flat and inclined\n               terrain, whereby we show that skating motions reduce the cost\n               of transport by up to 80{\\,\\%} with respect to traditional walking\n               gaits.},\n  keywords  = {legged locomotion, quadrupedal robot, skating, motion control,\n               force control},\n  url_pdf   = {files/2018_iros_bjelonic.pdf},\n  url_video = {https://youtu.be/fJfAWiylpxw},\n  url_link  = {https://ieeexplore.ieee.org/document/8594504}\n}\n\n
\n
\n\n\n
\n Traditional legged robots are capable of traversing challenging terrain, but lack energy efficiency when compared to wheeled systems operating on flat environments. The combination of both locomotion domains overcomes the trade-off between mobility and efficiency. Therefore, this paper presents a novel motion planner and controller which together enable a legged robot equipped with skates to perform skating maneuvers. These are achieved by an appropriate combination of planned reaction forces and gliding motions. Our novel motion controller formulates a Virtual Model Controller and an optimal contact force distribution which takes into account the nonholonomic constraints introduced by the skates. This approach has been tested on the torque-controllable robot ANYmal equipped with passive wheels and ice skates as end-effectors. We conducted experiments on flat and inclined terrain, whereby we show that skating motions reduce the cost of transport by up to 80% with respect to traditional walking gaits.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n An Adaptive Landing Gear for Extending the Operational Range of Helicopters.\n \n \n \n \n\n\n \n Stolz, B.; Brödermann, T.; Castiello, E.; Engelberger, G.; Erne, D.; Gasser, J.; Hayoz, E.; Müller, S.; Mühlebach, L.; Löw, T.; Scheuer, D.; Vendeventer, L.; Bjelonic, M.; and others\n\n\n \n\n\n\n In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 1757-1763, 2018. \n \n\n\n\n
\n\n\n\n \n \n \"An pdf\n  \n \n \n \"An video\n  \n \n \n \"An link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 1 download\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{stolz2018adaptive,\n  author    = {Stolz, Boris and\n               Br{\\"o}dermann, Tim and\n               Castiello, Enea and\n               Engelberger, Gokula and\n               Erne, Daniel and\n               Gasser, Jan and\n               Hayoz, Eric and\n               M{\\"u}ller, Stephan and\n               M{\\"u}hlebach, Lorin and\n               L{\\"o}w, Tobias and\n               Scheuer, Dominique and\n               Vendeventer, Luca and\n               Bjelonic, Marko and\n               others},\n  title     = {An Adaptive Landing Gear for Extending the Operational Range of\n               Helicopters},\n  booktitle = {IEEE/RSJ International Conference on Intelligent Robots and\n               Systems (IROS)},\n  year      = {2018},\n  pages     = {1757-1763},\n  doi       = {10.1109/IROS.2018.8594062},\n  abstract  = {Conventional skid or wheel based helicopter landing gears\n               severely limit off-field landing possibilities, which are crucial\n               when operating in scenarios such as mountain rescue. In this\n               context, slopes beyond 8° and small obstacles can already pose a\n               substantial hazard. An adaptive landing gear is proposed to\n               overcome these limitations. It consists of four legs with one\n               degree of freedom each. The total weight was minimized to\n               demonstrate economic practicability. This was achieved by an\n               innovative actuation, composed of a parallel arrangement of motor\n               and brake, which relieves the motor from large impact loads\n               during hard landings. The loads are alleviated by a spring-damper\n               system acting in series to the actuation. Each leg is\n               individually force controlled for optimal load distribution on\n               compliant ground and to avoid tipping. The operation of the legs\n               is fully autonomous during the landing phase. A prototype was\n               designed and successfully tested on an unmanned helicopter with a\n               maximum take-off weight of 78 kg. Finally, the implementation of\n               the landing gear concept on aircraft of various scales was\n               discussed.},\n  keywords  = {legged locomotion, quadrupedal robot, skating, motion control,\n               force control},\n  url_pdf   = {files/2018_iros_stolz.pdf},\n  url_video = {https://youtu.be/JtoOWS18D3k},\n  url_link  = {https://ieeexplore.ieee.org/document/8594062}\n}\n\n
\n
\n\n\n
\n Conventional skid or wheel based helicopter landing gears severely limit off-field landing possibilities, which are crucial when operating in scenarios such as mountain rescue. In this context, slopes beyond 8° and small obstacles can already pose a substantial hazard. An adaptive landing gear is proposed to overcome these limitations. It consists of four legs with one degree of freedom each. The total weight was minimized to demonstrate economic practicability. This was achieved by an innovative actuation, composed of a parallel arrangement of motor and brake, which relieves the motor from large impact loads during hard landings. The loads are alleviated by a spring-damper system acting in series to the actuation. Each leg is individually force controlled for optimal load distribution on compliant ground and to avoid tipping. The operation of the legs is fully autonomous during the landing phase. A prototype was designed and successfully tested on an unmanned helicopter with a maximum take-off weight of 78 kg. Finally, the implementation of the landing gear concept on aircraft of various scales was discussed.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n Advances in Real-World Applications for Legged Robots.\n \n \n \n \n\n\n \n Bellicoso, C. D; Bjelonic, M.; Wellhausen, L.; Sako, D.; Holtmann, K.; Guenther, F.; Tranzatto, M.; Fankhauser, P.; and Hutter, M.\n\n\n \n\n\n\n Journal of Field Robotics, 35(8): 1311-1326. 2018.\n \n\n\n\n
\n\n\n\n \n \n \"Advances pdf\n  \n \n \n \"Advances video\n  \n \n \n \"Advances link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 2 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{bellicoso2018advances,\n  author    = {Bellicoso, Carmine D and\n               Bjelonic, Marko and\n               Wellhausen, Lorenz and\n               Sako, Dhionis and\n               Holtmann, Kai and\n               Guenther, Fabian and\n               Tranzatto, Marco and\n               Fankhauser, P{\\'e}ter and\n               Hutter, Marco},\n  title     = {Advances in Real-World Applications for Legged Robots},\n  journal   = {Journal of Field Robotics},\n  volume    = {35},\n  number    = {8},\n  pages     = {1311-1326},\n  doi       = {10.1002/rob.21839},\n  year      = {2018},\n  abstract  = {This paper provides insight into the application of the\n               quadrupedal robot ANYmal in outdoor missions of industrial\n               inspection (ARGOS Challenge) and search and rescue (European\n               Robotics League (ERL)  Emergency Robots). In both competitions,\n               the legged robot had to autonomously and semi-autonomously\n               navigate in real-world scenarios to complete high-level tasks\n               such as inspection and payload delivery. In the ARGOS\n               competition, ANYmal used a rotating LiDAR sensor to\n               localize on the industrial site and map the terrain and obstacles\n               around the robot. In the ERL competition, additional Real-Time\n               Kinematic (RTK)-Global Positioning System (GPS) was used to\n               co-localize the legged robot with respect to a Micro Aerial\n               Vehicle (MAV) that creates maps from the aerial view. The high\n               mobility of legged robots allows overcoming large obstacles, e.g.\n               steps and stairs, with statically and dynamically stable gaits.\n               Moreover, the versatile machine can adapt its posture for\n               inspection and payload delivery. The paper concludes with insight\n               into the general learnings from the ARGOS and ERL challenges.},\n  keywords  = {legged robot, quadrupedal robot, localization, mapping,\n               challenge},\n  url_pdf   = {files/2018_jfr_bellicoso.pdf},\n  url_video = {https://youtu.be/qrJlMze_xhQ},\n  url_link  = {https://doi.org/10.1002/rob.21839}\n}\n\n
\n
\n\n\n
\n This paper provides insight into the application of the quadrupedal robot ANYmal in outdoor missions of industrial inspection (ARGOS Challenge) and search and rescue (European Robotics League (ERL) Emergency Robots). In both competitions, the legged robot had to autonomously and semi-autonomously navigate in real-world scenarios to complete high-level tasks such as inspection and payload delivery. In the ARGOS competition, ANYmal used a rotating LiDAR sensor to localize on the industrial site and map the terrain and obstacles around the robot. In the ERL competition, additional Real-Time Kinematic (RTK)-Global Positioning System (GPS) was used to co-localize the legged robot with respect to a Micro Aerial Vehicle (MAV) that creates maps from the aerial view. The high mobility of legged robots allows overcoming large obstacles, e.g. steps and stairs, with statically and dynamically stable gaits. Moreover, the versatile machine can adapt its posture for inspection and payload delivery. The paper concludes with insight into the general learnings from the ARGOS and ERL challenges.\n
\n\n\n
\n\n\n\n\n\n
\n
\n\n
\n
\n  \n 2017\n \n \n (3)\n \n \n
\n
\n \n \n
\n \n\n \n \n \n \n \n \n The Multilegged Autonomous eXplorer (MAX).\n \n \n \n \n\n\n \n Elfes, A.; Steindl, R.; Talbot, F.; Kendoul, F.; Sikka, P.; Lowe, T.; Kottege, N.; Bjelonic, M.; Dungavell, R.; Bandyopadhyay, T.; and others\n\n\n \n\n\n\n In IEEE International Conference on Robotics and Automation (ICRA), pages 1050–1057, 2017. \n \n\n\n\n
\n\n\n\n \n \n \"The pdf\n  \n \n \n \"The video\n  \n \n \n \"The link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 1 download\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{elfes2017multilegged,\n  author    = {Elfes, Alberto and\n               Steindl, Ryan and\n               Talbot, Fletcher and\n               Kendoul, Farid and\n               Sikka, Pavan and\n               Lowe, Tom and\n               Kottege, Navinda and\n               Bjelonic, Marko and\n               Dungavell, Ross and\n               Bandyopadhyay, Tirthankar and\n               others},\n  title     = {The Multilegged Autonomous eXplorer (MAX)},\n  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},\n  year      = {2017},\n  pages     = {1050--1057},\n  doi       = {10.1109/ICRA.2017.7989126},\n  abstract  = {To address the goal of locomotion in very complex and difficult\n              terrains, the authors are developing a new class of Ultralight\n              Legged Robots. This paper presents the Multilegged Autonomous\n              eXplorer (MAX), an ultralight, six-legged robot for traversal and\n              exploration of challenging indoor and outdoor environments. The\n              design of MAX emphasizes a low mass/size ratio, high locomotion\n              efficiency, and high payload capability compared to total system\n              mass. MAX is 2.25 m tall at full height and has a mass of\n              approximately 60 kg, which makes it 5 to 20 times lighter than\n              robots of comparable size. MAX is a research vehicle to explore\n              modelling and control of Ultralight Legged Robots subject to\n              flexing, oscillations and swaying; algorithms for gait planning\n              and motion planning under uncertainty; and navigation planning for\n              traversal of complex 3D terrains. This paper presents the design\n              of MAX, provides an overview of the control system developed,\n              summarizes results from indoor and outdoor tests, discusses system\n              performance and outlines the challenges to be addressed next.},\n  keywords  = {legged locomotion, actuators},\n  url_pdf   = {files/2017_icra_elfes.pdf},\n  url_video = {https://youtu.be/nmELJzXl_Z8},\n  url_link  = {https://ieeexplore.ieee.org/document/7989126},\n}\n\n
\n
\n\n\n
\n To address the goal of locomotion in very complex and difficult terrains, the authors are developing a new class of Ultralight Legged Robots. This paper presents the Multilegged Autonomous eXplorer (MAX), an ultralight, six-legged robot for traversal and exploration of challenging indoor and outdoor environments. The design of MAX emphasizes a low mass/size ratio, high locomotion efficiency, and high payload capability compared to total system mass. MAX is 2.25 m tall at full height and has a mass of approximately 60 kg, which makes it 5 to 20 times lighter than robots of comparable size. MAX is a research vehicle to explore modelling and control of Ultralight Legged Robots subject to flexing, oscillations and swaying; algorithms for gait planning and motion planning under uncertainty; and navigation planning for traversal of complex 3D terrains. This paper presents the design of MAX, provides an overview of the control system developed, summarizes results from indoor and outdoor tests, discusses system performance and outlines the challenges to be addressed next.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n Autonomous Navigation of Hexapod Robots with Vision-based Controller Adaptation.\n \n \n \n \n\n\n \n Bjelonic, M.; Homberger, T.; Kottege, N.; Borges, P.; Chli, M.; and Beckerle, P.\n\n\n \n\n\n\n In IEEE International Conference on Robotics and Automation (ICRA), pages 5561–5568, 2017. \n \n\n\n\n
\n\n\n\n \n \n \"Autonomous pdf\n  \n \n \n \"Autonomous video\n  \n \n \n \"Autonomous link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 1 download\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{bjelonic2017autonomous,\n  author    = {Bjelonic, Marko and\n              Homberger, Timon and\n              Kottege, Navinda and\n              Borges, Paulo and\n              Chli, Margarita and\n              Beckerle, Philipp},\n  title     = {Autonomous Navigation of Hexapod Robots with Vision-based\n               Controller Adaptation},\n  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},\n  year      = {2017},\n  pages     = {5561--5568},\n  doi       = {10.1109/ICRA.2017.7989655},\n  abstract  = {This work introduces a novel hybrid control architecture for a\n               hexapod platform (Weaver), making it capable of autonomously\n               navigating in uneven terrain. The main contribution stems from\n               the use of vision-based exteroceptive terrain perception to adapt\n               the robot's locomotion parameters. Avoiding computationally\n               expensive path planning for the individual foot tips, the\n               adaptation controller enables the robot to reactively adapt to\n               the surface structure it is moving on. The virtual stiffness,\n               which mainly characterizes the behavior of the legs' impedance\n               controller, is adapted according to visually perceived terrain\n               properties. To further improve locomotion, the frequency and\n               height of the robot's stride are similarly adapted. Furthermore,\n               novel methods for terrain characterization and a keyframe-based\n               visual-inertial odometry algorithm are combined to generate a\n               spatial map of terrain characteristics. Localization via odometry\n               also allows for autonomous missions on variable terrain by\n               incorporating global navigation and terrain adaptation into one\n               control architecture. Autonomous runs on a testbed with variable\n               terrain types illustrate that adaptive stride and impedance\n               behavior decreases the cost of transport by 30 {\\%} compared to a\n               non-adaptive approach and simultaneously increases body stability\n               (up to 88 {\\%} on even terrain and by 54 {\\%} on uneven terrain).\n               Weaver is able to freely explore outdoor environments as it is\n               completely free of external tethers, as shown in the\n               experiments.},\n  keywords  = {legged locomotion, terrain perception, control adaptation},\n  url_pdf   = {files/2017_icra_bjelonic.pdf},\n  url_video = {https://youtu.be/a9l0bHHm-yY},\n  url_link  = {https://ieeexplore.ieee.org/document/7989655},\n}\n\n
\n
\n\n\n
\n This work introduces a novel hybrid control architecture for a hexapod platform (Weaver), making it capable of autonomously navigating in uneven terrain. The main contribution stems from the use of vision-based exteroceptive terrain perception to adapt the robot's locomotion parameters. Avoiding computationally expensive path planning for the individual foot tips, the adaptation controller enables the robot to reactively adapt to the surface structure it is moving on. The virtual stiffness, which mainly characterizes the behavior of the legs' impedance controller, is adapted according to visually perceived terrain properties. To further improve locomotion, the frequency and height of the robot's stride are similarly adapted. Furthermore, novel methods for terrain characterization and a keyframe-based visual-inertial odometry algorithm are combined to generate a spatial map of terrain characteristics. Localization via odometry also allows for autonomous missions on variable terrain by incorporating global navigation and terrain adaptation into one control architecture. Autonomous runs on a testbed with variable terrain types illustrate that adaptive stride and impedance behavior decreases the cost of transport by 30 % compared to a non-adaptive approach and simultaneously increases body stability (up to 88 % on even terrain and by 54 % on uneven terrain). Weaver is able to freely explore outdoor environments as it is completely free of external tethers, as shown in the experiments.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n ANYmal-toward Legged Robots for Harsh Environments.\n \n \n \n \n\n\n \n Hutter, M.; Gehring, C.; Lauber, A.; Gunther, F; Bellicoso, C. D; Tsounis, V.; Fankhauser, P.; Diethelm, R.; Bachmann, S.; Blösch, M.; Kolvenbach, H.; Bjelonic, M.; and others\n\n\n \n\n\n\n Advanced Robotics,918–931. 2017.\n \n\n\n\n
\n\n\n\n \n \n \"ANYmal-toward pdf\n  \n \n \n \"ANYmal-toward link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 6 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n \n \n \n \n \n \n\n\n\n
\n
@article{hutter2017anymal,\n  author    = {Hutter, Marco and\n               Gehring, Christian and\n               Lauber, Andreas and\n               Gunther, F and\n               Bellicoso, Carmine D and\n               Tsounis, Vassilios and\n               Fankhauser, P{\\'e}ter and\n               Diethelm, Remo and\n               Bachmann, Samuel and\n               Bl{\\"o}sch, Michael and\n               Kolvenbach, Hendrik and\n               Bjelonic, Marko and\n               others},\n  title     = {ANYmal-toward Legged Robots for Harsh Environments},\n  journal   = {Advanced Robotics},\n  year      = {2017},\n  pages     = {918--931},\n  doi       = {10.1080/01691864.2017.1378591},\n  abstract  = {This paper provides a system overview of ANYmal, a quadrupedal\n               robot developed for operation in harsh environments. The 30 kg,\n               0.5 m tall robotic dog was built in a modular way for simple\n               maintenance and user-friendly handling, while focusing on high\n               mobility and dynamic motion capability. The system is tightly\n               sealed to reach IP67 standard and protected to survive falls.\n               Rotating lidar sensors in the front and back are used for\n               localization and terrain mapping and compact force sensors in the\n               feet provide accurate measurements about the contact situations.\n               The variable payload, such as a modular pan-tilt head with a\n               variety of inspection sensors, can be exchanged depending on the\n               application. Thanks to novel, compliant joint modules with\n               integrated electronics, ANYmal is precisely torque controllable\n               and very robust against impulsive loads during running or\n               jumping. In a series of experiments we demonstrate that ANYmal\n               can execute various climbing maneuvers, walking gaits, as well as\n               a dynamic trot and jump. As a special feature, the joints can be\n               fully rotated to switch between X- and O-type kinematic\n               configurations. Detailed measurements unveil a low energy\n               consumption of 280 W during locomotion, which results in an\n               autonomy of more than 2 h.},\n  keywords  = {legged robot, quadruped robot, field robotics,\n               series elastic actuation, autonomous navigation},\n  url_pdf   = {files/2017_advanced_robotics_hutter.pdf},\n  url_link  = {https://doi.org/10.1080/01691864.2017.1378591},\n}\n\n
\n
\n\n\n
\n This paper provides a system overview of ANYmal, a quadrupedal robot developed for operation in harsh environments. The 30 kg, 0.5 m tall robotic dog was built in a modular way for simple maintenance and user-friendly handling, while focusing on high mobility and dynamic motion capability. The system is tightly sealed to reach IP67 standard and protected to survive falls. Rotating lidar sensors in the front and back are used for localization and terrain mapping and compact force sensors in the feet provide accurate measurements about the contact situations. The variable payload, such as a modular pan-tilt head with a variety of inspection sensors, can be exchanged depending on the application. Thanks to novel, compliant joint modules with integrated electronics, ANYmal is precisely torque controllable and very robust against impulsive loads during running or jumping. In a series of experiments we demonstrate that ANYmal can execute various climbing maneuvers, walking gaits, as well as a dynamic trot and jump. As a special feature, the joints can be fully rotated to switch between X- and O-type kinematic configurations. Detailed measurements unveil a low energy consumption of 280 W during locomotion, which results in an autonomy of more than 2 h.\n
\n\n\n
\n\n\n\n\n\n
\n
\n\n
\n
\n  \n 2016\n \n \n (3)\n \n \n
\n
\n \n \n
\n \n\n \n \n \n \n \n \n Terrain-dependant Control of Hexapod Robots using Vision.\n \n \n \n \n\n\n \n Homberger, T.; Bjelonic, M.; Kottege, N.; and Borges, P. V.\n\n\n \n\n\n\n In International Symposium on Experimental Robotics (ISER), pages 92–102, 2016. \n \n\n\n\n
\n\n\n\n \n \n \"Terrain-dependant pdf\n  \n \n \n \"Terrain-dependant video\n  \n \n \n \"Terrain-dependant link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 4 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{homberger2016terrain,\n  author    = {Homberger, Timon and\n               Bjelonic, Marko and\n               Kottege, Navinda and\n               Borges, Paulo VK},\n  title     = {Terrain-dependant Control of Hexapod Robots using Vision},\n  booktitle = {International Symposium on Experimental Robotics (ISER)},\n  year      = {2016},\n  pages     = {92--102},\n  doi       = {10.1007/978-3-319-50115-4_9},\n  abstract  = {The ability to traverse uneven terrain is one of the key\n               advantages of legged robots. However, their effectiveness relies\n               on selecting appropriate gait parameters, such as stride height\n               and leg stiffness. The optimal parameters highly depend on the\n               characteristics of the terrain. This work presents a novel stereo\n               vision based terrain sensing method for a hexapod robot with 30\n               degrees of freedom. The terrain in front of the robot is analyzed\n               by extracting a set of features which enable the system to\n               characterize a large number of terrain types. Gait parameters and\n               leg stiffness for impedance control are adapted based on this\n               terrain characterization. Experiments show that adaptive\n               impedance control leads to efficient locomotion in terms of\n               energy consumption, mission success and body stability.},\n  keywords  = {legged locomotion, terrain perception},\n  url_pdf   = {files/2016_iser_homberger.pdf},\n  url_video = {https://youtu.be/Rx4ewkxAItI},\n  url_link  = {https://link.springer.com/chapter/10.1007/978-3-319-50115-4_9},\n}\n\n
\n
\n\n\n
\n The ability to traverse uneven terrain is one of the key advantages of legged robots. However, their effectiveness relies on selecting appropriate gait parameters, such as stride height and leg stiffness. The optimal parameters highly depend on the characteristics of the terrain. This work presents a novel stereo vision based terrain sensing method for a hexapod robot with 30 degrees of freedom. The terrain in front of the robot is analyzed by extracting a set of features which enable the system to characterize a large number of terrain types. Gait parameters and leg stiffness for impedance control are adapted based on this terrain characterization. Experiments show that adaptive impedance control leads to efficient locomotion in terms of energy consumption, mission success and body stability.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n Proprioceptive control of an over-actuated hexapod robot in unstructured terrain.\n \n \n \n \n\n\n \n Bjelonic, M.; Kottege, N.; and Beckerle, P.\n\n\n \n\n\n\n In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 2042–2049, 2016. \n \n\n\n\n
\n\n\n\n \n \n \"Proprioceptive pdf\n  \n \n \n \"Proprioceptive video\n  \n \n \n \"Proprioceptive link\n  \n \n\n \n \n doi\n  \n \n\n \n link\n  \n \n\n bibtex\n \n\n \n  \n \n abstract \n \n\n \n  \n \n 7 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n \n \n \n \n\n\n\n
\n
@inproceedings{bjelonic2016proprioceptive,\n  author    = {Bjelonic, Marko and\n               Kottege, Navinda and\n               Beckerle, Philipp},\n  title     = {Proprioceptive control of an over-actuated hexapod robot in\n               unstructured terrain},\n  booktitle = {IEEE/RSJ International Conference on Intelligent Robots and\n               Systems (IROS)},\n  year      = {2016},\n  pages     = {2042--2049},\n  doi       = {10.1109/IROS.2016.7759321},\n  abstract  = {Legged robots such as hexapods have the potential to traverse\n              unstructured terrain. This paper introduces a novel hexapod robot\n              (Weaver) using a hierarchical controller, with the ability to\n              efficiently traverse uneven and inclined terrain. The robot has\n              five joints per leg and 30 degrees of freedom overall. The two\n              redundant joints improve the locomotion of the robot by\n              controlling the body pose and the leg orientation with respect to\n              the ground. The impedance controller in Cartesian space reacts to\n              unstructured terrain and thus achieves self-stabilizing behavior\n              without prior profiling of the terrain through exteroceptive\n              sensing. Instead of adding force sensors, the force at the foot\n              tip is calculated by processing the current signals of the\n              actuators. This work experimentally evaluates Weaver with the\n              proposed controller and demonstrates that it can effectively\n              traverse challenging terrains and high gradient slopes, reduce\n              angular movements of the body by more than 55{\\%} and reduce the cost\n              of transport (up to 50{\\%} on uneven terrain and by 85% on a slope\n              with 20 °). The controller also enables Weaver to walk up\n              inclines of up to 30 °, and remain statically stable on inclines\n              up to 50 °. Furthermore, we present a new metric for legged robot\n              stability performance along with a method for proprioceptive\n              terrain characterization.},\n  keywords  = {legged locomotion, impedance control},\n  url_pdf   = {files/2016_iros_bjelonic.pdf},\n  url_video = {https://youtu.be/OxOMmovPpdI},\n  url_link  = {https://ieeexplore.ieee.org/document/7759321},\n}\n\n
\n
\n\n\n
\n Legged robots such as hexapods have the potential to traverse unstructured terrain. This paper introduces a novel hexapod robot (Weaver) using a hierarchical controller, with the ability to efficiently traverse uneven and inclined terrain. The robot has five joints per leg and 30 degrees of freedom overall. The two redundant joints improve the locomotion of the robot by controlling the body pose and the leg orientation with respect to the ground. The impedance controller in Cartesian space reacts to unstructured terrain and thus achieves self-stabilizing behavior without prior profiling of the terrain through exteroceptive sensing. Instead of adding force sensors, the force at the foot tip is calculated by processing the current signals of the actuators. This work experimentally evaluates Weaver with the proposed controller and demonstrates that it can effectively traverse challenging terrains and high gradient slopes, reduce angular movements of the body by more than 55% and reduce the cost of transport (up to 50% on uneven terrain and by 85% on a slope with 20 °). The controller also enables Weaver to walk up inclines of up to 30 °, and remain statically stable on inclines up to 50 °. Furthermore, we present a new metric for legged robot stability performance along with a method for proprioceptive terrain characterization.\n
\n\n\n
\n\n\n
\n \n\n \n \n \n \n \n \n YOLO ROS: Real-Time Object Detection for ROS.\n \n \n \n \n\n\n \n Bjelonic, M.\n\n\n \n\n\n\n https://github.com/leggedrobotics/darknet_ros, 2016.\n \n\n\n\n
\n\n\n\n \n \n \"YOLO link\n  \n \n\n \n\n \n link\n  \n \n\n bibtex\n \n\n \n\n \n  \n \n 4 downloads\n \n \n\n \n \n \n \n \n \n \n\n  \n \n \n\n\n\n
\n
@misc{bjelonicYolo2018,\n  author = {Bjelonic, Marko},\n  title = {{YOLO ROS}: Real-Time Object Detection for {ROS}},\n  howpublished = {{https://github.com/leggedrobotics/darknet_ros}},\n  year = {2016},\n  url_link = {https://github.com/leggedrobotics/darknet_ros},\n}\n\n
\n
\n\n\n\n
\n\n\n\n\n\n
\n
\n\n\n\n\n
\n\n\n \n\n \n \n \n \n\n
\n"}; document.write(bibbase_data.data);