Legged robots have demonstrated remarkable advances in robustness and versatility over the past decades. The open questions in this field increasingly concern reasoning about the environment and autonomy, rather than locomotion alone. To answer some of these questions, visual information is essential. If a robot has information about the terrain, it can plan ahead and take preventive action against potential risks. However, building a model of the terrain is often computationally costly, mainly because of the dense nature of visual data. On top of the mapping problem, robots need feasible body trajectories and contact sequences to traverse the terrain safely, which may also require heavy computation. This computational cost has limited the use of visual feedback to contexts that guarantee (quasi-)static stability, or to planning schemes in which contact sequences and body trajectories are computed before motion execution starts. In this thesis, we propose a set of algorithms that narrows the gap between visual processing and dynamic locomotion. We use machine learning to speed up visual data processing and model predictive control to achieve robust locomotion. In particular, we devise a novel foothold adaptation strategy that uses a map of the terrain built from on-board vision sensors. This map is fed to a foothold classifier based on a convolutional neural network, which allows the robot to adjust the landing position of its feet in a fast and continuous fashion. We then use the convolutional-neural-network-based classifier to provide safe future contact sequences to a model predictive controller that optimizes target ground reaction forces in order to track a desired center-of-mass trajectory. We perform simulations and experiments on the hydraulic quadruped robots HyQ and HyQReal. For all experiments, the contact sequences, foothold adaptations, control inputs, and the map are computed and processed entirely on-board. The various tests show that the robot is able to ...
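
As a rough illustration of the pipeline described in the abstract (terrain map, convolutional foothold classifier, contact sequences for the predictive controller), the sketch below shows how a small convolutional network could score candidate foothold adjustments given a local heightmap patch cropped from an on-board terrain map. This is a minimal sketch in PyTorch under assumed dimensions; the patch size, candidate grid, layer sizes, and the name FootholdClassifier are illustrative assumptions, not the architecture used in the thesis.

    import torch
    import torch.nn as nn

    class FootholdClassifier(nn.Module):
        """Scores a grid of candidate foothold offsets around the nominal landing point."""
        def __init__(self, patch_size: int = 15, num_candidates: int = 9):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            flat = 32 * (patch_size // 2) ** 2
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(flat, 128), nn.ReLU(),
                nn.Linear(128, num_candidates),  # one score per candidate foothold cell
            )

        def forward(self, heightmap_patch: torch.Tensor) -> torch.Tensor:
            # heightmap_patch: (batch, 1, patch_size, patch_size) elevation values
            return self.head(self.features(heightmap_patch))

    # Usage: pick the highest-scoring candidate as the adjusted touchdown location.
    model = FootholdClassifier()
    patch = torch.rand(1, 1, 15, 15)       # hypothetical local heightmap patch
    best_cell = model(patch).argmax(dim=1)

At run time, the selected foothold (together with the step timing) would define the future contact sequence handed to the model predictive controller that optimizes the ground reaction forces.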


    Title: Bridging Vision and Dynamic Legged Locomotion
    Publication date: 2020-02-13
    Remarks: doi:10.15167/villarreal-magana-octavio-antonio_phd2020-02-13
    Type of media: Theses
    Type of material: Electronic Resource
    Language: English
    Classification: DDC 629




    Rotational legged locomotion

    Lyons, D.M. / Pamnany, K. | IEEE | 2005


    Leg Configuration for Spring-Mass Legged Locomotion

    Hurst, Jonathan / Jones, Mikhail Sobiegraj / Abate, Andrew Martin | European Patent Office | 2016



    On Terrain-Aware Locomotion for Legged Robots

    Fahmi, Ahmed Mohamed Shamel Bahaaeldeen | BASE | 2021
