With the increasing demand for complex industrial applications such as assembly, industries foresee a need for robots with unique abilities; for example, robots that are more customizable and can adapt to dynamic changes in the environment. Interactive tasks are one such class of complex tasks, in which robots must adapt to changes while interacting with the environment. Such tasks are also called contact-based or compliant tasks. A robot performing compliant tasks must possess skills associated not only with kinematic movements but also with force profiles and the corresponding control schemes. Programming a robot to execute contact-based tasks can be time-consuming and usually requires expert knowledge. Learning from Demonstration (LfD) provides an intuitive way to deal with such complex tasks with minimal programming effort. Demonstrations might not always lead to efficient behavior, but the user's intent can be recognized from the motion and force data. The goal is to extract the real intent of the user, which is to exhibit contact-based skills that adapt to changes rather than simply replay the demonstration. Skill templates, parameterized by the demonstration, need to be developed to reproduce the skill efficiently. This thesis aims to provide a methodology for identifying such skill templates for commonly used industrial tasks: slide (e.g., surface polishing), touch (making slight contact to identify objects or constraints), press (applying force onto the environment, e.g., pressing a button), and contour (e.g., deburring manufactured parts). The thesis presents the methods employed to identify and extract the features required to represent a skill template capable of reproducing the desired skills. Additionally, a hybrid position-force control strategy is derived to reproduce the skills from the skill templates. The methodologies are evaluated, and their implications are inferred, by reproducing contact-based skills in a PyBullet simulation environment configured with an LWR-IV robot.
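
To make the reproduction step concrete, the sketch below shows how a selection-matrix-based hybrid position-force controller might be set up in PyBullet. It is an illustrative example only, not the controller derived in the thesis: the KUKA iiwa URDF bundled with pybullet_data stands in for the LWR-IV, the measured contact force is a zero placeholder rather than a sensor reading, and the gains, target pose, and target force are arbitrary.

import numpy as np
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                                   # headless physics server
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")
robot = p.loadURDF("kuka_iiwa/model.urdf", useFixedBase=True)
n_joints = p.getNumJoints(robot)                      # 7 revolute joints
ee_link = n_joints - 1                                # last link as end effector
joint_ids = list(range(n_joints))

# Selection matrix: 1 = position-controlled axis, 0 = force-controlled axis.
# Here x and y track a position target while z regulates contact force.
S = np.diag([1.0, 1.0, 0.0])

x_des = np.array([0.5, 0.0, 0.3])     # desired Cartesian position (m), arbitrary
f_des = np.array([0.0, 0.0, -10.0])   # desired contact force (N), arbitrary
Kp, Kf = 200.0, 0.05                  # position / force gains, hand-tuned

# Disable the default velocity motors so applied torques take effect.
p.setJointMotorControlArray(robot, joint_ids, p.VELOCITY_CONTROL,
                            forces=[0.0] * n_joints)

zeros = [0.0] * n_joints
for _ in range(1000):
    q = [s[0] for s in p.getJointStates(robot, joint_ids)]
    x = np.array(p.getLinkState(robot, ee_link,
                                computeForwardKinematics=True)[4])

    # Placeholder for a wrist force/torque sensor reading; assumed zero here.
    f_meas = np.zeros(3)

    # Hybrid law: position error on selected axes, force error on the rest.
    u_cart = S @ (Kp * (x_des - x)) + (np.eye(3) - S) @ (Kf * (f_des - f_meas))

    # Map the Cartesian command to joint torques via the Jacobian transpose,
    # plus a gravity-compensation term from inverse dynamics.
    J_lin, _ = p.calculateJacobian(robot, ee_link, [0.0, 0.0, 0.0], q, zeros, zeros)
    tau_g = np.array(p.calculateInverseDynamics(robot, q, zeros, zeros))
    tau = np.array(J_lin).T @ u_cart + tau_g

    p.setJointMotorControlArray(robot, joint_ids, p.TORQUE_CONTROL,
                                forces=tau.tolist())
    p.stepSimulation()

p.disconnect()

The selection matrix makes the split between position-controlled and force-controlled directions explicit, which is the essential idea behind the hybrid position-force control mentioned in the abstract; the actual gains, axis assignment, and force feedback would come from the demonstration-derived skill template.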


    Title:

    Recognition and Reproduction of Force-based Robot Skills via Learning from Demonstration


    Contributors:

    Publication date:

    15.04.2021


    Media type:

    Other


    Format:

    Electronic resource


    Language:

    English




    Learning force-based robot skills from haptic demonstration

    Rozo, Leonel / Jiménez Schlegl, Pablo / Torras, Carme | BASE | 2010

    Open access



    Robot learning of container-emptying skills through haptic demonstration

    Rozo, Leonel / Jiménez Schlegl, Pablo / Torras, Carme | BASE | 2009

    Open access