The recent generation of compliant robots enables kinesthetic teaching of novel skills by human demonstration. This allows tasks to be transferred to the robot in a more intuitive way than through conventional programming interfaces. Physical interactions can be programmed by manually guiding the robot, which learns the behavior from the recorded motion and force data. To let the robot react to changes in the environment, force sensing can be used to identify constraints and act accordingly. Since autonomous exploration strategies over the whole workspace are time-consuming, we propose a way to learn such exploration schemes from human demonstrations in an object-targeted manner. The presented teaching strategy and learning framework make it possible to generate adaptive robot behaviors that rely on the robot's sense of touch in a systematically changing environment. A generated behavior consists of a hierarchical representation of skills: haptic exploration skills, which touch the environment with the end effector, and relative manipulation skills, which are parameterized according to previous exploration events. The effectiveness of the approach is demonstrated in a manipulation task in which the adaptive task structure generalizes to unseen object locations, and the robot manipulates objects autonomously without relying on visual feedback.
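To make the hierarchical skill representation concrete, the following is a minimal sketch in Python (not taken from the paper) of how haptic exploration skills and relative manipulation skills could be composed: exploration skills record contact poses, and manipulation skills express their targets relative to those recorded contacts. The robot interface (guarded_move, move_to), the skill fields, and all names are assumptions made for illustration only.

```python
# Hypothetical sketch: a hierarchical task in which exploration skills record
# contact events and relative manipulation skills are parameterized by them.
from dataclasses import dataclass, field


@dataclass
class ContactEvent:
    """Pose (x, y, z) at which the end effector sensed contact."""
    position: tuple[float, float, float]


@dataclass
class ExplorationSkill:
    """Move along a direction until a force threshold signals contact."""
    name: str
    direction: tuple[float, float, float]
    force_threshold: float  # N

    def execute(self, robot) -> ContactEvent:
        # robot.guarded_move is an assumed interface: it moves until the
        # measured force exceeds the threshold and returns the contact pose.
        position = robot.guarded_move(self.direction, self.force_threshold)
        return ContactEvent(position)


@dataclass
class ManipulationSkill:
    """Motion expressed relative to a previously sensed contact pose."""
    name: str
    anchor: str                          # exploration skill it depends on
    offset: tuple[float, float, float]   # displacement relative to the contact

    def execute(self, robot, contacts: dict) -> None:
        cx, cy, cz = contacts[self.anchor].position
        ox, oy, oz = self.offset
        # robot.move_to is likewise assumed to drive the end effector to a pose.
        robot.move_to((cx + ox, cy + oy, cz + oz))


@dataclass
class HierarchicalTask:
    """Runs exploration skills first, then manipulations parameterized by them."""
    skills: list = field(default_factory=list)

    def run(self, robot) -> None:
        contacts = {}
        for skill in self.skills:
            if isinstance(skill, ExplorationSkill):
                contacts[skill.name] = skill.execute(robot)
            else:
                skill.execute(robot, contacts)
```

In this reading, an exploration skill would first locate an object by touch (for example, the top face of a box), and the subsequent manipulation skills would place or grasp relative to that sensed pose, which is what allows the learned behavior to generalize to unseen object locations without visual feedback.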
Learning haptic exploration schemes for adaptive task execution
2019; Montreal, Canada
2019
Conference paper
Electronic Resource
English
Similar items:
Springer Verlag, 2008
Wiley, 2017
Uniting Haptic Exploration and Display. Springer Verlag, 2003
Influence of haptic guidance in learning a novel visuomotor task. British Library Conference Proceedings, 2009
SECOND TASK EXECUTION ASSISTANCE DEVICE AND SECOND TASK EXECUTION ASSISTANCE PROGRAM. European Patent Office, 2021