Extracting human affective state from touch interaction takes advantage of the natural communication of emotion through physical contact, enabling applications such as robot therapy, intelligent tutoring systems, and emotionally reactive smart technology. This work focused on the emotionally aware robot-pet context and produced a custom, low-cost piezoresistive fabric touch sensor at 1-inch taxel resolution that accommodates the flex and stretch of the robot in motion. Using established machine learning techniques, we built classification models of social and emotional touch data. We present an iteration of the human-robot interaction loop for an emotionally aware robot through two distinct studies and demonstrate gesture recognition at roughly 85% accuracy (chance 14%). The first study collected social touch gesture data (N=26) to assess the data quality of our custom sensor under noisy conditions: mounted on a robot skeleton simulating regular breathing, obscured under fur casings, and placed over deformable surfaces. Our second study targeted affect with the same sensor, wherein participants (N=30) relived emotionally intense memories while interacting with a smaller, stationary robot, generating touch data imbued with one of four affective states: Stressed, Excited, Relaxed, or Depressed. A feature-space analysis triangulating touch, gaze, and physiological data highlighted the dimensions of touch that suggest affective state. To close the interactive loop, we had participants (N=20) evaluate researcher-designed breathing behaviours on 1-DOF robots for emotional content. Results demonstrate that these behaviours can display human-recognizable emotion as perceptual affective qualities across the valence-arousal emotion model. Finally, we discuss the potential impact of a system capable of emotional “conversation” with human users, referencing specific applications.
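The gesture-recognition step summarized above (roughly 85% accuracy over a set of gestures with ~14% chance level, implying about seven classes) can be illustrated with a minimal sketch: extract simple statistical features from windows of taxel pressure frames and train an off-the-shelf classifier. The gesture labels, window features, and choice of a random forest below are illustrative assumptions, not the exact pipeline reported in the thesis.

```python
# Hypothetical sketch: classifying social touch gestures from fabric-sensor
# pressure frames using simple window statistics and a random forest.
# Gesture names, feature set, and classifier settings are assumptions for
# illustration, not the thesis's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Seven example gesture classes -> ~14% chance accuracy.
GESTURES = ["stroke", "pat", "scratch", "rub", "tickle", "poke", "hold"]

def window_features(frames: np.ndarray) -> np.ndarray:
    """frames: (T, H, W) pressure frames for one touch window, e.g. 10x10 taxels."""
    per_frame_total = frames.sum(axis=(1, 2))                    # overall pressure over time
    per_frame_area = (frames > frames.mean()).sum(axis=(1, 2))   # rough contact area per frame
    return np.array([
        frames.max(), frames.mean(), frames.var(),               # intensity statistics
        per_frame_total.mean(), per_frame_total.std(),           # temporal pressure variation
        per_frame_area.mean(), per_frame_area.std(),             # spatial-spread variation
    ])

def evaluate(windows: list[np.ndarray], labels: list[str]) -> float:
    """Cross-validated accuracy of a random forest on per-window features."""
    X = np.stack([window_features(w) for w in windows])
    y = np.array(labels)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return cross_val_score(clf, X, y, cv=5).mean()
```

With labelled windows of shape (T, 10, 10) collected per gesture, a cross-validated accuracy well above the ~14% chance level would indicate that even such coarse features capture gesture-relevant structure in the pressure signal.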


    Title:

    Towards an emotionally communicative robot : feature analysis for multimodal support of affective touch recognition


    Contributors:

    Cang, Xi Laura (Author)


    Publication date:

    01.01.2016


    Media type:

    Thesis


    Format:

    Electronic resource


    Language:

    English


    Classification:

    DDC: 629




    Similar items:

    Recognition of Affective Communicative Intent in Robot-Directed Speech

    Breazeal, C. / Aryananda, L. | British Library Online Contents | 2002



    Affective Touch in Human–Robot Interaction : Conveying Emotion to the Nao Robot

    Andreasson, Rebecca / Alenljung, Beatrice / Billing, Erik et al. | BASE | 2018

    Open access


    Comparing soft robotic affective touch to human and brush affective touch

    Zheng, CY / Wang, K-J / Wairagkar, M et al. | BASE | 2021

    Open access