Human-Robot Personal Relationships. Third International Conference, HRPR 2010, Leiden, The Netherlands, June 23-24, 2010, Revised Selected Papers

Research Article

From Speech to Emotional Interaction: EmotiRob Project

  • @INPROCEEDINGS{10.1007/978-3-642-19385-9_8,
        author={Marc Le Tallec and S{\'e}bastien Saint-Aim{\'e} and C{\'e}line Jost and Jeanne Villaneau and Jean-Yves Antoine and Sabine Letellier-Zarshenas and Brigitte Le-P{\'e}v{\'e}dic and Dominique Duhaut},
        title={From Speech to Emotional Interaction: EmotiRob Project},
        proceedings={Human-Robot Personal Relationships. Third International Conference, HRPR 2010, Leiden, The Netherlands, June 23-24, 2010, Revised Selected Papers},
        proceedings_a={HRPR},
        publisher={Springer},
        year={2012},
        month={5},
        keywords={emotion interaction, companion robot, dynamic behavior},
        doi={10.1007/978-3-642-19385-9_8}
    }
Marc Le Tallec1,*, Sébastien Saint-Aimé2,*, Céline Jost2,*, Jeanne Villaneau2,*, Jean-Yves Antoine1,*, Sabine Letellier-Zarshenas2,*, Brigitte Le-Pévédic2,*, Dominique Duhaut2,*
  • 1: University François-Rabelais
  • 2: University of Bretagne Sud
*Contact email: marc.letallec@univ-tours.fr, sebastien.saint-aime@univ-ubs.fr, celine.jost@univ-ubs.fr, jeanne.villaneau@univ-ubs.fr, jean-yves.antoine@univ-tours.fr, sabine.letellier@univ-ubs.fr, brigitte.le-pevedic@univ-ubs.fr, dominique.duhaut@univ-ubs.fr

Abstract

This article presents research work done in the domain of nonverbal emotional interaction for the EmotiRob project. It is a component of the MAPH project, whose objective is to comfort vulnerable children and/or those undergoing long-term hospitalisation with the help of an emotional robot companion. It is important to note that we are not trying to reproduce human emotion and behavior, but rather to make a robot emotionally expressive. This paper presents the different hypotheses we have used, from understanding to emotional reaction. We begin the article with a presentation of the MAPH and EmotiRob projects. We then briefly describe the speech understanding system, the iGrace computational model of emotions, and the integration of dynamic behavior. We conclude with a description of the architecture of Emi, as well as improvements to be made to its next generation.