Research Article
From Speech to Emotional Interaction: EmotiRob Project
@INPROCEEDINGS{10.1007/978-3-642-19385-9_8,
  author={Marc Tallec and S\'{e}bastien Saint-Aim\'{e} and C\'{e}line Jost and Jeanne Villaneau and Jean-Yves Antoine and Sabine Letellier-Zarshenas and Brigitte Le-P\'{e}v\'{e}dic and Dominique Duhaut},
  title={From Speech to Emotional Interaction: EmotiRob Project},
  booktitle={Human-Robot Personal Relationships. Third International Conference, HRPR 2010, Leiden, The Netherlands, June 23-24, 2010, Revised Selected Papers},
  proceedings_a={HRPR},
  year={2012},
  month={5},
  keywords={emotion interaction companion robot dynamic behavior},
  doi={10.1007/978-3-642-19385-9_8}
}
Marc Tallec
Sébastien Saint-Aimé
Céline Jost
Jeanne Villaneau
Jean-Yves Antoine
Sabine Letellier-Zarshenas
Brigitte Le-Pévédic
Dominique Duhaut
Year: 2012
From Speech to Emotional Interaction: EmotiRob Project
HRPR
Springer
DOI: 10.1007/978-3-642-19385-9_8
Abstract
This article presents research work done in the domain of nonverbal emotional interaction for the EmotiRob project. It is a component of the MAPH project, the objective of which is to give comfort to vulnerable children and/or those undergoing long-term hospitalisation through the help of an emotional robot companion. It is important to note that we are not trying to reproduce human emotion and behavior, but to make a robot emotionally expressive. This paper presents the different hypotheses we have used, from understanding to emotional reaction. We begin the article with a presentation of the MAPH and EmotiRob projects. Then, we briefly describe the speech understanding system, the iGrace computational model of emotions, and the integration of dynamic behavior. We conclude with a description of the architecture of Emi, as well as improvements to be made to its next generation.