CT 17(10): e2

Research Article

Towards Interactive Agents that Infer Emotions from Voice and Context Information

  • @ARTICLE{10.4108/eai.4-9-2017.153054,
        author={D. Formolo and T. Bosse},
        title={Towards Interactive Agents that Infer Emotions from Voice and Context Information},
        journal={EAI Endorsed Transactions on Creative Technologies},
        volume={4},
        number={10},
        publisher={EAI},
        journal_a={CT},
        year={2017},
        month={1},
        keywords={speech analysis, voice, emotions, context, virtual agents, social skills training},
        doi={10.4108/eai.4-9-2017.153054}
    }
    
D. Formolo1,*, T. Bosse1
  • 1: Vrije Universiteit Amsterdam - Department of Computer Sciences, De Boelelaan 1081a, 1081HV Amsterdam, The Netherlands
*Contact email: d.formolo@vu.nl

Abstract

Conversational agents are increasingly being used for social skills training. One of their most important capabilities is understanding the user's emotions, which enables natural interaction with humans. However, to infer a conversation partner's emotional state, humans typically make use of contextual information as well. This work proposes an architecture that extracts emotions from the human voice and combines them with contextual information about the particular situation. With that information, a computer system can achieve a more human-like type of interaction. The architecture yields satisfactory results: the strategy of combining two algorithms, one covering 'common cases' and another covering 'borderline cases', significantly reduces the percentage of classification mistakes, and the addition of context information further increases the accuracy of the emotion inferences.
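The two-stage strategy described above can be sketched as a simple dispatcher: a primary classifier handles 'common cases', and when its confidence falls below a threshold (a 'borderline case') a secondary classifier decides instead, with both sets of scores reweighted by a context prior. This is a minimal illustrative sketch; the function names, emotion set, and threshold are assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of combining a 'common case' classifier with a
# 'borderline case' classifier and a context prior. All names and the
# threshold value are illustrative assumptions.

EMOTIONS = ["neutral", "happy", "angry", "sad"]

def combine(primary_scores, borderline_scores, context_prior,
            borderline_threshold=0.5):
    """Pick an emotion label from voice-based scores plus a context prior."""
    # Reweight each candidate emotion by how likely it is in this context.
    weighted = {e: primary_scores[e] * context_prior.get(e, 1.0)
                for e in EMOTIONS}
    best = max(weighted, key=weighted.get)
    # Confident enough? Treat it as a 'common case'.
    if weighted[best] >= borderline_threshold:
        return best
    # Otherwise defer to the 'borderline' classifier, also context-weighted.
    weighted_b = {e: borderline_scores[e] * context_prior.get(e, 1.0)
                  for e in EMOTIONS}
    return max(weighted_b, key=weighted_b.get)
```

For example, a clear primary score for "neutral" is returned directly, while an ambiguous voice signal (all scores near 0.3) falls through to the borderline classifier, whose verdict the context prior can further tilt.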