EAI Endorsed Transactions on Future Intelligent Educational Environments 16(6): e6

Research Article

Observing, Coaching and Reflecting: Metalogue - A Multi-modal Tutoring System with Metacognitive Abilities

  • @ARTICLE{10.4108/eai.27-6-2016.151525,
        author={Joy Van Helvert and Volha Petukhova and Christopher Stevens and Harmen de Weerd and Dirk B\"{o}rner and Peter Van Rosmalen and Jan Alexandersson and Niels Taatgen},
        title={Observing, Coaching and Reflecting: Metalogue - A Multi-modal Tutoring System with Metacognitive Abilities},
        journal={EAI Endorsed Transactions on Future Intelligent Educational Environments},
        volume={16},
        number={6},
        publisher={EAI},
        journal_a={FIEE},
        year={2016},
        month={6},
        keywords={natural conversational interaction, mixed-reality, multi-modal dialogue systems, immersive, debate skills, learning analytics, reflection, negotiation},
        doi={10.4108/eai.27-6-2016.151525}
    }
    
Joy Van Helvert1,*, Volha Petukhova2, Christopher Stevens3, Harmen de Weerd3, Dirk Börner4, Peter Van Rosmalen4, Jan Alexandersson5, Niels Taatgen3
  • 1: University of Essex
  • 2: Spoken Language Systems, Saarland University, Germany
  • 3: Institute of Artificial Intelligence, University of Groningen
  • 4: Open University of the Netherlands
  • 5: German Research Centre for Artificial Intelligence GmbH
*Contact email: jvanhe@essex.ac.uk

Abstract

The Metalogue project aims to develop a multi-modal, multi-party dialogue system with metacognitive abilities that will advance our understanding of natural conversational human-machine interaction, and of interfaces that incorporate multimodality into virtual and augmented reality environments. In this paper we describe the envisaged technical system, the learning contexts it is being developed to support, and the pedagogical framework within which user interactions are proposed to take place. This includes details of the system-generated learner feedback provided both in-performance and post-performance. We then explain what has been achieved so far in terms of the integrated system pilots, and finally we discuss three key challenges the Metalogue researchers are currently working to overcome.