ArtsIT, Interactivity and Game Creation. Creative Heritage. New Perspectives from Media Arts and Artificial Intelligence. 10th EAI International Conference, ArtsIT 2021, Virtual Event, December 2-3, 2021, Proceedings

Research Article

Reconstructing Facial Expressions of HMD Users for Avatars in VR

  • @INPROCEEDINGS{10.1007/978-3-030-95531-1_5,
        author={Christian Felix Purps and Simon Janzer and Matthias W{\"o}lfel},
        title={Reconstructing Facial Expressions of HMD Users for Avatars in VR},
        proceedings={ArtsIT, Interactivity and Game Creation. Creative Heritage. New Perspectives from Media Arts and Artificial Intelligence. 10th EAI International Conference, ArtsIT 2021, Virtual Event, December 2-3, 2021, Proceedings},
        proceedings_a={ARTSIT},
        year={2022},
        month={2},
        keywords={Facial expressions, Avatars, HMD, Virtual reality},
        doi={10.1007/978-3-030-95531-1_5}
    }
    
  • Christian Felix Purps, Simon Janzer, Matthias Wölfel: Reconstructing Facial Expressions of HMD Users for Avatars in VR. ARTSIT, Springer, 2022. DOI: 10.1007/978-3-030-95531-1_5
Christian Felix Purps*, Simon Janzer, Matthias Wölfel
    *Contact email: christian_felix.purps@h-ka.de

    Abstract

    Real-time recognition of human facial expressions and their transfer and use in software is now established and can be found in a variety of computer applications. Most solutions, however, do not focus on facial recognition to be used in combination with wearing a head-mounted display. In these cases, the face is partially obscured, and approaches that assume a fully visible face are not applicable. To overcome this limitation, we present a systematic approach that covers the entire pipeline from facial expression recognition using RGB images to real-time facial animation of avatars based on blendshapes for virtual reality applications. To achieve this, we (a) developed a three-stage machine learning pipeline to recognize mouth areas, extract anthropological landmarks, and detect facial muscle activations and (b) created a realistic avatar using photogrammetry, 3D modeling, and applied blendshapes that closely follow the facial action coding system (FACS). This provides an interface to our facial expression recognition system, but also allows other blendshape-oriented approaches to work with our avatar. Our facial expression recognition system performed well on common metrics and under real-time testing. Jitter and the detection or approximation of upper face facial features, however, are still an issue that needs to be addressed.
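The abstract describes a pipeline that detects facial muscle activations (FACS action units) and drives avatar blendshapes from them. As an illustrative sketch only (not the authors' code), the final mapping step might look like the following, where the AU-to-blendshape table and all names are hypothetical placeholders for whatever a concrete avatar rig defines:

```python
# Illustrative sketch (assumed names, not the paper's implementation):
# convert detected FACS action-unit activations into avatar blendshape
# weights, clamped to the [0, 1] range most rigs expect.

# Hypothetical correspondence between action units and blendshape names;
# a real FACS-oriented rig defines its own mapping.
AU_TO_BLENDSHAPE = {
    "AU12": "mouthSmile",   # lip corner puller
    "AU18": "mouthPucker",  # lip pucker
    "AU26": "jawOpen",      # jaw drop
}

def activations_to_blendshapes(au_activations):
    """Map AU activation scores to blendshape weights, clamping each
    value into [0, 1] and dropping units the rig does not model."""
    weights = {}
    for au, score in au_activations.items():
        shape = AU_TO_BLENDSHAPE.get(au)
        if shape is not None:
            weights[shape] = min(max(float(score), 0.0), 1.0)
    return weights

# Example: a detected smile with an over-saturated jaw-drop score.
print(activations_to_blendshapes({"AU12": 0.8, "AU26": 1.3, "AU99": 0.5}))
```

Because the mapping is per-frame, temporal smoothing (the abstract notes jitter as an open issue) would sit between detection and this step.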

    Keywords
    Facial expressions, Avatars, HMD, Virtual reality
    Published
    2022-02-10
    Appears in
    SpringerLink
    http://dx.doi.org/10.1007/978-3-030-95531-1_5