1st International ICST Conference on Immersive Telecommunications & Workshops

Research Article

Two gesture recognition systems for immersive math education of the Deaf

  • @INPROCEEDINGS{10.4108/ICST.IMMERSCOM2007.2081,
        author={Nicoletta Adamo-Villani and Justin Heisler and Laura Arns},
        title={Two gesture recognition systems for immersive math education of the Deaf},
        proceedings={1st International ICST Conference on Immersive Telecommunications \& Workshops},
        proceedings_a={IMMERSCOM},
        year={2010},
        month={5},
        keywords={Sign language recognition, HCI, Virtual Environments, Deaf education},
        doi={10.4108/ICST.IMMERSCOM2007.2081}
    }
    
Nicoletta Adamo-Villani1,*, Justin Heisler2,*, Laura Arns3,*
  • 1: Purdue University Department of Computer Graphics Technology, West Lafayette, IN, USA. 001.765.496.1297
  • 2: Vicarious Visions, 185 Van Rensselaer Blvd #13-1a, Menands, New York 12204. 001.765.418.1609
  • 3: Purdue University Envision Center for Data Perceptualization, West Lafayette, IN, USA. 001.765.496.7888
*Contact email: nadamovi@purdue.edu, heisler.justin@gmail.com, larns@purdue.edu

Abstract

The general goal of our research is the creation of a natural and intuitive interface for navigation, interaction, and input/recognition of American Sign Language (ASL) math signs in immersive Virtual Environments (VE) for the Deaf. The specific objective of this work is the development of two new gesture recognition systems for SMILE™, an immersive learning game that employs a fantasy 3D virtual environment to engage deaf children in math-based educational tasks. Presently, SMILE includes standard VR interaction devices such as a 6DOF wand, a pair of pinch gloves, and a dance platform. In this paper we propose a significant improvement to the application in the form of two new gesture control mechanisms: system (1) is based entirely on hand gestures and makes use of a pair of 18-sensor data gloves; system (2) is based on hand and body gestures and makes use of a pair of data gloves and a motion tracking system. Both interfaces support first-person motion control, object selection and manipulation, and real-time input/recognition of ASL numbers zero to twenty. Although the systems described in the paper rely on high-end, expensive hardware, they can be considered a first step toward the realization of an effective immersive sign language interface.
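To illustrate the kind of recognition problem an 18-sensor data glove poses, the sketch below classifies one frame of glove readings by nearest-neighbor matching against stored handshape templates. This is a minimal illustration only, not the authors' actual method: the sensor values, template vectors, and function names are all hypothetical, and a deployed system would need per-user calibration and temporal smoothing.

```python
import math

NUM_SENSORS = 18  # one flexion value per bend sensor on the glove


def euclidean(a, b):
    """Euclidean distance between two equal-length sensor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def classify(frame, templates):
    """Return the label of the template closest to the sensor frame."""
    return min(templates, key=lambda sign: euclidean(frame, templates[sign]))


# Illustrative templates: flexion in [0, 1], where 1.0 = fully curled.
# These are made-up values, not calibrated ASL handshape data.
templates = {
    "one":  [0.0] * 2 + [1.0] * 16,  # index extended, other fingers curled
    "five": [0.0] * NUM_SENSORS,     # open hand
    "zero": [0.6] * NUM_SENSORS,     # rounded 'O' handshape
}

frame = [0.05] * 2 + [0.9] * 16      # a noisy reading near the "one" shape
print(classify(frame, templates))    # → one
```

A template-matching scheme like this is attractive for a fixed vocabulary (e.g. the numbers zero to twenty) because adding a sign only requires recording a new template, though distinguishing signs that differ mainly in motion rather than handshape would require tracking data as well.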