2nd International ICST Conference on Pervasive Computing Technologies for Healthcare

Research Article

Using Eye Contact and Contextual Speech Recognition for Hands-Free Surgical Charting

@INPROCEEDINGS{10.4108/ICST.PERVASIVEHEALTH2008.2548,
    author={G. Julian Lepinski and Roel Vertegaal},
    title={Using Eye Contact and Contextual Speech Recognition for Hands-Free Surgical Charting},
    proceedings={2nd International ICST Conference on Pervasive Computing Technologies for Healthcare},
    publisher={IEEE},
    proceedings_a={PERVASIVEHEALTH},
    year={2008},
    month={7},
    keywords={Attentive UI, charting, gaze, voice recognition},
    doi={10.4108/ICST.PERVASIVEHEALTH2008.2548}
}

G. Julian Lepinski¹*, Roel Vertegaal¹*
  • ¹ Human Media Lab, Queen’s University, Kingston, Canada.
*Contact email: lepinski@cs.queensu.ca, roel@cs.queensu.ca

Abstract

In this paper we discuss ongoing research into applications for multimodal Attentive User Interfaces in hands-free charting during surgical procedures. Although speech recognition has matured enough to be used for some software and hardware control, speech recognition solutions still have trouble filtering “command speech” from “ambient speech.” Our work builds on previous research that couples eye contact sensing with speech recognition to gauge intent: users enable a voice-activated system for surgical time charting by fixing their gaze on a small camera before speaking command words.
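
As a rough sketch of the gaze-gated command mechanism described above (not the authors' implementation), the Python loop below accepts a spoken charting command only while an eye contact sensor reports that the user is looking at the camera. The names eye_contact_detected, next_utterance, and the command vocabulary are hypothetical stand-ins for the sensor and speech recognizer, which the abstract does not specify.

import time

# Illustrative charting commands; the paper's actual command
# vocabulary is not reproduced here.
COMMANDS = {"start", "stop", "mark"}

def eye_contact_detected():
    """Hypothetical stand-in for an eye contact sensor: returns True
    while the user is looking directly at the camera."""
    return False  # replace with real sensor input

def next_utterance():
    """Hypothetical stand-in for a speech recognizer: returns the most
    recently recognized word, or None if nothing was heard."""
    return None  # replace with real recognizer output

def charting_loop():
    # Speech counts as a command only during eye contact; everything
    # else is treated as ambient speech and ignored.
    while True:
        word = next_utterance()
        if word in COMMANDS and eye_contact_detected():
            print("charting event: %s at %.0f" % (word, time.time()))
        time.sleep(0.05)  # poll interval

if __name__ == "__main__":
    charting_loop()

Checking eye contact at the moment a word is recognized is what filters command speech from ambient speech: the same word spoken without eye contact is simply discarded.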