Research Article
Using Eye Contact and Contextual Speech Recognition for Hands-Free Surgical Charting
@INPROCEEDINGS{10.4108/ICST.PERVASIVEHEALTH2008.2548,
  author={G. Julian Lepinski and Roel Vertegaal},
  title={Using Eye Contact and Contextual Speech Recognition for Hands-Free Surgical Charting},
  proceedings={2nd International ICST Conference on Pervasive Computing Technologies for Healthcare},
  publisher={IEEE},
  proceedings_a={PERVASIVEHEALTH},
  year={2008},
  month={7},
  keywords={Attentive UI, charting, gaze, voice recognition},
  doi={10.4108/ICST.PERVASIVEHEALTH2008.2548}
}
- G. Julian Lepinski
- Roel Vertegaal
Year: 2008
Venue: PERVASIVEHEALTH (ICST)
DOI: 10.4108/ICST.PERVASIVEHEALTH2008.2548
Abstract
In this paper we discuss ongoing research into applications of multimodal Attentive User Interfaces for hands-free charting during surgical procedures. Although speech recognition has matured enough to be used for some software and hardware control, speech recognition systems still have trouble distinguishing “command speech” from “ambient speech.” Our work builds on prior research coupling eye contact sensing with speech recognition to gauge user intent: users enable a voice-activated surgical time-charting system by fixing their gaze on a small camera before speaking command words.
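The interaction described in the abstract amounts to a simple gating loop: speech is only interpreted as a command while the user holds eye contact with the camera. The sketch below (Python) illustrates that idea under stated assumptions; `EyeContactSensor` and `CommandRecognizer` are hypothetical stubs standing in for the camera and recognizer, whose actual APIs the paper does not specify.

```python
import time
from typing import Optional

# Minimal sketch of gaze-gated command recognition. The two sensor
# classes are hypothetical stand-ins, not the authors' implementation.

class EyeContactSensor:
    """Stand-in for a camera-based eye-contact detector."""

    def has_eye_contact(self) -> bool:
        # A real detector would report whether the user is looking
        # directly at the camera. Stubbed to False here.
        return False


class CommandRecognizer:
    """Stand-in for a small-vocabulary command-word recognizer."""

    def listen(self, timeout_s: float) -> Optional[str]:
        # A real recognizer would return a recognized command word,
        # or None if nothing was heard before the timeout.
        return None


def record_chart_event(command: str) -> None:
    # Timestamp the command for the surgical time chart.
    print(f"{time.strftime('%H:%M:%S')}  {command}")


def charting_loop(sensor: EyeContactSensor, recognizer: CommandRecognizer) -> None:
    """Only pass audio to the recognizer while the user holds eye
    contact with the camera; this gating filters ambient operating-room
    speech from intentional commands."""
    while True:
        if sensor.has_eye_contact():
            command = recognizer.listen(timeout_s=2.0)
            if command is not None:
                record_chart_event(command)
        time.sleep(0.05)  # poll gaze at ~20 Hz
```

The key design point is that the recognizer is never consulted without eye contact, so ambient conversation cannot trigger chart events regardless of vocabulary overlap.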