5th International ICST Conference on Pervasive Computing Technologies for Healthcare

Research Article

Location of an Inhabitant for Domotic Assistance Through Fusion of Audio and Non-Visual Data

  • @INPROCEEDINGS{10.4108/icst.pervasivehealth.2011.246054,
        author={Pedro Chahuara and Fran\c{c}ois Portet and Michel Vacher},
        title={Location of an Inhabitant for Domotic Assistance Through Fusion of Audio and Non-Visual Data},
        proceedings={5th International ICST Conference on Pervasive Computing Technologies for Healthcare},
        publisher={IEEE},
        proceedings_a={PERVASIVEHEALTH},
        year={2012},
        month={4},
        keywords={dynamic network, data fusion, localisation, speech processing, smart home},
        doi={10.4108/icst.pervasivehealth.2011.246054}
    }
    
Pedro Chahuara1,*, François Portet1, Michel Vacher1
  • 1: Laboratoire d'Informatique de Grenoble
*Contact email: pedro.chahuara@imag.fr

Abstract

In this paper, a new method to locate a person in a pervasive environment using multimodal non-visual sensors and microphones is presented. The information extracted from the sensors is combined through a two-level dynamic network to obtain location hypotheses. The method was tested in two smart homes using data from experiments involving about 25 participants. The preliminary results show that an accuracy of 90% can be reached by combining several uncertain sources. The use of implicit localisation sources, such as speech recognition (mainly used in this project for voice commands), can further improve performance in many cases.
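
The abstract does not detail the two-level dynamic network itself, but the general idea of fusing several uncertain, non-visual sources into a location hypothesis can be sketched. The following minimal Python example is an illustration under assumed conventions, not the paper's actual model: the room names, sensor modalities, distributions, and confidence weights are all invented, and a weighted log-linear pool stands in for the network's fusion step.

```python
import math

# Hypothetical set of rooms in the smart home (not from the paper).
ROOMS = ["kitchen", "bedroom", "bathroom"]

def normalise(scores):
    """Rescale positive scores so they sum to 1."""
    total = sum(scores.values())
    return {room: s / total for room, s in scores.items()}

def fuse(evidences):
    """Level 2 (illustrative): combine per-sensor room distributions,
    each weighted by that sensor's confidence, via a log-linear pool."""
    fused = {room: 0.0 for room in ROOMS}
    for dist, confidence in evidences:
        for room in ROOMS:
            # Clamp probabilities away from zero before taking the log.
            fused[room] += confidence * math.log(max(dist[room], 1e-9))
    return normalise({room: math.exp(v) for room, v in fused.items()})

# Level 1 (illustrative): each modality emits its own distribution over
# rooms plus a confidence weight; values here are made up for the example.
speech = ({"kitchen": 0.7, "bedroom": 0.2, "bathroom": 0.1}, 0.6)
presence = ({"kitchen": 0.5, "bedroom": 0.4, "bathroom": 0.1}, 0.9)

posterior = fuse([speech, presence])
best = max(posterior, key=posterior.get)
print(best)  # the room best supported by both sources
```

Here the implicit source (speech recognition) reinforces the explicit presence sensor, mirroring the abstract's point that such sources can sharpen the location estimate.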