4th International Conference on Wireless Mobile Communication and Healthcare - "Transforming healthcare through innovations in mobile and wireless technologies"

Research Article

Towards an Intelligent Robotic Walker for Assisted Living using Multimodal Sensorial Data

  • @INPROCEEDINGS{10.4108/icst.mobihealth.2014.257358,
        author={Georgia Chalvatzaki and Georgios Pavlakos and Kevis Maninis and Xanthi Papageorgiou and Vassilis Pitsikalis and Costas Tzafestas and Petros Maragos},
        title={Towards an Intelligent Robotic Walker for Assisted Living using Multimodal Sensorial Data},
        proceedings={4th International Conference on Wireless Mobile Communication and Healthcare - "Transforming healthcare through innovations in mobile and wireless technologies"},
        publisher={IEEE},
        proceedings_a={MOBIHEALTH},
        year={2014},
        month={12},
        keywords={assisted living for the elderly, visual action recognition, laser sensor, gait analysis and tracking},
        doi={10.4108/icst.mobihealth.2014.257358}
    }
    
Georgia Chalvatzaki1,*, Georgios Pavlakos1, Kevis Maninis1, Xanthi Papageorgiou1, Vassilis Pitsikalis1, Costas Tzafestas1, Petros Maragos1
  • 1: School of Electrical and Computer Engineering, National Technical Univ. of Athens, Greece
*Contact email: gchal@central.ntua.gr

Abstract

We aim to develop an intelligent robotic platform that provides cognitive assistance and natural support in indoor environments to elderly people and to individuals with mild to moderate walking impairment. Towards this end, we process data from audiovisual sensors and laser range scanners, acquired in experiments with patients in real-life scenarios. We present the main concepts of an automatic system for user intent and action recognition that will integrate multiple modalities. We demonstrate promising preliminary results, first on action recognition based on the visual modality, i.e. color and depth cues, and second on the detection of gait cycle patterns from the laser range data. Action recognition relies on local interest points, 3D Gabor filters and dominant energy analysis, whose features feed a support vector machine. The recognized actions can then trigger the gait cycle detection, which identifies walking patterns in the laser range data, modeled by hidden Markov models. In this way, we will acquire the patient's overall state, and the robot will autonomously reason about how to provide support.
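To make the visual pipeline concrete, the sketch below shows the general shape of the action-recognition stage: a bank of spatio-temporal (3D) Gabor kernels filters a short video clip, the dominant (largest) filter energies are pooled into a descriptor, and a support vector machine classifies the action. This is a minimal illustration under assumed filter parameters, pooling scheme, and class labels; it is not the authors' implementation, and the training data here are synthetic placeholders.

    import numpy as np
    from scipy.signal import fftconvolve
    from sklearn.svm import SVC

    def gabor_kernel_3d(theta, size=7, sigma=2.0, freq=0.25):
        """Small spatio-temporal Gabor kernel oriented at angle theta in the
        image plane and drifting over time (illustrative parameterization)."""
        r = np.arange(size) - size // 2
        t, y, x = np.meshgrid(r, r, r, indexing="ij")
        envelope = np.exp(-(t ** 2 + y ** 2 + x ** 2) / (2 * sigma ** 2))
        carrier = np.cos(2 * np.pi * freq
                         * (x * np.cos(theta) + y * np.sin(theta) + t))
        return envelope * carrier

    def dominant_energy_descriptor(clip, n_orient=4, top_k=100):
        """Filter a (T, H, W) grayscale clip with a bank of 3D Gabor kernels
        and pool the dominant (largest) filter energies per orientation."""
        feats = []
        for k in range(n_orient):
            kern = gabor_kernel_3d(theta=k * np.pi / n_orient)
            energy = fftconvolve(clip, kern, mode="same") ** 2
            feats.append(np.sort(energy.ravel())[-top_k:].mean())
        return np.array(feats)

    # Hypothetical training data: labeled clips around detected interest
    # points, e.g. classes 0 = sit-to-stand, 1 = walking, 2 = stand-to-sit.
    rng = np.random.default_rng(0)
    clips = rng.random((30, 8, 32, 32))        # 30 clips of 8 frames each
    labels = rng.integers(0, 3, size=30)
    X = np.stack([dominant_energy_descriptor(c) for c in clips])

    clf = SVC(kernel="rbf").fit(X, labels)
    print(clf.predict(X[:5]))

In practice the descriptors would be computed around detected spatio-temporal interest points in the color and depth streams rather than over whole clips.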
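The gait-analysis stage can be illustrated in the same spirit: per-frame leg-motion features extracted from the laser range scans are modeled by a hidden Markov model whose hidden states correspond to gait phases, and completed gait cycles are read off the decoded state sequence. The feature choice, state count, and synthetic signal below are assumptions for illustration, and hmmlearn stands in for whichever HMM toolkit was actually used.

    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(1)

    # Hypothetical per-frame features from leg detection in laser scans:
    # inter-leg distance and left/right leg forward velocities.
    T = 300
    phase = np.sin(np.linspace(0, 12 * np.pi, T))   # synthetic gait rhythm
    X = np.column_stack([
        0.3 + 0.1 * phase + 0.01 * rng.standard_normal(T),
        0.5 * np.clip(phase, 0, None) + 0.02 * rng.standard_normal(T),
        0.5 * np.clip(-phase, 0, None) + 0.02 * rng.standard_normal(T),
    ])

    # One hidden state per assumed gait phase (e.g. left swing, double
    # support, right swing, double support).
    hmm = GaussianHMM(n_components=4, covariance_type="diag", n_iter=50,
                      random_state=0)
    hmm.fit(X)
    states = hmm.predict(X)

    # A candidate gait cycle completes each time the decoded state sequence
    # re-enters its starting phase.
    cycle_starts = np.flatnonzero(
        np.diff((states == states[0]).astype(int)) == 1)
    print(f"detected {len(cycle_starts)} candidate gait cycles")

As described in the abstract, a detected "walking" action from the visual classifier would gate this detector, so gait cycles are only tracked while the patient is actually walking with the platform.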