REHAB 2014

Research Article

Eye Gaze Patterns after Stroke: Correlates of a VR Action Execution and Observation Task

  • @INPROCEEDINGS{10.4108/icst.pervasivehealth.2014.255288,
        author={J\'{u}lio Alves and Athanasios Vourvopoulos and Alexandre Bernardino and Sergi Berm\'{u}dez i Badia},
        title={Eye Gaze Patterns after Stroke: Correlates of a VR Action Execution and Observation Task},
        proceedings={REHAB 2014},
        publisher={ICST},
        proceedings_a={REHAB},
        year={2014},
        month={7},
        keywords={action execution, action observation, eye gaze, stroke, virtual reality},
        doi={10.4108/icst.pervasivehealth.2014.255288}
    }
    
Júlio Alves1,*, Athanasios Vourvopoulos1, Alexandre Bernardino2, Sergi Bermúdez i Badia1
  • 1: Madeira-ITI, Universidade da Madeira
  • 2: Instituto de Sistemas e Robótica, Instituto Superior Técnico
*Contact email: juliomalves@gmail.com

Abstract

The concept of a partially shared neural circuitry between action observation and action execution in healthy participants has been demonstrated through a number of studies. However, little research has applied eye movement metrics to this question in rehabilitation contexts. In this study we approach action observation and action execution by combining a virtual environment with eye tracking technology. Participants were stroke survivors, who performed a simple reach-and-grab and place-and-release task with both their paretic and non-paretic arms. Results showed congruency in gaze metrics between action execution and action observation, in both the distribution and the duration of gaze events. Furthermore, in action observation, longer smooth pursuit segments were detected when observing the representation of the paretic arm, providing evidence that the affected circuitry may be activated during observation of the simulated action. These results can inform novel rehabilitation methods using virtual reality technology.