amsys 16(12): e2

Research Article

Vision-based Event Detection of the Sit-to-Stand Transition

@ARTICLE{10.4108/eai.14-10-2015.2261631,
        author={Victor Shia and Ruzena Bajcsy},
        title={Vision-based Event Detection of the Sit-to-Stand Transition},
        journal={EAI Endorsed Transactions on Ambient Systems},
        volume={3},
        number={12},
        publisher={ACM},
        journal_a={AMSYS},
        year={2015},
        month={12},
        keywords={sit-to-stand, postural transition},
        doi={10.4108/eai.14-10-2015.2261631}
    }
    
Victor Shia¹,*, Ruzena Bajcsy¹
  • 1: University of California, Berkeley
*Contact email: vshia@eecs.berkeley.edu

Abstract

Sit-to-stand (STS) motions are among the most important activities of daily living, as they serve as a precursor to mobility and walking. However, no standard method exists for segmenting STS motions. This is due in part to the variety of sensors and modalities used to study the STS motion, such as force plates, vision, and accelerometers, each of which provides a different type of data, and in part to the variability of the STS motion in video data. In this work, we present a method that uses motion capture to detect events in the STS motion by estimating ground reaction forces, thereby eliminating the variability in joint angles from visual data. We illustrate the accuracy of this method on 10 subjects, with an average difference of 16.5 ms between event times obtained via motion capture and via force plate. This method serves as a proof of concept for detecting events in the STS motion via video that are comparable to those obtained via force plate.
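
The abstract does not spell out the estimation procedure, but the core idea, inferring ground reaction forces from body kinematics and using them to locate STS events, can be sketched as follows. The snippet below is a minimal illustration, not the authors' implementation: it assumes a vertical centre-of-mass trajectory sampled at a fixed rate, estimates vertical GRF via Newton's second law (F_z = m(g + a_z)), and flags a seat-off-like event when the estimated force exceeds body weight. The function names, sampling rate, and 5% threshold are illustrative assumptions.

```python
import numpy as np

def estimate_vertical_grf(com_z, mass, fs, g=9.81):
    """Estimate vertical ground reaction force from a vertical
    centre-of-mass trajectory via F_z = m * (g + a_z).
    com_z is in metres, fs is the sampling rate in Hz.
    (Illustrative sketch; the paper's full method is richer.)"""
    dt = 1.0 / fs
    a_z = np.gradient(np.gradient(com_z, dt), dt)  # numerical 2nd derivative
    return mass * (g + a_z)

def detect_seat_off(grf_z, mass, fs, g=9.81, ratio=1.05):
    """Return the time (s) when estimated GRF first exceeds body
    weight by `ratio` -- a simple proxy for the seat-off event."""
    above = np.nonzero(grf_z > ratio * mass * g)[0]
    return above[0] / fs if above.size else None

# Toy usage: a noiseless STS-like COM rise of ~0.4 m sampled at 100 Hz.
fs, mass = 100.0, 70.0
t = np.arange(0.0, 3.0, 1.0 / fs)
com_z = 0.5 + 0.4 / (1.0 + np.exp(-6.0 * (t - 1.5)))
grf = estimate_vertical_grf(com_z, mass, fs)
print("estimated seat-off time (s):", detect_seat_off(grf, mass, fs))
```

In practice the double differentiation amplifies marker noise, so a real pipeline would low-pass filter the trajectory first and use a segmental body model rather than a single COM point; the sketch only conveys the force-based event-detection idea.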