Research Article
Analysis of Motion Patterns for Recognition of Human Activities
@INPROCEEDINGS{10.4108/eai.14-10-2015.2261719,
  author={Majid Janidarmian and Atena Roshan Fekr and Katarzyna Radecka and Zeljko Zilic and Louis Ross},
  title={Analysis of Motion Patterns for Recognition of Human Activities},
  proceedings={5th EAI International Conference on Wireless Mobile Communication and Healthcare - "Transforming healthcare through innovations in mobile and wireless technologies"},
  publisher={ACM},
  proceedings_a={MOBIHEALTH},
  year={2015},
  month={12},
  keywords={wearable sensors, activity recognition, machine learning, classification, performance optimization},
  doi={10.4108/eai.14-10-2015.2261719}
}
Majid Janidarmian
Atena Roshan Fekr
Katarzyna Radecka
Zeljko Zilic
Louis Ross
Year: 2015
Analysis of Motion Patterns for Recognition of Human Activities
MOBIHEALTH
ICST
DOI: 10.4108/eai.14-10-2015.2261719
Abstract
Automatic recognition of human activity is one of the most important and challenging open areas of research in context-aware and physical training applications. Activity profiling systems normally use wearable sensors to record motion patterns over extended periods of time. The performance of these systems depends on the activity set, training data quality, extracted features, and learning algorithms. In this paper, we describe the development of efficient activity recognition techniques using wearable motion sensors. An extensive evaluation of three well-known classifiers with lightweight time series features is provided to distinguish among thirty-three different fitness activities. The effects of segmentation, sensor placement, and sampling rate on classification performance are discussed. Experimental results obtained with 17 subjects show an improvement in classification accuracy over previous work. In addition, a statistical analysis of the investigated models quantifies the relative effects of reducing the number of accelerometer axes on classification performance.