Mobile and Ubiquitous Systems: Computing, Networking, and Services. 10th International Conference, MOBIQUITOUS 2013, Tokyo, Japan, December 2-4, 2013, Revised Selected Papers

Research Article

AcTrak - Unobtrusive Activity Detection and Step Counting Using Smartphones

@INPROCEEDINGS{10.1007/978-3-319-11569-6_35,
    author={Vivek Chandel and Anirban Choudhury and Avik Ghose and Chirabrata Bhaumik},
    title={AcTrak - Unobtrusive Activity Detection and Step Counting Using Smartphones},
    proceedings={Mobile and Ubiquitous Systems: Computing, Networking, and Services. 10th International Conference, MOBIQUITOUS 2013, Tokyo, Japan, December 2-4, 2013, Revised Selected Papers},
    proceedings_a={MOBIQUITOUS},
    year={2014},
    month={12},
    keywords={Step counting, Activity detection, Unobtrusive sensing, Mobile sensing, Mobile computing, mHealth},
    doi={10.1007/978-3-319-11569-6_35}
}
    
Vivek Chandel1,*, Anirban Choudhury1,*, Avik Ghose1,*, Chirabrata Bhaumik1,*
  • 1: Tata Consultancy Services
*Contact email: vivek.chandel@tcs.com, anirban.duttachoudhury@tcs.com, avik.ghose@tcs.com, c.bhaumik@tcs.com

Abstract

In this paper we introduce “AcTrak”, a system that provides training-free, orientation- and placement-independent step counting and activity recognition on commercial mobile phones, using only the 3D accelerometer. The proposed solution uses step frequency as a feature to classify activities. To filter out noise generated by normal handling of the phone while the user is otherwise physically stationary, AcTrak is armed with a novel step-validation algorithm termed Individual Peak Analysis (IPA), which uses peak height and inter-peak interval as features. AcTrak provides a real-time step count, classifies the current activity, and tags each activity with its associated steps, yielding a detailed activity breakdown. Using our model, a step-count accuracy of 98.9% is achieved, and an accuracy of 95% is achieved when classifying stationary, walking, and running/jogging; when brisk walking is added to the activity set, accuracy still remains at a reasonable level. Step-based activity is classified as walking, brisk walking, or running (which includes jogging), so after a workout session the subject can easily self-assess his/her accomplishment. Since AcTrak is largely orientation and position agnostic and requires no prior training, our approach is truly ubiquitous.
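
The validation idea described above, i.e. accepting only acceleration peaks whose height and inter-peak interval are plausible for human steps and then mapping the resulting step frequency to an activity label, can be illustrated with the following minimal sketch. This is not the authors' implementation; the sampling rate, thresholds, and frequency cut-offs below are illustrative assumptions only.

```python
# Hypothetical sketch of peak-based step validation and step-frequency
# activity classification. All constants are assumptions, not values
# taken from the AcTrak paper.
import numpy as np

FS = 50.0               # assumed accelerometer sampling rate (Hz)
MIN_PEAK_HEIGHT = 1.5   # assumed minimum peak height above baseline (m/s^2)
MIN_INTERVAL = 0.25     # assumed minimum time between valid steps (s)
MAX_INTERVAL = 2.0      # assumed maximum gap to a neighbouring step (s)

def count_valid_steps(acc_xyz):
    """Count validated steps in a window of 3-axis accelerometer samples (N x 3)."""
    mag = np.linalg.norm(acc_xyz, axis=1)   # orientation-independent magnitude
    mag = mag - np.mean(mag)                # crude gravity/baseline removal
    step_times = []
    last_step_t = -np.inf
    for i in range(1, len(mag) - 1):
        t = i / FS
        is_peak = mag[i] > mag[i - 1] and mag[i] >= mag[i + 1]
        if not is_peak or mag[i] < MIN_PEAK_HEIGHT:
            continue                        # reject low peaks (e.g. handling noise)
        if t - last_step_t < MIN_INTERVAL:
            continue                        # reject peaks arriving implausibly fast
        step_times.append(t)
        last_step_t = t
    # discard isolated peaks with no neighbouring step within MAX_INTERVAL
    valid = [t for j, t in enumerate(step_times)
             if (j > 0 and t - step_times[j - 1] <= MAX_INTERVAL)
             or (j + 1 < len(step_times) and step_times[j + 1] - t <= MAX_INTERVAL)]
    return len(valid)

def classify_activity(step_count, window_seconds):
    """Map step frequency (steps/s) to an activity label; cut-offs are assumed."""
    freq = step_count / window_seconds
    if freq < 0.5:
        return "stationary"
    if freq < 2.0:
        return "walking"
    if freq < 2.5:
        return "brisk-walking"
    return "running/jogging"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 1 / FS)
    # synthetic ~1.8 Hz walking-like signal around 1 g, plus noise
    acc = np.stack([9.81 + 3.0 * np.sin(2 * np.pi * 1.8 * t),
                    0.3 * rng.standard_normal(t.size),
                    0.3 * rng.standard_normal(t.size)], axis=1)
    steps = count_valid_steps(acc)
    print(steps, classify_activity(steps, 10.0))
```

Because the feature is the magnitude of the acceleration vector rather than any single axis, such a scheme is naturally insensitive to phone orientation, which is consistent with the orientation- and placement-independence claimed in the abstract.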