11th EAI International Conference on Mobile Multimedia Communications

Research Article

Understanding and Improving Deep Neural Network for Activity Recognition

  • @INPROCEEDINGS{10.4108/eai.21-6-2018.2276632,
        author={Li Xue and Si Xiandong and Nie Lanshun and Li Jiazhen and Ding Renjie and Zhan Dechen and Chu Dianhui},
        title={Understanding and Improving Deep Neural Network for Activity Recognition},
        proceedings={11th EAI International Conference on Mobile Multimedia Communications},
        publisher={EAI},
        proceedings_a={MOBIMEDIA},
        year={2018},
        month={9},
        keywords={human activity recognition, dnn, feature extraction, data feature visualization},
        doi={10.4108/eai.21-6-2018.2276632}
    }
    
Li Xue1, Si Xiandong1, Nie Lanshun1,*, Li Jiazhen1, Ding Renjie1, Zhan Dechen1, Chu Dianhui1
  • 1: Harbin Institute of Technology
*Contact email: nls@hit.edu.cn

Abstract

Activity recognition has become a popular research branch in the field of pervasive computing in recent years. Numerous experiments have shown that sensor-based activity data are characterized by variety, volume, and velocity. Deep learning, together with its various models, is one of the most effective ways of working on activity data. Nevertheless, there is no clear understanding of why it performs so well or how to make it more effective. To address this problem, we first applied a convolutional neural network to the Human Activity Recognition Using Smartphones Data Set. Second, we visualized the sensor-based activity data features extracted by the neural network. We then conducted an in-depth analysis of the visualized features, explored the relationship between activities and features, and examined how neural networks identify activities based on these features. After that, we extracted the significant features related to the activities and fed them into the DNN-based fusion model, which improved the classification rate to 96.1%. To our knowledge, this is the first work to visualize abstract sensor-based activity data features. Based on the results, the method proposed in the paper promises to realize accurate classification for sensor-based activity recognition.
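To make the feature-extraction step concrete, the following is a minimal numpy sketch of how a single 1D convolutional layer turns a windowed, multi-axis accelerometer signal into a fixed-length feature vector (convolution, ReLU, global average pooling). It is an illustrative stand-in under assumed shapes (3 sensor channels, 128-sample windows, 8 filters of width 9), not the authors' actual architecture or the UCI HAR preprocessing pipeline.

```python
import numpy as np

def conv1d(signal, kernels):
    """Valid-mode 1D convolution.

    signal:  (channels, time) array, e.g. a 3-axis accelerometer window.
    kernels: (num_kernels, channels, width) array of learned filters.
    Returns: (num_kernels, time - width + 1) feature maps.
    """
    c, t = signal.shape
    k, kc, w = kernels.shape
    assert kc == c, "kernel channel count must match the signal"
    out = np.zeros((k, t - w + 1))
    for i in range(k):
        for j in range(t - w + 1):
            # Dot product of the filter with one sliding window of the signal.
            out[i, j] = np.sum(signal[:, j:j + w] * kernels[i])
    return out

def extract_features(window, kernels):
    """Conv -> ReLU -> global average pooling: one scalar feature per filter."""
    fmaps = conv1d(window, kernels)
    fmaps = np.maximum(fmaps, 0.0)   # ReLU non-linearity
    return fmaps.mean(axis=1)        # pool each feature map to one value

# Synthetic example: 3-axis accelerometer, 128 samples, 8 random filters.
rng = np.random.default_rng(0)
window = rng.standard_normal((3, 128))
kernels = rng.standard_normal((8, 3, 9))
features = extract_features(window, kernels)
print(features.shape)  # (8,)
```

In the paper's pipeline, feature vectors of this kind are what get visualized and then passed on to the DNN-based fusion classifier; here the filters are random, whereas a trained network learns them from labeled activity windows.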