Mobile Computing, Applications, and Services. 4th International Conference, MobiCASE 2012, Seattle, WA, USA, October 11-12, 2012. Revised Selected Papers

Research Article

Automatic Annotation of Daily Activity from Smartphone-Based Multisensory Streams

Cite
@INPROCEEDINGS{10.1007/978-3-642-36632-1_19,
    author={Jihun Hamm and Benjamin Stone and Mikhail Belkin and Simon Dennis},
    title={Automatic Annotation of Daily Activity from Smartphone-Based Multisensory Streams},
    proceedings={Mobile Computing, Applications, and Services. 4th International Conference, MobiCASE 2012, Seattle, WA, USA, October 11-12, 2012. Revised Selected Papers},
    proceedings_a={MOBICASE},
    publisher={Springer},
    year={2013},
    month={2},
    keywords={mobile computing, lifelogging, activity recognition, automatic annotation},
    doi={10.1007/978-3-642-36632-1_19}
}
    
Jihun Hamm, Benjamin Stone, Mikhail Belkin, Simon Dennis (The Ohio State University)

Abstract

We present a system for automatic annotation of daily experience from multisensory streams on smartphones. Using smartphones as a platform facilitates the collection of naturalistic daily activity, which is difficult to capture with multiple on-body sensors or arrays of sensors affixed to indoor locations. However, recognizing daily activities in unconstrained settings is more challenging than in controlled environments: 1) the multiple heterogeneous sensors in smartphones are noisier, asynchronous, vary in sampling rate, and can have missing data; 2) unconstrained daily activities are continuous, can occur concurrently, and have fuzzy onset and offset boundaries; 3) ground-truth labels obtained from the user’s self-report can be erroneous and are accurate only on a coarse time scale. To handle these problems, we present a flexible framework for incorporating heterogeneous sensory modalities combined with state-of-the-art classifiers for sequence labeling. We evaluate the system on real-life data containing 11,721 minutes of multisensory recordings, and demonstrate the accuracy and efficiency of the proposed system for practical lifelogging applications.
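To make the preprocessing challenge concrete, the sketch below shows one simple way to align asynchronous, irregularly sampled smartphone streams into fixed-length window features before handing them to a classifier. This is a minimal illustration under stated assumptions, not the authors' framework: the sensor names, the 60-second window, the NaN-based handling of missing windows, and the use of scikit-learn's LogisticRegression as a stand-in for the paper's sequence-labeling classifiers are all illustrative choices.

# Minimal sketch (assumptions, not the authors' code): align asynchronous,
# irregularly sampled smartphone streams into fixed-length window features
# and fit a stand-in classifier on per-window labels.
import numpy as np
from sklearn.linear_model import LogisticRegression

def window_features(timestamps, values, t_start, t_end, win_sec=60.0):
    """Summarize one stream as per-window mean/std; windows with no samples stay NaN."""
    edges = np.arange(t_start, t_end + win_sec, win_sec)
    feats = np.full((len(edges) - 1, 2), np.nan)
    idx = np.searchsorted(edges, timestamps, side="right") - 1
    for w in range(len(edges) - 1):
        v = values[idx == w]
        if v.size:
            feats[w] = [v.mean(), v.std()]
    return feats

# Toy data: two streams with different sampling rates over a 10-minute span.
rng = np.random.default_rng(0)
t_acc = np.sort(rng.uniform(0, 600, 3000))   # ~5 Hz accelerometer magnitude (assumed)
t_aud = np.sort(rng.uniform(0, 600, 600))    # ~1 Hz audio energy (assumed)
x_acc, x_aud = rng.normal(size=3000), rng.normal(size=600)

X = np.hstack([window_features(t_acc, x_acc, 0, 600),
               window_features(t_aud, x_aud, 0, 600)])
X = np.nan_to_num(X)                         # crude imputation for missing windows
y = (np.arange(len(X)) < 5).astype(int)      # placeholder self-report labels per window

clf = LogisticRegression(max_iter=1000).fit(X, y)  # stand-in for a sequence labeler
print(clf.predict(X))

In the setting the abstract describes, such per-window features would feed a sequence-labeling classifier so that temporal context can smooth the fuzzy onset and offset boundaries of activities.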

Keywords
mobile computing, lifelogging, activity recognition, automatic annotation
Published
2013-02-06
http://dx.doi.org/10.1007/978-3-642-36632-1_19