2nd International ICST Conference on Body Area Networks

Research Article

Gestures are strings: efficient online gesture spotting and classification using string matching

BibTeX
    @INPROCEEDINGS{10.4108/bodynets.2007.143,
        author={Thomas Stiefmeier and Daniel Roggen and Gerhard Tr\"{o}ster},
        title={Gestures are strings: efficient online gesture spotting and classification using string matching},
        proceedings={2nd International ICST Conference on Body Area Networks},
        publisher={ICST},
        proceedings_a={BODYNETS},
        year={2007},
        month={6},
        keywords={Gesture Recognition, String-Matching, Segmentation, Classification},
        doi={10.4108/bodynets.2007.143}
    }
    
Thomas Stiefmeier1,*, Daniel Roggen1,*, Gerhard Tröster1,*
  • 1: Wearable Computing Lab, ETH Zürich, Zürich, Switzerland
*Contact email: stiefmeier@ife.ee.ethz.ch, droggen@ife.ee.ethz.ch, troester@ife.ee.ethz.ch

Abstract

Context awareness is one mechanism that allows wearable computers to provide information proactively, unobtrusively, and with minimal user disturbance. Gestures and activities are an important aspect of the user's context. Detection and classification of gestures may be computationally expensive for low-power, miniaturized wearable platforms, such as those that may be integrated into garments. In this paper we introduce a novel method for online, real-time spotting and classification of gestures. Continuous user motion, acquired from a body-worn network of inertial sensors, is represented by strings of symbols encoding motion vectors. Fast string matching techniques, inspired by bioinformatics, spot trained gestures and classify them. Robustness to gesture variability is provided by approximate matching, efficiently implemented through dynamic programming. Our method is successfully demonstrated by spotting and classifying the occurrences of trained gestures within a continuous recording of a complex bicycle maintenance task. It executes in real time on a desktop computer using only a fraction of the available CPU time. Only simple integer arithmetic operations are required, which makes this method ideally suited for implementation on body-worn sensor nodes and real-time operation.
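The core idea of the abstract — quantizing motion vectors into symbols and spotting gestures by approximate string matching with integer-only dynamic programming — can be illustrated with a minimal sketch. This is not the authors' implementation: the direction alphabet, function names, and the sliding-window spotting strategy with a plain Levenshtein distance are illustrative assumptions.

```python
# Illustrative sketch only: motion vectors are assumed to be quantized into
# direction symbols such as 'U', 'D', 'L', 'R'; the paper's actual encoding
# and matching procedure may differ.

def edit_distance(template: str, window: str) -> int:
    """Levenshtein distance via dynamic programming, integer arithmetic only."""
    m, n = len(template), len(window)
    prev = list(range(n + 1))          # DP row for the empty template prefix
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if template[i - 1] == window[j - 1] else 1
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # substitution (or match)
        prev = curr
    return prev[n]

def spot_gesture(stream: str, template: str, max_dist: int):
    """Slide a window over the symbol stream; report approximate matches.

    Returns (start_index, distance) pairs where the windowed substring is
    within max_dist edits of the trained gesture template.
    """
    w = len(template)
    hits = []
    for start in range(len(stream) - w + 1):
        d = edit_distance(template, stream[start:start + w])
        if d <= max_dist:
            hits.append((start, d))
    return hits
```

Because the inner loop uses only integer additions, comparisons, and a minimum over three values, a routine of this shape maps naturally onto the resource-constrained sensor nodes the abstract targets; the tolerance `max_dist` is what provides robustness to gesture variability.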

Keywords
Gesture Recognition, String-Matching, Segmentation, Classification
Published
2007-06-10
Publisher
ICST
Modified
2011-09-20
http://dx.doi.org/10.4108/bodynets.2007.143