9th International Conference on Pervasive Computing Technologies for Healthcare

Research Article

Data fusion for automated pain recognition

  • @INPROCEEDINGS{10.4108/icst.pervasivehealth.2015.259166,
        author={Steffen Walter and Sascha Gruss and Markus K{\"a}chele and Friedhelm Schwenker and Philipp Werner and Ayoub Al-Hamadi and Adriano Andrade and Gustavo Moreira and Harald Traue},
        title={Data fusion for automated pain recognition},
        proceedings={9th International Conference on Pervasive Computing Technologies for Healthcare},
        publisher={IEEE},
        proceedings_a={PERVASIVEHEALTH},
        year={2015},
        month={8},
        keywords={automated pain recognition, biopotentials, video, multimodal data, machine learning, data fusion},
        doi={10.4108/icst.pervasivehealth.2015.259166}
    }
    
Steffen Walter1,*, Sascha Gruss1, Markus Kächele1, Friedhelm Schwenker1, Philipp Werner2, Ayoub Al-Hamadi2, Adriano Andrade3, Gustavo Moreira3, Harald Traue1
  • 1: University of Ulm
  • 2: University of Magdeburg
  • 3: University of Uberlândia
*Contact email: steffen.walter@uni-ulm.de

Abstract

Conventional pain scales do not allow objective and robust measurement, and they are restricted to patients with “normal” communication abilities. If valid measurement of pain is not possible, pain treatment may lead to cardiac stress in at-risk patients, underperfusion of the operating field, over- or under-use of analgesics, and other problems in acute or chronic pain conditions. Pervasive computing technologies based on biopotential and behavioral parameters may offer a solution for robust pain recognition in clinical contexts and in everyday life. In this work, multimodal fusion of video and biopotential signals is used to recognize pain in a person-independent scenario. For this purpose, participants were recruited and subjected to painful heat stimuli under controlled conditions. Subsequently, a multitude of features was extracted from the biopotential and behavioral signals and selected across the available modalities. Biopotential and video features were combined with both early and late fusion; the classification of baseline vs. pain tolerance threshold reached an accuracy of 80% with late fusion. The data support the concept of automated and objective recognition of experimental pain. A clinical project is planned in which pain will be detected postoperatively in humans.
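As an illustration of the two fusion strategies named in the abstract (not the paper's actual pipeline), a minimal sketch with synthetic data: early fusion concatenates the per-modality feature vectors before classification, while late fusion combines the scores of per-modality classifiers. The feature dimensions and the stand-in scoring function are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-sample feature matrices for two modalities
# (stand-ins for the paper's biopotential and video descriptors).
bio = rng.normal(size=(4, 8))    # 4 samples, 8 biopotential features
vid = rng.normal(size=(4, 16))   # 4 samples, 16 video features

# Early fusion: concatenate feature vectors, then train ONE classifier
# on the joint representation.
early = np.concatenate([bio, vid], axis=1)   # shape (4, 24)

def score(x):
    # Stand-in for a trained per-modality classifier's probability
    # output (here: a sigmoid over the mean feature value).
    return 1.0 / (1.0 + np.exp(-x.mean(axis=1)))

# Late fusion: train a separate classifier per modality and average
# their decision scores at prediction time.
late_scores = (score(bio) + score(vid)) / 2.0
predictions = (late_scores >= 0.5).astype(int)
```

Late fusion has the practical advantage that a modality which is missing or unreliable at run time (e.g. an occluded face in the video) can simply be dropped from the score average.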