REHAB 2014

Research Article

Improving voluntary pupil size changes for HCI

@INPROCEEDINGS{10.4108/icst.pervasivehealth.2014.255312,
    author={Jan Ehlers and Juliane Georgi and Anke Huckauf},
    title={Improving voluntary pupil size changes for HCI},
    proceedings={REHAB 2014},
    publisher={ICST},
    proceedings_a={REHAB},
    year={2014},
    month={7},
    keywords={affective human-computer interface; pupil size; biofeedback; emotions; voluntary control},
    doi={10.4108/icst.pervasivehealth.2014.255312}
}
    
Jan Ehlers¹,*, Juliane Georgi¹, Anke Huckauf¹
  • 1: University of Ulm
*Contact email: jan.ehlers@uni-ulm.de

Abstract

Previous research (Partala & Surakka, 2003) treats pupil size as a passive information channel that provides insight into the affective state of the viewer but defies voluntary control. However, since physiological arousal is influenced by various cognitive processes, we assume that pupil behavior can be brought under voluntary control through strategies of emotional regulation and cognitive processing. In the present paper we provide a methodological approach for examining the potentials and limits of active control of pupil dilation. Building on Ekman et al. (2008), we developed methods that apply graphical feedback to systematic changes in pupil diameter, using mechanisms of operant conditioning to gradually enable voluntary control over pupil size. We introduce calculation models that carefully disentangle task-relevant from task-irrelevant pupil dynamics. Based on mean values, single measurements and interpolation, we formulated computational rules to validate pupil data in real time and to determine criteria for artefact rejection. Extensive research based on the depicted methodology may shed further light on learning achievements related to emotional control and will reveal the potential of pupil-based input channels for the future development of affective Human-Computer Interfaces.
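
The real-time validation scheme named in the abstract (mean values, per-sample checks, interpolation over rejected samples) lends itself to a brief illustration. The Python sketch below is not the authors' implementation: the rolling-window size, the z-score rejection threshold and the linear interpolation over gaps are all assumptions chosen for demonstration.

    import numpy as np

    def validate_pupil_trace(samples, baseline_window=60, z_thresh=2.5):
        """Flag implausible pupil-diameter samples (e.g. blink artefacts)
        against a running baseline mean, then repair the flagged gaps by
        linear interpolation. Window size and threshold are illustrative."""
        samples = np.asarray(samples, dtype=float)
        valid = np.ones(len(samples), dtype=bool)

        for i in range(len(samples)):
            start = max(0, i - baseline_window)
            window = samples[start:i][valid[start:i]]
            if len(window) < 5:
                continue  # too little history to judge this sample
            mean, std = window.mean(), window.std()
            if std > 0 and abs(samples[i] - mean) > z_thresh * std:
                valid[i] = False  # reject as artefact

        # Linearly interpolate over the rejected samples
        idx = np.arange(len(samples))
        repaired = samples.copy()
        repaired[~valid] = np.interp(idx[~valid], idx[valid], samples[valid])
        return repaired, valid

    # Example: a synthetic trace with a blink-like dropout
    rng = np.random.default_rng(0)
    trace = 4.0 + 0.05 * rng.standard_normal(200)
    trace[100:103] = 1.0  # sudden implausible drop
    repaired, valid = validate_pupil_trace(trace)

Because rejected samples are excluded from subsequent baseline windows, a multi-sample dropout is compared against the clean history rather than against itself; any resemblance to the paper's actual computational rules is coincidental.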