CT 17(11): e1

Research Article

A Comparison of Gamified HCI Studies with Lab and Crowd Participants

    @ARTICLE{10.4108/eai.5-9-2017.153058,
        author={Hendrik Knoche and Allan Christensen and Simon Andr\'{e} Pedersen},
        title={A Comparison of Gamified HCI Studies with Lab and Crowd Participants},
        journal={EAI Endorsed Transactions on Creative Technologies},
        volume={4},
        number={11},
        publisher={EAI},
        journal_a={CT},
        year={2017},
        month={4},
        keywords={HCI, Crowdsourcing, gamification, device human resolution, DHR, touch interaction, touch screen, dragging, acquiescence bias},
        doi={10.4108/eai.5-9-2017.153058}
    }
    
Hendrik Knoche 1,*, Allan Christensen 1, Simon André Pedersen 1
  • 1: Department of Media Technology, Aalborg University, Rendsburggade 14, 9000 Aalborg, DK
*Contact email: hk@create.aau.dk

Abstract

We compared a game-based experiment carried out in the lab with crowdsourced setups (informed and uninformed participants) on the device's human resolution (DHR), the minimum target size for dragging the finger onto a target on a touch screen. Lab participants produced fewer errors than the crowd. For lab participants, the smallest selectable target width for dragging onto non-occluded targets with visual target position feedback was between 2 mm and 4 mm on mobile touch devices. Error-based rather than time-based performance data allowed for drawing this conclusion, as participants from all groups did not take enough care and time to acquire the targets. The bi-modal performance distributions of crowd participants required filtering the data.
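The abstract notes that the bi-modal performance distributions of crowd participants required filtering the data. As a minimal, illustrative sketch (not the authors' procedure), one way to separate such a distribution is to fit a two-component Gaussian mixture to per-participant error rates and retain only participants in the lower-error mode; the function name and the error_rates input below are assumptions introduced for illustration.

    # Illustrative sketch only: separate a bi-modal distribution of
    # per-participant error rates and keep the low-error mode.
    # Assumes error_rates is a 1-D sequence with one value per participant.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def filter_low_error_participants(error_rates):
        X = np.asarray(error_rates, dtype=float).reshape(-1, 1)
        # Fit a two-component Gaussian mixture to capture the two modes.
        gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
        labels = gmm.predict(X)
        # Keep the component whose mean error rate is lower.
        low_error_component = int(np.argmin(gmm.means_.ravel()))
        return np.flatnonzero(labels == low_error_component)

    # Example usage: indices of crowd participants retained for analysis.
    # kept = filter_low_error_participants(per_participant_error_rates)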