The 8th EAI International Conference on Mobile Computing, Applications and Services

Research Article

vTrack: Envisioning a Virtual Trackpad Interface through mm-level Sound Source Localization for Mobile Interaction

  • @INPROCEEDINGS{10.4108/eai.30-11-2016.2267095,
        author={Seungeun Chung and Injong Rhee},
        title={vTrack: Envisioning a Virtual Trackpad Interface through mm-level Sound Source Localization for Mobile Interaction},
        proceedings={The 8th EAI International Conference on Mobile Computing, Applications and Services},
        publisher={ACM},
        proceedings_a={MOBICASE},
        year={2016},
        month={12},
        keywords={acoustic signal tracking, mobile interaction, mobile sensing},
        doi={10.4108/eai.30-11-2016.2267095}
    }
    
  • Seungeun Chung
    Injong Rhee
    Year: 2016
    vTrack: Envisioning a Virtual Trackpad Interface through mm-level Sound Source Localization for Mobile Interaction
    MOBICASE
    ACM
    DOI: 10.4108/eai.30-11-2016.2267095
Seungeun Chung1,*, Injong Rhee1
  • 1: North Carolina State University
*Contact email: schung5@ncsu.edu

Abstract

Touchscreens on mobile devices allow intuitive interaction through haptic communication, but their limited workspace constrains the user experience. In this paper, we envision a virtual trackpad interface that tracks user input on any surface near the mobile device. We adopt acoustic signals as the sole medium for the interaction, which can be handled by lightweight signal processing using the inexpensive sensors already present on mobile devices. In our vTrack prototype, the peripheral device simply emits inaudible acoustic signals through a loudspeaker, while the receiving device performs sound source localization using a multi-channel microphone array. We build a fingerprint-based localization model from various cues, such as the time difference of arrival, the angle of arrival, and the power spectral density of the audio signal. The vTrack system integrates the frequency difference of arrival induced by the Doppler shift to track the sound source in motion. Finally, the position estimates are fed into an extended Kalman filter to reduce error and smooth the output. We implement our system on Android devices and validate its feasibility. Our extensive experiments show that vTrack achieves millimeter-level accuracy in the moving sound source scenario.
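The final stage the abstract describes, feeding noisy per-frame position estimates into a Kalman-style filter to reduce error and smooth the trajectory, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses an assumed constant-velocity motion model with a linear observation, for which the extended Kalman filter reduces to an ordinary Kalman filter, and the `make_filter`/`kalman_step` helpers, time step, and noise magnitudes are all illustrative choices.

```python
import numpy as np

# Constant-velocity Kalman filter over 2-D positions: state = [x, y, vx, vy].
# The model and noise magnitudes below are illustrative assumptions.
def make_filter(dt=0.01, q=1e-4, r=4e-6):
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)  # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # we observe position only
    Q = q * np.eye(4)                           # process noise covariance
    R = r * np.eye(2)                           # measurement noise (~2 mm std)
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    # Predict the next state from the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Correct with the noisy position estimate z from the localizer.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    F, H, Q, R = make_filter()
    x, P = np.zeros(4), np.eye(4)
    # Simulated finger moving along x at 0.1 m/s, observed with 2 mm noise.
    for k in range(200):
        z = np.array([0.001 * k, 0.0]) + rng.normal(scale=0.002, size=2)
        x, P = kalman_step(x, P, z, F, H, Q, R)
    print(np.round(x[:2], 4))  # smoothed position, close to the true (0.199, 0.0)
```

In the full system, `z` would come from the fingerprint-based localizer (TDoA, AoA, and PSD cues), and the FDoA/Doppler measurement would enter the update through a nonlinear observation, which is where the *extended* Kalman filter's linearization is actually needed.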