Research Article
SaSYS: A Swipe Gesture-Based System for Exploring Urban Environments for the Visually Impaired
@INPROCEEDINGS{10.1007/978-3-319-05452-0_5,
  author={Jee-Eun Kim and Masahiro Bessho and Noboru Koshizuka and Ken Sakamura},
  title={SaSYS: A Swipe Gesture-Based System for Exploring Urban Environments for the Visually Impaired},
  booktitle={Mobile Computing, Applications, and Services. 5th International Conference, MobiCASE 2013, Paris, France, November 7-8, 2013, Revised Selected Papers},
  publisher={Springer},
  year={2014},
  month={6},
  keywords={Accessibility, mobile devices, visually impaired, touchscreens, location-based services},
  doi={10.1007/978-3-319-05452-0_5}
}
Jee-Eun Kim
Masahiro Bessho
Noboru Koshizuka
Ken Sakamura
Year: 2014
SaSYS: A Swipe Gesture-Based System for Exploring Urban Environments for the Visually Impaired
MOBICASE
Springer
DOI: 10.1007/978-3-319-05452-0_5
Abstract
Exploring and learning an environment is a particularly challenging task for visually impaired people. Existing interaction techniques for learning an environment may not be useful while traveling because they often rely on dedicated hardware or require users to constantly focus on tactile or auditory feedback. In this paper, we introduce an intuitive interaction technique for selecting areas of interest in urban environments by performing simple swipe gestures on a touchscreen. Building on this swipe-based interaction, we developed SaSYS, a location-aware system that enables users to discover points of interest (POIs) around them using off-the-shelf smartphones. Our approach can be easily implemented on handheld devices without requiring any dedicated hardware or forcing users to constantly focus on tactile or auditory feedback. SaSYS also provides fine-grained control over text-to-speech (TTS) output. Our user study shows that 9 of 11 users preferred swipe-based interaction to existing pointing-based interaction.
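The abstract does not spell out how a swipe maps to a set of nearby POIs, so the following is only an illustrative sketch of the general idea: treat the swipe direction (relative to the user's compass heading) as selecting an angular sector, and return the POIs whose bearing from the user falls inside it. All function and parameter names here (`pois_in_swipe_sector`, `half_width`, etc.) are hypothetical, not taken from the paper.

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def pois_in_swipe_sector(user, heading, swipe_angle, pois, half_width=45):
    """Return names of POIs lying in the sector indicated by a swipe.

    user:        (lat, lon) of the user
    heading:     user's compass heading in degrees (0 = north)
    swipe_angle: swipe direction on screen, degrees clockwise from "up",
                 so 0 means "straight ahead of the user"
    pois:        iterable of (name, lat, lon) tuples
    half_width:  half the angular width of the selection sector, in degrees
    """
    target = (heading + swipe_angle) % 360
    selected = []
    for name, lat, lon in pois:
        b = bearing(user[0], user[1], lat, lon)
        # Smallest angular distance between the POI bearing and the target.
        diff = min((b - target) % 360, (target - b) % 360)
        if diff <= half_width:
            selected.append(name)
    return selected
```

With the user at the origin facing north, an upward swipe (angle 0) would select a POI due north but not one due east, since the latter lies 90 degrees off the 45-degree sector boundary.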