Smart Objects and Technologies for Social Good. Second International Conference, GOODTECHS 2016, Venice, Italy, November 30 – December 1, 2016, Proceedings

Research Article

A Smart Wearable Navigation System for Visually Impaired

  • @INPROCEEDINGS{10.1007/978-3-319-61949-1_35,
        author={Michael Trent and Ahmed Abdelgawad and Kumar Yelamarthi},
        title={A Smart Wearable Navigation System for Visually Impaired},
        proceedings={Smart Objects and Technologies for Social Good. Second International Conference, GOODTECHS 2016, Venice, Italy, November 30 -- December 1, 2016, Proceedings},
        proceedings_a={GOODTECHS},
        year={2017},
        month={7},
        keywords={Ultrasonic sensor, Audio feedback, Visual impairment, Navigation assistance},
        doi={10.1007/978-3-319-61949-1_35}
    }
    
Michael Trent1,*, Ahmed Abdelgawad1,*, Kumar Yelamarthi1,*
  • 1: Central Michigan University
*Contact email: trent1ma@cmich.edu, abdel1a@cmich.edu, yelam1k@cmich.edu

Abstract

Smart devices are becoming more common in our daily lives; they are being incorporated into buildings, houses, cars, and public places. This technological revolution, known as the Internet of Things (IoT), brings new opportunities. A variety of navigation systems have been developed to assist blind people, yet none of these systems are connected to the IoT. The objective of this paper is to implement a low-cost, low-power IoT navigation system for blind people. The system consists of an array of ultrasonic sensors mounted on a waist belt to survey the scene, iBeacons to identify the location, and a Raspberry Pi to perform the data processing. The Raspberry Pi uses the ultrasonic sensors to detect obstacles and provides audio cues to the user via a Bluetooth headset. iBeacons are deployed at different locations, each with a unique ID. A cloud database stores, for every iBeacon, the corresponding information, e.g., the address and a description of the place. The Raspberry Pi detects an iBeacon's ID and sends it to the cloud; the cloud returns the information associated with that ID, and the Raspberry Pi converts the text to audio and plays it to the user through the Bluetooth headset. Tests demonstrate that the system is accurate within the threshold radius and functions as a navigational assistant.
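
As a rough illustration of the workflow described in the abstract, the sketch below pairs an ultrasonic ranging loop with spoken warnings and a cloud lookup for iBeacon IDs on a Raspberry Pi. It is a minimal sketch only: the GPIO pin numbers, the 1 m warning threshold, the use of espeak for speech output, and the cloud endpoint URL and JSON fields are assumptions for illustration, not details taken from the paper.

    import time
    import subprocess
    import requests
    import RPi.GPIO as GPIO

    # Hypothetical BCM pin assignments for one HC-SR04-style ultrasonic sensor
    TRIG_PIN = 23
    ECHO_PIN = 24

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG_PIN, GPIO.OUT)
    GPIO.setup(ECHO_PIN, GPIO.IN)

    def read_distance_cm():
        # Send a 10 microsecond trigger pulse and time the echo
        GPIO.output(TRIG_PIN, True)
        time.sleep(0.00001)
        GPIO.output(TRIG_PIN, False)
        start = end = time.time()
        while GPIO.input(ECHO_PIN) == 0:
            start = time.time()
        while GPIO.input(ECHO_PIN) == 1:
            end = time.time()
        # Echo time * speed of sound (343 m/s) / 2, expressed in centimeters
        return (end - start) * 17150

    def speak(text):
        # Assumes espeak is installed and system audio is routed to the Bluetooth headset
        subprocess.call(["espeak", text])

    def announce_beacon(beacon_id):
        # Hypothetical cloud lookup: the endpoint URL and JSON schema are illustrative only
        resp = requests.get("https://example.com/beacons/" + beacon_id)
        if resp.ok:
            speak(resp.json().get("description", ""))

    try:
        while True:
            distance = read_distance_cm()
            if distance < 100:  # illustrative 1 m warning radius
                speak("Obstacle ahead, %d centimeters" % int(distance))
            time.sleep(0.2)
    finally:
        GPIO.cleanup()

In a complete system the main loop would also scan for nearby iBeacon advertisements and call announce_beacon when a new ID is detected; that Bluetooth scanning code is omitted from this sketch.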