Smart City 360°. First EAI International Summit, Smart City 360°, Bratislava, Slovakia and Toronto, Canada, October 13-16, 2015. Revised Selected Papers

Research Article

Using Visual Lane Detection to Control Steering in a Self-driving Vehicle

  • @INPROCEEDINGS{10.1007/978-3-319-33681-7_77,
        author={Kevin McFall},
        title={Using Visual Lane Detection to Control Steering in a Self-driving Vehicle},
        proceedings={Smart City 360°. First EAI International Summit, Smart City 360°, Bratislava, Slovakia and Toronto, Canada, October 13-16, 2015. Revised Selected Papers},
        proceedings_a={SMARTCITY360},
        year={2016},
        month={6},
        keywords={Self-driving vehicle; Hough transform; Dynamic threshold; Inverse perspective transform; Temporal integration; Angle control},
        doi={10.1007/978-3-319-33681-7_77}
    }
    
  • Kevin McFall. Using Visual Lane Detection to Control Steering in a Self-driving Vehicle. SMARTCITY360, Springer, 2016. DOI: 10.1007/978-3-319-33681-7_77
Kevin McFall1,*
  • 1: Kennesaw State University
*Contact email: kmcfall@kennesaw.edu

Abstract

An effective lane detection algorithm employing the Hough transform and inverse perspective mapping to estimate distances in real space is used to send steering control commands to a self-driving vehicle. The vehicle is capable of autonomously traversing long stretches of straight road in a wide variety of conditions with the same set of algorithm design parameters. Performance is limited primarily by the slow update rate of inputs to the steering control system. The 5 frames per second (FPS) achieved using a Raspberry Pi 2 for image capture and processing improves to 23 FPS with an Odroid XU3. Even at 5 FPS, the vehicle is capable of navigating structured and unstructured roads at slow speed.
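As a rough illustration of the pipeline the abstract describes — detect the dominant lane line with a Hough transform, convert its image-space position to a lateral offset in real space, and issue a proportional steering command — the following minimal NumPy sketch runs on a synthetic edge image. The meters-per-pixel scale, lane-center position, and proportional gain are placeholder assumptions, and a real system would use a calibrated inverse perspective mapping rather than a flat scale factor; this is a sketch of the technique, not the paper's implementation.

```python
import numpy as np

def hough_peak(edges, n_theta=180):
    """Minimal Hough transform: return (rho, theta) of the strongest line.

    Each edge pixel votes for all (rho, theta) pairs consistent with
    rho = x*cos(theta) + y*sin(theta); the accumulator peak is the line.
    """
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))          # max possible |rho|
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for t_idx, theta in enumerate(thetas):
        rhos = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int)
        np.add.at(acc[:, t_idx], rhos + diag, 1)  # shift so rho index >= 0
    rho_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return rho_idx - diag, thetas[t_idx]

# Synthetic edge image standing in for the thresholded camera frame:
# a vertical lane marking at x = 30 in a 100x100 frame.
edges = np.zeros((100, 100), dtype=np.uint8)
edges[:, 30] = 1

rho, theta = hough_peak(edges)  # for a vertical line, theta ~ 0 and rho ~ x

# Hypothetical calibration: the inverse perspective mapping reduced to a
# flat meters-per-pixel scale at the bottom of the frame (placeholder).
M_PER_PX = 0.02
LANE_CENTER_PX = 50                      # desired line position (assumed)
offset_m = (rho - LANE_CENTER_PX) * M_PER_PX   # lateral error in meters
K_P = 0.5                                # proportional steering gain (assumed)
steer_cmd = -K_P * offset_m              # steering angle command, radians
```

With the line 20 px left of the desired position, `offset_m` is -0.4 m and the command steers 0.2 rad toward it; the paper's angle controller and temporal integration of detections would sit on top of a detection step like this.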