
Research Article
Design of an Interactive LiDAR-Vision Integrated Navigation System
@INPROCEEDINGS{10.1007/978-3-030-82562-1_45,
  author={Jidong Feng and Wanfeng Ma and Tongqian Liu and Yuan Xu},
  title={Design of an Interactive LiDAR-Vision Integrated Navigation System},
  proceedings={Multimedia Technology and Enhanced Learning. Third EAI International Conference, ICMTEL 2021, Virtual Event, April 8--9, 2021, Proceedings, Part I},
  proceedings_a={ICMTEL},
  year={2021},
  month={7},
  keywords={LiDAR; Visual localization system; Mobile robot localization},
  doi={10.1007/978-3-030-82562-1_45}
}
Jidong Feng
Wanfeng Ma
Tongqian Liu
Yuan Xu
Year: 2021
Design of an Interactive LiDAR-Vision Integrated Navigation System
ICMTEL
Springer
DOI: 10.1007/978-3-030-82562-1_45
Abstract
To further improve the accuracy of indoor navigation for mobile robots, this paper presents the hardware design of a mobile robot that combines Light Detection and Ranging (LiDAR) with a visual localization system. In this system, the LiDAR-based localization system and the visual localization system run in parallel. The LiDAR-based localization system runs on the Robot Operating System (ROS) to measure the LiDAR-based position, which compensates for the low update frequency of visual navigation.
Visual localization, in turn, compensates for LiDAR's short detection range and sparse data. The pose estimates obtained by the two methods are processed with a Kalman filter [1] and fused by re-weighting to obtain a more accurate position and motion trajectory for the mobile robot. Results from indoor robot movement tests show that the mobile robot system integrating visual and LiDAR positioning achieves a clear improvement in accuracy compared with other methods.
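The abstract does not give the fusion equations, but the re-weighted Kalman-style combination of two pose measurements it describes reduces, for a single update step, to inverse-covariance weighting. The sketch below is an illustration under that assumption, not the authors' implementation; the measurement values and covariances are hypothetical.

```python
import numpy as np

def fuse_measurements(z_lidar, R_lidar, z_vis, R_vis):
    """Fuse two independent position measurements by inverse-covariance
    (Kalman-style) weighting: the more certain sensor gets the larger
    weight. Each z is a 2-D position, each R its measurement covariance.
    Returns the fused position and its covariance."""
    info_lidar = np.linalg.inv(R_lidar)   # information (inverse covariance)
    info_vis = np.linalg.inv(R_vis)
    P = np.linalg.inv(info_lidar + info_vis)          # fused covariance
    x = P @ (info_lidar @ z_lidar + info_vis @ z_vis)  # fused position
    return x, P

# Hypothetical readings: LiDAR is more precise than the visual fix here,
# so the fused estimate lands much closer to the LiDAR measurement.
z_l = np.array([1.00, 2.00]); R_l = np.diag([0.01, 0.01])
z_v = np.array([1.20, 2.10]); R_v = np.diag([0.09, 0.09])
x, P = fuse_measurements(z_l, R_l, z_v, R_v)
# x -> [1.02, 2.01]: 90% weight on LiDAR, 10% on vision
```

A full system would embed this update in a recursive filter with a motion model, and the paper's re-weighting step suggests the covariances are adapted online rather than fixed.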