
Research Article
Angular Position Estimation for Human-Following and Robot Navigation
@INPROCEEDINGS{10.1007/978-3-031-31275-5_26,
  author={Isaac Asante and Lau Bee Theng and Mark Tee Kit Tsun},
  title={Angular Position Estimation for Human-Following and Robot Navigation},
  proceedings={Smart Grid and Internet of Things. 6th EAI International Conference, SGIoT 2022, TaiChung, Taiwan, November 19-20, 2022, Proceedings},
  proceedings_a={SGIOT},
  year={2023},
  month={5},
  keywords={Robot Navigation; Instance Segmentation; Object Detection},
  doi={10.1007/978-3-031-31275-5_26}
}
- Isaac Asante
- Lau Bee Theng
- Mark Tee Kit Tsun
Year: 2023
Venue: SGIOT
Publisher: Springer
DOI: 10.1007/978-3-031-31275-5_26
Abstract
Mobile robot navigation and human following are two related areas of robotics that have garnered considerable interest over the years because of their practical value in a wide range of real-world settings. Modern techniques that rely exclusively on computer vision for environmental input commonly use state-of-the-art object detection methods to locate objects in a scene. However, these detection frameworks do not directly provide the angular position of obstacles and human targets in images. In this research project, the Mask R-CNN instance segmentation framework detects static objects and humans in an environment. A chain of algorithms then transforms the image's content and pixel information into a one-dimensional array that can be mapped to the robot's field of view. The findings show that the result can help a mobile robot estimate the angular position of obstacles and of a human in a scene, which is necessary for collision-free navigation and human following in an unknown environment. The proposed method is also adaptable, working both outdoors and indoors under poor lighting conditions, and it can be used as a standalone algorithm in robotics simulations with webcams and static images.
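The mapping from segmentation masks to angular positions can be illustrated with a minimal sketch. This is not the paper's exact chain of algorithms: it assumes a simple linear (pinhole-style) mapping between image columns and view angles, and the function names (column_to_angle, mask_to_angular_span) and the 60-degree horizontal field of view are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np

def column_to_angle(x, image_width, horizontal_fov_deg):
    """Map a pixel column to an angle relative to the camera's optical axis.

    Assumes a linear spread of angles across the horizontal field of view:
    0 degrees is straight ahead, negative is left, positive is right.
    """
    return (x / image_width - 0.5) * horizontal_fov_deg

def mask_to_angular_span(mask, horizontal_fov_deg):
    """Collapse a binary instance mask (H x W) into a 1-D occupancy array
    over image columns, then report the angular span the object covers."""
    occupied_columns = mask.any(axis=0)      # True where the mask touches a column
    columns = np.flatnonzero(occupied_columns)
    if columns.size == 0:
        return None                          # object not visible in this frame
    width = mask.shape[1]
    left = column_to_angle(columns[0], width, horizontal_fov_deg)
    right = column_to_angle(columns[-1], width, horizontal_fov_deg)
    center = column_to_angle(columns.mean(), width, horizontal_fov_deg)
    return left, center, right

# Example: a synthetic 480x640 mask whose object spans columns 400-499
mask = np.zeros((480, 640), dtype=bool)
mask[100:300, 400:500] = True
print(mask_to_angular_span(mask, horizontal_fov_deg=60.0))
```

In a full pipeline, each instance mask produced by Mask R-CNN would be collapsed this way, and the resulting angular spans combined into the one-dimensional, field-of-view-aligned array described in the abstract.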