AMSYS 17(13): e1

Research Article

Position Based Visual Servoing control of a Wheelchair Mounter Robotic Arm using Parallel Tracking and Mapping of task objects

Downloads: 1414

Cite

BibTeX | Plain Text
@ARTICLE{10.4108/eai.17-5-2017.152545,
  author={Alessandro Palla and Gabriele Meoni and Luca Fanucci and Alessandro Frigerio},
  title={Position Based Visual Servoing control of a Wheelchair Mounter Robotic Arm using Parallel Tracking and Mapping of task objects},
  journal={EAI Endorsed Transactions on Ambient Systems},
  volume={4},
  number={13},
  publisher={EAI},
  journal_a={AMSYS},
  year={2017},
  month={5},
  keywords={Robotic Arm, Power Wheelchair, Visual Servoing, PBVS, Eye-in-Hand, Computer Vision, SIFT, Features extraction, PTAM, ROS, Human Machine Interface, Assistive Technology, Open-source},
  doi={10.4108/eai.17-5-2017.152545}
}
    
Alessandro Palla, Gabriele Meoni, Luca Fanucci and Alessandro Frigerio. Position Based Visual Servoing control of a Wheelchair Mounter Robotic Arm using Parallel Tracking and Mapping of task objects. AMSYS, EAI, 2017. DOI: 10.4108/eai.17-5-2017.152545
Alessandro Palla1,*, Gabriele Meoni1, Luca Fanucci1, Alessandro Frigerio1
  • 1: University of Pisa
*Contact email: alessandro.palla@for.unipi.it

Abstract

In the last few years, power wheelchairs have become the only devices able to provide autonomy and independence to people with motor skill impairments. In particular, many power wheelchairs feature robotic arms for gesture emulation, such as interaction with objects. However, complex robotic arms often require a joystick to be controlled, which makes them hard for impaired users to operate. Paradoxically, if users were able to proficiently control such devices, they would not need them. For that reason, this paper presents a highly autonomous robotic arm, designed to minimize the effort needed to control it. To this end, the arm features an easy-to-use human-machine interface and is controlled by a computer vision algorithm implementing Position Based Visual Servoing (PBVS). The control was realized by extracting features from the camera images and fusing them with the distance to the target obtained from a proximity sensor. The Parallel Tracking and Mapping (PTAM) algorithm was used to find the 3D position of the task object in the camera reference system. The visual servoing algorithm was implemented on an embedded platform in real time. Each part of the control loop was developed in the Robot Operating System (ROS) environment, which allows the previous algorithms to be implemented as different nodes. Theoretical analysis, simulations and in-system measurements proved the effectiveness of the proposed solution.
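The abstract describes a classic PBVS loop: the 3D pose of the task object, estimated by PTAM and fused with the proximity-sensor distance, is compared with a desired grasp pose, and the resulting error drives the arm. As a minimal sketch only, not the authors' implementation, the Python fragment below shows the standard proportional PBVS law; the function name, the gain lam, and the assumption that the desired orientation is the identity are all illustrative.

```python
# Illustrative PBVS sketch (not the published code). The object pose in the
# camera frame is assumed to come from PTAM, with its depth refined by the
# proximity sensor; the desired orientation is taken to be the identity.
import numpy as np

def pbvs_twist(t_obj_cam, R_obj_cam, t_goal_cam, lam=0.5):
    """Return a 6-DoF camera twist (vx, vy, vz, wx, wy, wz) that drives the
    estimated object pose toward the desired grasp pose."""
    # Translational error between current and desired object position.
    e_t = np.asarray(t_obj_cam) - np.asarray(t_goal_cam)
    # Rotational error as the axis-angle vector of the residual rotation,
    # using the small-angle approximation theta*u ~ 0.5 * vex(R - R^T).
    R = np.asarray(R_obj_cam)
    e_R = 0.5 * np.array([R[2, 1] - R[1, 2],
                          R[0, 2] - R[2, 0],
                          R[1, 0] - R[0, 1]])
    # Classic proportional PBVS law: the commanded twist is the scaled,
    # negated pose error.
    return np.hstack((-lam * e_t, -lam * e_R))
```

In a ROS-based design such as the one described, this computation would typically live in its own node, subscribing to the estimated object pose and publishing the resulting twist command to the arm controller.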

Keywords
Robotic Arm, Power Wheelchair, Visual Servoing, PBVS, Eye-in-Hand, Computer Vision, SIFT, Features extraction, PTAM, ROS, Human Machine Interface, Assistive Technology, Open-source
Received
2017-02-28
Accepted
2017-05-11
Published
2017-05-17
Publisher
EAI
http://dx.doi.org/10.4108/eai.17-5-2017.152545

Copyright © 2017 Alessandro Palla et al., licensed to EAI. This is an open access article distributed under the terms of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/), which permits unlimited use, distribution and reproduction in any medium so long as the original work is properly cited.

Indexed in: EBSCO, ProQuest, DBLP, DOAJ, Portico