
Research Article
A CNN-Based Computer Vision Interface for Prosthetics’ Control
@INPROCEEDINGS{10.1007/978-3-031-06368-8_3,
    author={Emanuele Lindo Secco and Daniel David McHugh and Neil Buckley},
    title={A CNN-Based Computer Vision Interface for Prosthetics’ Control},
    proceedings={Wireless Mobile Communication and Healthcare. 10th EAI International Conference, MobiHealth 2021, Virtual Event, November 13--14, 2021, Proceedings},
    proceedings_a={MOBIHEALTH},
    year={2022},
    month={6},
    keywords={Prosthetics AI CNN Auto-grasping},
    doi={10.1007/978-3-031-06368-8_3}
}
Emanuele Lindo Secco
Daniel David McHugh
Neil Buckley
Year: 2022
A CNN-Based Computer Vision Interface for Prosthetics’ Control
MOBIHEALTH
Springer
DOI: 10.1007/978-3-031-06368-8_3
Abstract
In this paper we present a CNN-based interface for the control of prosthetic and robotic hands: a CNN visual system is trained on a set of images of daily-life objects in order to classify and recognize them. This classification provides useful information for configuring the prosthetic or robotic hand: following the training, a low-cost embedded computer, combined with a low-cost camera mounted on the device (i.e. a prosthetic or robotic hand), can drive the device to approach and grasp any object belonging to the training set.
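The pipeline described in the abstract could be sketched as follows: a CNN predicts an object class from the camera frame, and the predicted label selects a pre-defined grasp configuration for the hand. This is a minimal illustrative sketch, not the authors' implementation; the class names, grip types, and aperture values are all assumptions.

```python
# Hypothetical mapping from CNN-predicted object classes (the training set of
# daily-life objects) to grasp configurations for the hand. All labels and
# parameter values below are illustrative assumptions, not from the paper.
GRASP_CONFIGS = {
    "bottle": {"grip": "cylindrical", "aperture_mm": 70},
    "coin":   {"grip": "pinch",       "aperture_mm": 15},
    "ball":   {"grip": "spherical",   "aperture_mm": 85},
}

def select_grasp(predicted_label, configs=GRASP_CONFIGS):
    """Map a CNN-predicted object class to a grasp configuration.

    Returns None when the object was not part of the training set,
    in which case no auto-grasp is attempted.
    """
    return configs.get(predicted_label)

# Example: the CNN classifies the current camera frame as "bottle",
# and the hand is configured for a cylindrical grip.
print(select_grasp("bottle"))
```

In a real system the dictionary lookup would sit downstream of the CNN's softmax output on the embedded computer, gating whether the device approaches and pre-shapes for the recognized object.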