Preprint Article, Version 1 (preserved in Portico). This version is not peer-reviewed.

Vision-driven collaborative robotic grasping system tele-operated by surface electromyography

Version 1 : Received: 27 June 2018 / Approved: 27 June 2018 / Online: 27 June 2018 (15:01:06 CEST)

A peer-reviewed article of this Preprint also exists.

Úbeda, A.; Zapata-Impata, B.S.; Puente, S.T.; Gil, P.; Candelas, F.; Torres, F. A Vision-Driven Collaborative Robotic Grasping System Tele-Operated by Surface Electromyography. Sensors 2018, 18, 2366.

Abstract

This paper presents a system that merges computer vision and surface electromyography techniques to carry out grasping tasks. To this end, the vision-driven system computes pre-grasping poses of the robotic system based on the analysis of three-dimensional object features. The human operator can then correct the pre-grasping pose of the robot using surface electromyographic signals recorded from the forearm during wrist flexion and extension. Weak wrist flexions and extensions allow a fine adjustment of the robotic system to grasp the object and, finally, when the operator considers that the grasping position is optimal, a strong flexion is performed to initiate the grasp. The system has been tested with several subjects, showing a grasping accuracy of around 95% of the attempted grasps, which improves by around 9% the grasping accuracy of previous experiments in which electromyographic control was not implemented.
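As a rough illustration of the thresholded sEMG control scheme summarized in the abstract, the following Python sketch shows how a rectified and averaged EMG envelope from wrist flexor and extensor channels could drive fine corrections of the vision-computed pre-grasp pose and trigger the grasp on a strong flexion. It is only a minimal sketch under assumed threshold values, step sizes and function names; it is not the authors' implementation.

# Minimal sketch of the EMG-driven adjustment loop described in the abstract.
# All thresholds, step sizes and names below are illustrative assumptions.
import numpy as np

WEAK_THRESHOLD = 0.2    # normalized sEMG envelope level for a weak contraction (assumed)
STRONG_THRESHOLD = 0.7  # level interpreted as a strong flexion -> grasp command (assumed)
STEP = 0.005            # pose correction step in metres per weak contraction (assumed)

def emg_envelope(raw_window: np.ndarray) -> float:
    """Rectify and average one window of raw sEMG samples (simple mean-absolute-value envelope)."""
    return float(np.mean(np.abs(raw_window)))

def adjust_pregrasp(pose_x: float, flexor_env: float, extensor_env: float):
    """Return (new_pose_x, grasp_now) from the current muscle activations.

    A strong flexion commands the grasp; weak flexion/extension nudge the
    pre-grasp pose computed by the vision system in opposite directions.
    """
    if flexor_env >= STRONG_THRESHOLD:
        return pose_x, True                  # operator accepts the pose: grasp
    if flexor_env >= WEAK_THRESHOLD:
        return pose_x + STEP, False          # weak flexion: nudge forward
    if extensor_env >= WEAK_THRESHOLD:
        return pose_x - STEP, False          # weak extension: nudge backward
    return pose_x, False                     # below threshold: hold position

# Example: one control-loop iteration with synthetic sEMG windows
flexor = emg_envelope(np.random.uniform(-0.3, 0.3, 200))
extensor = emg_envelope(np.random.uniform(-0.1, 0.1, 200))
pose_x, grasp = adjust_pregrasp(0.40, flexor, extensor)
print(pose_x, grasp)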

Keywords

surface electromyography; computer vision; grasping; assistive robotics

Subject

Engineering, Control and Systems Engineering
