Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

TactiGraph: An Asynchronous Graph Neural Network for Contact Angle Prediction Using Neuromorphic Vision-Based Tactile Sensing

Version 1 : Received: 7 April 2023 / Approved: 10 April 2023 / Online: 10 April 2023 (03:06:15 CEST)

A peer-reviewed article of this Preprint also exists.

Sajwani, H.; Ayyad, A.; Alkendi, Y.; Halwani, M.; Abdulrahman, Y.; Abusafieh, A.; Zweiri, Y. TactiGraph: An Asynchronous Graph Neural Network for Contact Angle Prediction Using Neuromorphic Vision-Based Tactile Sensing. Sensors 2023, 23, 6451.

Abstract

Vision-based tactile sensors (VBTS) have become the de facto method for giving robots tactile feedback from their environment. Unlike other solutions to tactile sensing, VBTS offers high-spatial-resolution feedback without compromising on instrumentation costs or incurring additional maintenance expenses. However, the conventional cameras used in VBTS have a fixed update rate and output redundant data, leading to computational overhead downstream. In this work, we present a neuromorphic vision-based tactile sensor (N-VBTS) that employs observations from an event-based camera for contact angle prediction. In particular, we design and develop a novel graph neural network, dubbed TactiGraph, that operates asynchronously on graphs constructed from raw N-VBTS streams, exploiting their spatiotemporal correlations to perform predictions. Although conventional VBTS relies on an internal illumination source, TactiGraph performs efficiently both with and without one. Rigorous experimental results revealed that TactiGraph achieved a mean absolute error of 0.62° in predicting the contact angle and was faster and more efficient than both conventional VBTS and other N-VBTS approaches, with lower instrumentation costs. Specifically, N-VBTS requires only 5.5% of the compute time needed by VBTS when both are tested on the same scenario.
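The abstract does not spell out how graphs are built from the raw event stream. A common construction for asynchronous event-graph networks, and a plausible reading of "exploiting spatiotemporal correlations", is a radius-neighborhood graph over events embedded in (x, y, βt) space, where β is a scale factor making temporal distance comparable to pixel distance. The sketch below is a minimal illustration under that assumption; the function name, β, and radius are hypothetical, not taken from the paper.

```python
import numpy as np

def events_to_graph(events, beta=1.0, radius=3.0):
    """Build a spatiotemporal radius graph from an event stream.

    events: (N, 3) array of (x, y, t) per event.
    beta:   assumed scale factor applied to t so that spatial and
            temporal distances are comparable.
    radius: two events are connected when their distance in the
            scaled (x, y, beta*t) space is below this threshold.
    Returns a list of directed edges (i, j); each neighbor pair
    appears in both directions, giving an undirected graph.
    """
    pts = np.asarray(events, dtype=float).copy()
    pts[:, 2] *= beta  # scale time axis
    n = len(pts)
    edges = []
    for i in range(n):
        # Euclidean distance from event i to all events
        d = np.linalg.norm(pts - pts[i], axis=1)
        for j in np.nonzero((d < radius) & (np.arange(n) != i))[0]:
            edges.append((i, int(j)))
    return edges
```

The resulting node set (events, typically carrying polarity as a feature) and edge list can then be fed to a graph neural network; the O(N²) neighbor search here would be replaced by a k-d tree or voxel hashing in practice.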

Keywords

tactile sensing; vision-based tactile sensing; event-based vision; robotic manufacturing

Subject

Engineering, Other

