Preserved in Portico. This version is not peer-reviewed.
Computer Vision for Elderly Care Based on Hand Gestures
Received: 24 July 2020 / Approved: 26 July 2020 / Online: 26 July 2020 (02:07:09 CEST)
A peer-reviewed article of this Preprint also exists.
Journal reference: Computers 2021, 10, 5
Hand gestures can play an important role in medical applications for the health care of elderly people, providing a natural way to communicate different requests by making specific gestures. In this study, we explored three different scenarios using a Microsoft Kinect V2 depth sensor and evaluated the effectiveness of each. The first scenario used the default system embedded in the Kinect V2 sensor, whose depth metadata provides 11 parameters related to the tracked body, with five gestures for each hand. The second scenario combined the joint tracking provided by the Kinect depth metadata with a depth threshold to improve hand segmentation and efficiently recognize the number of extended fingers. The third scenario used a simple convolutional neural network together with joint tracking from the depth metadata to recognize five categories of gestures. In this study, deaf-mute elderly people execute one of five hand gestures to indicate a specific request, such as needing water, a meal, the toilet, help, or medicine. Each request is then sent to the care provider's smartphone, since these elderly people cannot carry out such activities independently. The system transmits the requests as messages over the Global System for Mobile Communications (GSM) using a microcontroller.
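The depth-threshold step in the second scenario can be illustrated with a minimal sketch: keep only the pixels of a depth frame that lie within a narrow band around the depth of the tracked hand joint. The function name `segment_hand`, the band width of 80 mm, and the synthetic frame are all assumptions for illustration, not values taken from the paper.

```python
import numpy as np

def segment_hand(depth_frame, hand_depth_mm, band_mm=80):
    """Return a boolean mask of pixels belonging to the hand.

    hand_depth_mm is the depth of the tracked hand joint (in a real
    system this would come from the Kinect skeleton metadata; here it
    is simply passed in). Pixels within +/- band_mm of that depth are
    kept; zero-depth (invalid) pixels are discarded.
    """
    mask = np.abs(depth_frame.astype(np.int32) - hand_depth_mm) <= band_mm
    mask &= depth_frame > 0  # Kinect reports 0 for unmeasured pixels
    return mask

# Synthetic 6x6 depth frame (mm): a "hand" at ~800 mm on a 2000 mm background.
frame = np.full((6, 6), 2000, dtype=np.uint16)
frame[1:4, 1:4] = 800   # hypothetical hand region
frame[0, 0] = 0         # one invalid pixel
mask = segment_hand(frame, hand_depth_mm=800)
print(mask.sum())  # 9 hand pixels segmented
```

In the full pipeline the resulting mask would then be analyzed (e.g. by contour or convexity analysis) to count the extended fingers; that step is omitted here.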
elderly care; hand gesture; computer vision system; Microsoft Kinect depth sensor; Arduino Nano Microcontroller; global system for mobile communication (GSM)
ENGINEERING, General Engineering
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.