Review
Preserved in Portico. This version is not peer-reviewed.
RGB-D Data-based Action Recognition: A Review
Version 1: Received: 18 January 2021 / Approved: 19 January 2021 / Online: 19 January 2021 (09:14:30 CET)
A peer-reviewed article of this Preprint also exists.
Shaikh, M. B.; Chai, D. RGB-D Data-Based Action Recognition: A Review. Sensors, 2021, 21, 4246. https://doi.org/10.3390/s21124246.
Abstract
Classification of human actions from uni-modal and multi-modal datasets is an ongoing research problem in computer vision. This review aims to scope the current literature on data-fusion and action-recognition techniques and to identify gaps and future research directions. Success in producing cost-effective and portable vision-based sensors has dramatically increased the number and size of datasets. The rise in the number of action-recognition datasets intersects with advances in deep-learning architectures and computational support, both of which offer significant research opportunities. Naturally, each action-data modality, such as RGB, depth, skeleton, and infrared, has distinct characteristics; therefore, it is important to exploit the value of each modality for better action recognition. In this article, we focus on data-fusion and recognition techniques in the context of vision, from both uni-modal and multi-modal perspectives. We conclude by discussing research challenges, emerging trends, and possible future research directions.
Keywords
Action Recognition; Deep Learning; Data Fusion
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.