Version 1: Received: 23 February 2023 / Approved: 24 February 2023 / Online: 24 February 2023 (02:52:59 CET)
How to cite:
King, O. Heart Rate Estimation by Video-Based Reflectance Photoplethysmography. Preprints 2023, 2023020415. https://doi.org/10.20944/preprints202302.0415.v1.
Abstract
Non-invasive heart rate (HR) monitoring is important in clinical settings, as it plays a critical role in diagnosing a range of health conditions and assessing well-being. Presently, the gold standards for HR measurement are all based on sensors that require skin contact. Beyond being inconvenient, contact sensors have proven problematic in certain scenarios: they cannot be used when mechanical isolation of the patient is imperative (e.g., for burn victims or patients with tremors), they can damage the skin of premature babies in the ICU, and they increase the risk of spreading infections. Non-contact HR monitoring using a camera has recently been shown to be a viable alternative: it is now possible to record cardiac-synchronous blood volume variations from facial videos of human subjects under ambient lighting. These variations produce corresponding changes in skin reflectance, which can be extracted as a raw reflectance photoplethysmography (rPPG) signal and processed to reveal HR. In this project, an algorithmic framework for webcam-based HR detection was successfully implemented in MATLAB. The investigation was based on 100 self-captured videos (single dark-skinned subject) and 48 videos (from 12 subjects, all but one fair-skinned) obtained from COHFACE, an online database of facial videos and corresponding physiological signals. While the performance metrics (mean error, SNR) of the rPPG signals obtained from the self-captured videos were poor (best-case mean error of 22%), they sufficed to demonstrate the success of the implementation. The poor results were primarily attributed to skin tone, as rPPG SNR is known to be particularly low for dark skin tones. The results on the COHFACE videos were far superior, with mean error ranging from 3% to 15% (across 8 different rPPG signals) under ambient lighting and from 0% to 9% under dedicated lighting. This investigation lays the foundation for future research directed at optimizing rPPG performance metrics for dark-skinned subjects.
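The final step described above, processing a raw rPPG trace to reveal HR, is commonly done by locating the dominant spectral peak within the physiologically plausible pulse band. The project itself was implemented in MATLAB; the snippet below is only a minimal Python sketch of that generic frequency-domain idea, run here on a synthetic trace (the function name `estimate_hr` and the 0.7–4 Hz band are illustrative assumptions, not taken from the paper).

```python
import numpy as np

def estimate_hr(signal, fs, band=(0.7, 4.0)):
    """Estimate heart rate in bpm as the dominant frequency of a PPG trace.

    band: plausible pulse frequencies in Hz (0.7-4 Hz = 42-240 bpm),
    an illustrative choice, not the paper's parameters.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                               # remove the DC component
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)    # frequency axis of the FFT
    spectrum = np.abs(np.fft.rfft(x))
    mask = (freqs >= band[0]) & (freqs <= band[1]) # restrict to the pulse band
    peak = freqs[mask][np.argmax(spectrum[mask])]  # dominant in-band frequency
    return peak * 60.0                             # Hz -> beats per minute

# Synthetic 20 s trace sampled at a webcam-like 30 fps:
# a 72 bpm (1.2 Hz) pulse component plus additive noise.
fs, hr_true = 30.0, 72.0
t = np.arange(0, 20, 1.0 / fs)
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * (hr_true / 60.0) * t) + 0.3 * rng.standard_normal(t.size)
print(round(estimate_hr(trace, fs)))
```

Note that the frequency resolution of this approach is fs/N Hz (here 0.05 Hz, i.e., 3 bpm for a 20 s window), which is one reason longer analysis windows tend to give more precise HR estimates.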
Keywords
reflectance PPG; reflectance photoplethysmography; heart rate estimation; video
Subject
MEDICINE & PHARMACOLOGY, General Medical Research
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.