Preprint
Article

This version is not peer-reviewed.

Designing Reproducible Test Environments for rPPG: A System for Camera Sensor Response Validation

Submitted: 23 January 2026
Posted: 26 January 2026


Abstract
Remote photoplethysmography (rPPG) enables non-contact vital sign measurement using standard smart device cameras, opening up the potential for scalable health applications on consumer devices. However, rPPG signal quality is highly sensitive to camera sensor characteristics and image processing pipelines, which vary between devices. This variation limits the reproducibility and generalisation of rPPG based algorithms beyond specific hardware platforms. This work presents a reproducible test environment for the validation of the camera sensor response in the context of rPPG signals. A microcontroller driven illumination system and a mechanically constrained setup are used to generate controlled, repeatable optical signals and to ensure consistent measurement conditions. Two characterisation tests are introduced: a time domain morphology analysis and a frequency domain attenuation analysis. To assess sensor similarity, pulse timing consistency, the fitness score and normalised frequency responses are compared. The method is applied to multiple consumer devices and demonstrates consistent camera response patterns under the controlled test conditions. By explicitly addressing the validation of the camera sensor and image processing pipeline of different devices, this work supports the development of deployable, accurate rPPG based vital sign applications across a wide range of consumer devices.
Subject: Engineering - Other

1. Introduction

Photoplethysmography is an optical measurement technique that captures blood volume changes by analysing variations in light absorption and reflection. It is most commonly known for its use in pulse oximeters to measure pulse rate (PR) and blood oxygen saturation (SpO2), but also in smart wearables. PPG signals are commonly extracted from the ear, wrist or finger, but can be extracted from other sites too. In recent years, remote PPG (rPPG) has emerged, allowing the extraction of PPG signals without direct contact [1] using video recordings of exposed skin, most commonly the face.
The idea of capturing rPPG using standard cameras, especially those integrated into smartphones, has generated significant interest. Camera-based physiological sensing offers the potential for scalable, low-cost deployment in telemedicine as well as large scale screening applications. As a result, significant research efforts have focused on improving rPPG signal extraction through techniques such as colour space analysis, blind source separation, motion compensation, and machine learning–based signal inpainting techniques [2,3,4,5,6].
Under ideal conditions, rPPG has successfully been used for PR and blood pressure (BP) estimation [7,8]. Because rPPG signals have a low amplitude relative to the overall intensity, they are highly sensitive to disturbances such as variations in lighting and subject motion. Challenges can also occur with darker skin tones due to the higher absorption of light, or with facial hair covering the skin. Less commonly addressed is the influence of the camera sensor and its inbuilt image signal processing (ISP) pipeline [9]. Most studies overcome this by using a single specific device, but to create an algorithm suitable for all consumer devices it is critical to understand the camera's behaviour. There are differences in the physical camera, such as colour filter design or spectral sensitivity, but also potential differences in the image capturing process, such as auto exposure, auto white balance, tone mapping and compression, that can complicate signal interpretation.
Camera characterisation and calibration techniques are well established in optical metrology and imaging science [10,11]. However, these techniques are rarely adapted to the specific requirements of rPPG signal capture, where temporal stability, repeatability, and preservation of low-amplitude physiological signals are important. Moreover, many existing calibration approaches rely on laboratory-grade instrumentation that is not easily transferable to routine validation of consumer cameras at scale.
These limitations demonstrate the need for the development of reproducible test environments for validating camera sensor response in the context of rPPG. Rather than evaluating rPPG algorithms on their own, it is necessary to establish whether a given camera and its associated ISP pipeline can reliably preserve the low-amplitude rPPG signal required for physiological measurements. This validation is essential for an rPPG based algorithm to be deployed to the market on a wide range of consumer devices. In this work, we present a system for camera sensor and processing pipeline validation designed specifically for rPPG applications, enabling controlled and repeatable measurements of camera response under standardised conditions. Baseline measurements on a control device are used to assess system stability and reproducibility, and to derive quantitative signal similarity thresholds, which are subsequently applied to a range of consumer devices to illustrate inter-device variability and its implications for broadening rPPG deployment. By explicitly addressing sensor-level validation, this work provides a methodological approach for rPPG research and supports the development of deployable, accurate rPPG based vital sign applications across a wide range of consumer devices.

2. Materials and Methods

2.1. Camera Setup

All signals are captured using the same custom app used for rPPG signal collection and data analysis created by Lifelight. Within this application, the signal recording process relies on face recognition, as described in [12]. In short, the mid-face region brightness is spatially averaged into three 1D signals, one for each colour channel (red, green and blue). For this app to work in the rig described in Section 2.2, where no actual human face is present, an AI generated image of a face is used, as shown in Figure 1. The same image is used throughout all measurements, printed on matte paper to avoid complicated reflection patterns, and is illuminated using light emitting diodes (LEDs).
Camera settings such as the frame rate are fixed within the application, using a constant rate of 30 frames per second for rPPG capture. Additionally, automatic image processing functions such as white balance and exposure adjustment were disabled or fixed where supported by the device, enforcing consistent capture conditions. An uncompressed video format was used to avoid negatively impacting the rPPG signal quality and removing physiological variation from the signal [13].

2.2. Setup

The experimental setup to characterise the camera behaviour within the custom app was designed to provide a reproducible and controlled test environment. The system consists of a programmable illumination source driven by a microcontroller and a mechanically constrained imaging rig that enforces consistent camera alignment and geometry. Together, these components enable repeatable optical stimulation and standardised image acquisition across multiple devices.
An Arduino microcontroller controls a set of LEDs, allowing predefined waveforms to be emitted with high repeatability. LEDs provide controllable illumination with negligible warm-up time and minimal intensity drift over the timescales relevant to accurately capture the physiological fluctuations measured in rPPG signals. Their narrowband spectral characteristics and linear drive behaviour make them well suited for generating repeatable signals [14]. The illumination patterns were deterministic and identical across measurement sessions, ensuring that any observed variability in the recorded signals originated from the camera sensor and capture pipeline rather than from the signal itself.
To ensure consistency between measurements, a custom 3D printed rig, as shown in Figure 1, is used. This rig constrains the relative positions of the camera, LEDs and target scene, minimising variation in measurements due to camera placement. The rig is placed in an enclosed, opaque box during measurements to avoid variability in ambient light. On one side the device under test (DUT) is placed upside down, with the camera to be characterised facing the other side of the rig, which holds the LEDs and an image of a face within a fixed picture frame.
The LED brightness is updated every 15 milliseconds, corresponding to an update rate of approximately 67 Hz. This is more than double the 30 frames per second camera capture rate, ensuring that the temporal modulation is sufficiently sampled while avoiding aliasing effects.
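As a quick arithmetic check (a sketch, not code from the system), the margin between the LED update rate and the camera frame rate can be verified:

```python
# Check that the LED update rate comfortably exceeds the camera frame rate.
LED_UPDATE_MS = 15   # LED brightness update interval (milliseconds)
CAMERA_FPS = 30      # fixed capture frame rate used by the app

led_update_hz = 1000 / LED_UPDATE_MS            # ~66.7 Hz
updates_per_frame = led_update_hz / CAMERA_FPS  # ~2.2 updates per frame

# the LED modulation is refreshed more than twice per captured frame,
# so each frame integrates an up-to-date brightness value
assert updates_per_frame > 2
```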

2.3. Camera Sensor Characterisation

Using controlled signals from the LEDs, different waveforms can be used for systematic evaluation of the camera’s response under repeatable conditions. In this manuscript we describe two different tests: one to assess the waveform morphology fidelity and one to assess frequency dependent behaviour across different operating conditions.
The first test uses predefined pulse-like waveforms, ensuring the frequency and amplitudes are representative of physiological pulse signals. The waveforms were constructed using sinusoidal components with fundamental frequencies varying between 1 and 2 Hz (equivalent to a pulse rate range of 60-120 beats per minute). Signal amplitudes were deliberately kept small, with peak-to-peak variations limited to two digital intensity units (i.e. roughly 2% of scene intensity) to reflect the low-amplitude nature of rPPG signals compared to overall scene brightness. The aim of this test was to examine the ability of the camera sensor and manufacturer’s proprietary ISP to preserve the shape of individual pulses, and to understand whether the morphology was affected by camera processing artifacts such as attenuation, temporal distortion or nonlinear effects. Examples of the predefined pulse waveforms are shown in Figure 2.
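A waveform of this kind can be sketched as follows. The exact harmonic mix of the test signals is not specified in the text, so the component amplitudes below are illustrative assumptions; only the fundamental frequency range (1-2 Hz) and the two-unit peak-to-peak constraint come from the description above.

```python
import math

def pulse_waveform(t, f0=1.0, baseline=128.0, pk_pk=2.0):
    """Pulse-like waveform built from sinusoidal components (sketch).
    f0 is the fundamental in Hz (1-2 Hz corresponds to 60-120 bpm);
    the harmonic amplitudes are illustrative, not the paper's values."""
    s = (math.sin(2 * math.pi * f0 * t)
         + 0.5 * math.sin(2 * math.pi * 2 * f0 * t)
         + 0.2 * math.sin(2 * math.pi * 3 * f0 * t))
    # scale so the peak-to-peak excursion stays within pk_pk intensity units
    return baseline + pk_pk * s / (2 * (1 + 0.5 + 0.2))

# one second of signal sampled at the 67 Hz LED update rate
samples = [pulse_waveform(i * 0.015) for i in range(67)]
assert max(samples) - min(samples) <= 2.0  # two digital intensity units
```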
The second test uses a frequency sweep from 0.5 to 8 Hz. This procedure is performed across a range of fixed brightness levels to evaluate the consistency of the camera's temporal response across different operating points, for example different exposure settings, gain settings or brightness-related image processing. By repeating the test at multiple brightness levels, the stability of the camera sensor response is captured under conditions that may trigger changes in the camera's behaviour. Because vital sign measurements with Lifelight span 40 seconds, the sweep is designed to run over 35 seconds, allowing a brightness change to occur around the sweep itself. A graph showing the frequency sweep part of the signal with the corresponding frequency is shown in Figure 3.
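The sweep can be sketched as a linear chirp, with the phase obtained by integrating the instantaneous frequency. The parameters mirror those stated above (0.5-8 Hz over 35 s); the function name itself is our own, not from the system's code.

```python
import math

def sweep_sample(t, f_start=0.5, f_end=8.0, duration=35.0, amp=1.0):
    """One sample of a linear frequency sweep (chirp) at time t.
    The phase is the time integral of the instantaneous frequency,
    which rises linearly from f_start to f_end over the duration."""
    k = (f_end - f_start) / duration             # sweep rate, Hz per second
    phase = 2 * math.pi * (f_start * t + 0.5 * k * t * t)
    return amp * math.sin(phase)

# instantaneous frequency at the midpoint is halfway between the endpoints
t_mid = 35.0 / 2
f_mid = 0.5 + (8.0 - 0.5) / 35.0 * t_mid
assert abs(f_mid - 4.25) < 1e-9
```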
To use a camera for rPPG signal capture it is important that the behaviour is not situation dependent. Combined, these tests enable evaluation of the camera’s suitability to be used to capture rPPG by analysing the robustness of the camera’s response across different frequencies and brightness levels.

2.4. Measurement Protocol

All measurements are taken using the rig shown in Figure 1 in an enclosed environment, avoiding interference of any light sources other than the light emitted by the LEDs in the setup.
For the waveform morphology test, signals were generated at three distinct PR frequencies corresponding to physiologically plausible heart rates: 60 bpm, 90 bpm and 120 bpm. These were chosen to span a range of resting and elevated heart rates and, most importantly, to assess whether pulse morphology is maintained when a different number of frames is recorded per pulse. For each PR, multiple recordings were acquired under identical conditions to evaluate repeatability and assess variability in the captured pulse waveform morphology.
For the frequency sweep tests, signals were generated with a fixed modulation amplitude as shown in Figure 3, but the measurements were acquired across a predefined range of initial brightness settings. The available brightness range is constrained by the camera control implemented within Lifelight, enforcing specific exposure and brightness criteria prior to recording. Signal acquisition only starts when these conditions are satisfied. As a result, the set of brightness levels at which measurements can be started can vary between devices, reflecting the differences in camera hardware and exposure control. This constraint is necessary as it maintains consistent signal quality by reducing noise amplification, clipping and exposure related artifacts that could compromise the rPPG signal.
While the input wave is known and the measured signals of the DUT could be compared directly against it, we have chosen to use a reference device (RD) for comparison instead. An iPad 8th Gen was selected as the RD due to its use in Lifelight's previous data collection and algorithm development studies [15]. All baselines measured on the RD were repeated on three identical models to characterise baseline variability and establish acceptance thresholds. For this use case it is particularly relevant that a DUT's camera response is consistent with that of the RD, enabling the use of rPPG algorithms trained on data acquired with the RD.

2.5. Signal Processing and Evaluation Metrics

2.5.1. Time Domain Morphology Comparison

Analysis of the pulse morphology was performed to assess the ability of the camera sensor and ISP to preserve the structure of pulse-like signals. Two criteria were used to assess the DUT’s suitability: the waveform similarity relative to the RD and the temporal consistency of the detected pulses.
Lifelight’s pulse detection algorithm was used to find individual pulse peaks and individual pulse durations [16]. Due to the setup’s stable signal frequency, the expected pulse duration is known based on the input PR. This expected duration was compared to the measured individual pulse durations in the recorded signal, providing a direct assessment of frame rate consistency and temporal stability, because deviations between expected and observed pulse durations indicate timing irregularities or distortion introduced by the ISP. Consistent frame timing is essential for a camera used as an rPPG sensor, as variations in frame rate could alter the apparent PR and compromise the reliability of morphology derived physiological features.
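The timing check described above can be sketched as follows. This is an illustrative reimplementation, not the Lifelight pulse detection algorithm itself, and it assumes pulse peak positions are already available as frame indices:

```python
def timing_deviations(peak_frames, pr_bpm, fps=30):
    """Deviation (in frames) of each measured pulse duration from the
    duration expected at the known input pulse rate (sketch only)."""
    expected = fps * 60.0 / pr_bpm  # frames per pulse, e.g. 30 at 60 bpm
    durations = [b - a for a, b in zip(peak_frames, peak_frames[1:])]
    return [d - expected for d in durations]

# peaks detected at a steady 60 bpm input, with one frame arriving late:
devs = timing_deviations([0, 30, 61, 91], pr_bpm=60)
assert devs == [0.0, 1.0, 0.0]  # the late frame shows up as a +1 deviation
```

Non-zero deviations flag exactly the timing irregularities the text describes: dropped or delayed frames, or ISP-induced temporal distortion.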
Morphological similarity of the pulse waveform was evaluated by comparing the individually detected pulses against the reference waveform derived from the RD. To construct the reference waveform, all detected repeated pulses were aligned and averaged to form a representative waveform. Each individual pulse from the DUT was compared to this reference using the fitness score (FS) [17]:
FS = 1 − Σ(p_RD − p_DUT)² / Σ(p_RD − p̄_RD)²        (1)

where p_RD is the reference waveform, p̄_RD the mean of the reference waveform and p_DUT the individual pulse being compared against the reference. To set a threshold for what is deemed a tolerable variation in FS, the effects of quantisation imposed by the camera frame rate were considered. The reference waveform was used to generate a worst-case quantisation misalignment scenario, using a temporal offset of half a frame (1/60th of a second at a 30 fps capture rate) and a mismatch of two frames in pulse duration due to differences in the located start and end of the waveform. The FS was computed between the reference waveform and the offset waveform, and the resulting value was used to set a lower-bound threshold for acceptable morphological similarity. FS values below the threshold indicate distortion introduced by the camera sensor or ISP that the rPPG algorithm might not be able to tolerate.
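A minimal sketch of the fitness score in Equation (1), assuming the reference and DUT pulses have already been aligned and resampled to equal length:

```python
def fitness_score(p_rd, p_dut):
    """Fitness score of a DUT pulse against the reference waveform:
    1 minus the ratio of the residual sum of squares to the variance
    of the reference (sketch; assumes equal-length, aligned pulses)."""
    mean_rd = sum(p_rd) / len(p_rd)
    residual = sum((r - d) ** 2 for r, d in zip(p_rd, p_dut))
    variance = sum((r - mean_rd) ** 2 for r in p_rd)
    return 1 - residual / variance

# identical pulses score exactly 1; any deviation lowers the score
assert fitness_score([1, 2, 3, 2], [1, 2, 3, 2]) == 1.0
```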

2.5.2. Frequency Response

The frequency response of the camera was evaluated using the frequency sweep test by analysing the amplitude of the recorded signal as a function of the applied modulation frequency. The consistency of the measured amplitude across frequencies indicates the stability of the camera's temporal response to varying input stimuli. In an ideal system this response would be flat, since the amplitude of the input wave is fixed; in practical ISP pipelines, however, a decay is typically observed because quantisation and the exposure duration of the sensor together act as a low-pass filter. Additionally, ISP pipelines incorporate temporal noise reduction filters, leading to a reduction in gain at higher frequencies. This is undesirable for rPPG extraction because the morphological detail of the signal carries information that correlates with vital signs such as BP [12,18,19,20,21]. Characterising this behaviour is further complicated by variation between manufacturers and individual devices, as the ISP algorithms are usually kept proprietary. Although the amplitude of the input modulation is held constant, the measured response may vary with the initial brightness level and exposure settings used during acquisition. To allow meaningful comparison across repeated measurements and different DUTs, the frequency response was therefore normalised with respect to the measured amplitude at 1 Hz.
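The normalisation step can be sketched as follows; the helper name is hypothetical, and the amplitude at the bin nearest 1 Hz is used as the reference:

```python
def normalise_response(freqs, amps, ref_freq=1.0):
    """Normalise a measured frequency response by its amplitude at the
    reference frequency (1 Hz here, as in the text), so that responses
    from different devices and brightness levels can be compared."""
    # amplitude at the frequency bin closest to the reference frequency
    ref_amp = min(zip(freqs, amps), key=lambda fa: abs(fa[0] - ref_freq))[1]
    return [a / ref_amp for a in amps]

resp = normalise_response([0.5, 1.0, 4.0, 8.0], [1.1, 1.0, 0.7, 0.4])
assert resp[1] == 1.0          # unity gain at the 1 Hz reference
assert resp[3] < resp[2]       # attenuation grows at higher frequencies
```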

3. Results

3.1. Baseline Measurements

Using the iPad 8 as the RD, repeated measurements were taken for each pulse shape waveform at each of the three tested PRs. In Figure 4, the average created pulse for each pulse shape waveform is shown. The shape is highly repeatable, apart from variation due to quantisation. The average waveforms are used to calculate the FS thresholds based on the worst-case time offset scenario described in Section 2.5.1.

3.1.1. Threshold Setting

Table 1 shows the thresholds for the different tested PR and pulse wave shape combinations. As expected from Equation (1), shorter pulses (i.e. pulses with a higher PR) have a lower realistically achievable FS.

3.1.2. Reproducibility Testing

To assess reproducibility, multiple recordings of each signal were taken under identical conditions. For each recorded signal, the FS of the detected individual pulses against the reference waveform was calculated. Figure 5 shows the distribution of the FS for each measurement. The distribution for the higher PRs is wider, as expected, because of the increased relative uncertainty in pulse length caused by quantisation.
Figure 6 shows the measured frequency response, i.e. the measured attenuation as a function of frequency, based on the sweep tests for the RD. At higher frequencies, the amplitude is reduced compared to that at lower frequencies. However, the response is consistent between different measurements, showing that the training data collected using the RD will have been acquired with a stable sensor and ISP.

3.2. Characterisation and Comparison of Other Devices

Following characterisation of the RD, three additional devices were evaluated: the Samsung A33, iPhone XR and Pixel 10. Figure 7 shows the comparison of waveform morphology for all recorded pulses for the different phones within one of the test signals, indicating good agreement between the reference waveform and the recorded morphology on the DUTs. Results for both pulse shapes and all three PRs for each of the devices are shown in Table 2. For each set of pulses, the mean FS and standard deviation (SD) of all FS values for that measurement are shown, demonstrating consistency as well as achievement of the FS thresholds set in Table 1. While all three devices are in close agreement with the RD, the Samsung A33 shows the smallest variation and the closest overlap, despite being from a different brand.
Figure 8 shows the frequency responses measured for the front cameras of the three DUTs, compared against the mean frequency response of the RD. Across the evaluated frequency range of 0.5-8 Hz, the responses show good consistency with the RD, both in overall shape and in the degree of attenuation observed at higher frequencies. This indicates that the DUTs demonstrate a comparable sensor response to the RD, despite differences in hardware and ISP pipelines.

4. Discussion

4.1. Implications for rPPG Work

The proposed camera validation procedure has potentially important implications for rPPG research and algorithm deployment, particularly in the context of smart-device based home monitoring of vital signs. By explicitly characterising the camera sensor's behaviour, rPPG based algorithms trained on data from the RD can be used with improved confidence on different consumer devices, as long as the camera passes the validation protocol.

4.2. Limitations

Several limitations of the proposed approach should be acknowledged. While the validation method allows a wider range of devices to use an rPPG based application as a medical device, the procedure requires physical access to each device and does not currently include an approach for handling cameras that fail the acceptance criteria, limiting its applicability across the full range of consumer devices. The mechanical setup is technically reproducible, but not necessarily scalable, and currently requires manual intervention for device placement, configuration of each test setup and the analysis.

4.3. Future Work

Future work will focus on improving the scalability of the camera sensor validation system. This could include streamlining the mechanical setup, automating the taking of multiple measurements and analysis, and expanding the range of test conditions to reflect more diverse real-world applications. Further investigation into camera sensor behaviour and ISPs might enable application-level optimisation allowing a greater proportion of devices to be validated without the need for hardware changes.
Additionally, a more detailed investigation into the Samsung A33’s similarity to the RD is warranted. Any given consumer smart device will eventually no longer receive updates and support. A high degree of similarity between an RD and a DUT offers the opportunity to further investigate how specific camera characteristics impact the sensor response, as well as the opportunity to replace the RD in future studies if the device has become unsupported.

5. Conclusions

This work presents a reproducible system for validation of the camera sensor response in the context of rPPG. By focusing on sensor-level characterisation instead of rPPG algorithm performance alone, the proposed validation system addresses a gap in the generalisability of smart-device based rPPG algorithms for vital sign prediction such as BP and PR. This validation is particularly important for applications intended for medical software, where consistent signal acquisition across a wide range of consumer devices is essential.

6. Patents

A patent application has been filed related to the work described in this manuscript.

Author Contributions

Conceptualization, I.V., L.v.P. and S.W.; methodology, L.v.P. and S.W.; software, A.A., I.V. and L.v.P.; validation, I.V. and L.v.P.; formal analysis, I.V. and L.v.P.; investigation, A.A. and A.M.; data curation, A.A., A.M., I.V. and L.v.P.; writing—original draft preparation, L.v.P.; writing—review and editing, all authors; visualization, L.v.P.; supervision, L.v.P. and S.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author(s).

Acknowledgments

The authors would like to thank the engineering team at Xim Ltd for their help with the camera algorithm development and data collection. We are also grateful to David Petronzio and Gauri Misra for their help in proofreading the final manuscript.

Conflicts of Interest

All authors are employed by Xim Ltd.

Abbreviations

The following abbreviations are used in this manuscript:
BP Blood Pressure
DUT Device Under Test
FS Fitness Score
ISP Image Signal Processing
LED Light Emitting Diode
PPG Photoplethysmography
PR Pulse Rate
rPPG Remote Photoplethysmography
RD Reference Device
SD Standard Deviation

References

  1. Verkruysse, W.; Svaasand, L.O.; Nelson, J.S. Remote plethysmographic imaging using ambient light. Optics Express 2008, 16, 21434–21445.
  2. Kim, B.S.; Yoo, S.K. Motion artifact reduction in photoplethysmography using independent component analysis. IEEE Transactions on Biomedical Engineering 2006, 53, 566–568.
  3. Yao, J.; Warren, S. A short study to assess the potential of independent component analysis for motion artifact separation in wearable pulse oximeter signals. In Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference; IEEE, 2006; pp. 3585–3588.
  4. Shahmirzadi, D.; Rahmani, A.M.; Wang, W.F. Hybrid approach to heart rate estimation: comparing Green, CHROM and POS methods in rPPG analysis. In Proceedings of the International Workshop on Advanced Imaging Technology (IWAIT) 2025; SPIE 2025, Vol. 13510, pp. 77–81.
  5. Khaleel Sallam Ma’aitah, M.; Helwan, A. 3D DenseNet with temporal transition layer for heart rate estimation from real-life RGB videos. Technology and Health Care 2025, 33, 419–430.
  6. Haugg, F.; Elgendi, M.; Menon, C. Effectiveness of remote PPG construction methods: a preliminary analysis. Bioengineering 2022, 9, 485.
  7. Yu, Z.; Li, X.; Zhao, G. Facial-video-based physiological signal measurement: Recent advances and affective applications. IEEE Signal Processing Magazine 2021, 38, 50–58.
  8. Lu, Y.; Wang, C.; Meng, M.Q.H. Video-based contactless blood pressure estimation: A review. In Proceedings of the 2020 IEEE International Conference on Real-time Computing and Robotics (RCAR), 2020; pp. 62–67.
  9. Mironenko, Y.; Kalinin, K.; Kopeliovich, M.; Petrushan, M. Remote photoplethysmography: Rarely considered factors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2020; pp. 296–297.
  10. Pointer, M.R.; Attridge, G.G.; Jacobson, R.E. Practical camera characterization for colour measurement. The Imaging Science Journal 2001, 49, 63–80.
  11. Mullikin, J.C.; van Vliet, L.J.; Netten, H.; Boddeke, F.R.; Van der Feltz, G.; Young, I.T. Methods for CCD camera characterization. In Proceedings of Image Acquisition and Scientific Imaging Systems; SPIE 1994, Vol. 2173, pp. 73–84.
  12. van Putten, L.D.; Bamford, K.E.; Veleslavov, I.; Wegerif, S. From video to vital signs: using personal device cameras to measure pulse rate and predict blood pressure using explainable AI. Discover Applied Sciences 2024, 6, 184.
  13. Wang, J.; Shan, C.; Liu, Z.; Zhou, S.; Shu, M. Physiological information preserving video compression for rPPG. IEEE Journal of Biomedical and Health Informatics 2025.
  14. Procka, P.; Borik, S. System for contactless monitoring of tissue perfusion. In Proceedings of the 2022 ELEKTRO; IEEE, 2022; pp. 1–5.
  15. Wiffen, L.; Brown, T.; Maczka, A.B.; Kapoor, M.; Pearce, L.; Chauhan, M.; Chauhan, A.J.; Saxena, M.; Group, L.T.; et al. Measurement of vital signs by Lifelight software in comparison to standard of care multisite development (VISION-MD): protocol for an observational study. JMIR Research Protocols 2023, 12, e41533.
  16. van Putten, L.D.; Ahmed, A.; Wegerif, S. Remote photoplethysmography for contactless pulse rate monitoring: algorithm development and accuracy assessment. Physiological Measurement 2025, 46, 115004.
  17. Zahedi, E.; Sohani, V.; Ali, M.M.; Chellappan, K.; Beng, G.K. Experimental feasibility study of estimation of the normalized central blood pressure waveform from radial photoplethysmogram. Journal of Healthcare Engineering 2015, 6, 121–144.
  18. Takazawa, K. Clinical usefulness of the second derivative of a plethysmogram (acceleration plethysmogram). Journal of Cardiology 1993, 23, 207–217.
  19. Imanaga, I.; Hara, H.; Koyanagi, S.; Tanaka, K. Correlation between wave components of the second derivative of plethysmogram and arterial distensibility. Japanese Heart Journal 1998, 39, 775–784.
  20. Kurylyak, Y.; Lamonaca, F.; Grimaldi, D. A neural network-based method for continuous blood pressure estimation from a PPG signal. In Proceedings of the 2013 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), 2013; pp. 280–283.
  21. Tigges, T.; Pielmuş, A.; Klum, M.; Feldheiser, A.; Hunsicker, O.; Orglmeister, R. Model selection for the pulse decomposition analysis of fingertip photoplethysmograms. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2017; pp. 4014–4017.
Figure 1. Photo of the 3D printed rig, showing the setup for device validation.
Figure 2. Ten second snippets of two pulse-like waveforms used as known waveform input at three different pulse rates.
Figure 3. Graphs of the input signal (top) for the sweep test and the frequency as a function of time (bottom).
Figure 4. Measured pulses for the different waveforms with the average shown in black.
Figure 5. Density histograms for each of the reference waveforms of repeated measurements, showing a repetitive behaviour for each signal.
Figure 6. Measured frequency responses for RD with the average calculated response.
Figure 7. Comparison of individual pulses for different phones against the reference wave for pulse shape 1 with a PR of 60bpm.
Figure 8. Comparison of measured frequency responses for all 3 devices under test compared against the mean frequency response of the reference device.
Table 1. Calculated fitness score thresholds for the different pulse rates and waveform shapes used.
Pulse Rate (bpm) Pulse Wave Shape 1 Pulse Wave Shape 2
60 0.81 0.82
90 0.71 0.72
120 0.61 0.62
Table 2. Overview of fitness scores for the different phones and different test signals, reporting the mean and standard deviation of all calculated scores for each test signal.
Pulse Rate (bpm)  Pulse Shape  Samsung A33 (mean ± SD)  iPhone XR (mean ± SD)  Pixel 10 (mean ± SD)
60 1 0.94 ± 0.01 0.88 ± 0.05 0.89 ± 0.04
90 1 0.84 ± 0.01 0.86 ± 0.05 0.85 ± 0.06
120 1 0.86 ± 0.02 0.84 ± 0.03 0.84 ± 0.06
60 2 0.88 ± 0.03 0.86 ± 0.05 0.88 ± 0.04
90 2 0.91 ± 0.01 0.85 ± 0.04 0.84 ± 0.06
120 2 0.79 ± 0.01 0.74 ± 0.05 0.79 ± 0.06
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
