Preprint · Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Use of Deep Learning to Detect the Maternal Heart Rate and False Signals on Fetal Heart Rate Recordings

Version 1 : Received: 7 July 2022 / Approved: 8 July 2022 / Online: 8 July 2022 (05:01:43 CEST)
Version 2 : Received: 21 July 2022 / Approved: 22 July 2022 / Online: 22 July 2022 (03:08:59 CEST)

A peer-reviewed article of this Preprint also exists.

Boudet, S.; Houzé de l’Aulnoit, A.; Peyrodie, L.; Demailly, R.; Houzé de l’Aulnoit, D. Use of Deep Learning to Detect the Maternal Heart Rate and False Signals on Fetal Heart Rate Recordings. Biosensors 2022, 12, 691, doi:10.3390/bios12090691.

Abstract

We have developed deep learning models for the automatic identification of the maternal heart rate (MHR) and, more generally, false signals (FSs) on fetal heart rate (FHR) recordings. The models can be used to preprocess FHR data prior to automated analysis or as a clinical alert system to assist the practitioner. Three models were developed and used to detect (i) FSs on the MHR channel (the FSMHR model), (ii) the MHR and FSs on the Doppler FHR sensor (the FSDop model), and (iii) FSs on the scalp ECG channel (the FSScalp model). The FSDop model was the most important because FSs are far more frequent on the Doppler FHR channel. All three models were based on a multilayer symmetric gated recurrent unit and were trained on data recorded during the first and second stages of delivery. The FSMHR and FSDop models were also trained on antepartum recordings. The training dataset contained 1030 expert-annotated periods (mean duration: 36 min) from 635 recordings. In an initial evaluation of routine clinical practice, 30 fully annotated recordings for each sensor type (mean duration: 5 h for MHR and Doppler sensors, and 3 h for the scalp ECG sensor) were analyzed. The sensitivity, positive predictive value (PPV), and accuracy were 62.20%, 87.1%, and 99.90%, respectively, for the FSMHR model; 93.1%, 95.6%, and 99.68% for the FSDop model; and 44.6%, 87.2%, and 99.93% for the FSScalp model. We built a second test dataset with a more solid ground truth by selecting 45 periods (lasting 20 min, on average) on which the Doppler FHR and scalp ECG signals were recorded simultaneously. Using the scalp ECG data, the experts estimated the true FHR value more reliably and thus annotated the Doppler FHR channel more precisely. The models achieved a sensitivity of 53.3%, a PPV of 62.4%, and an accuracy of 97.29%.
In comparison, two experts (blinded to the scalp ECG data) achieved a sensitivity of 15.7%, a PPV of 74.3%, and an accuracy of 96.91% for expert 1, and a sensitivity of 60.7%, a PPV of 83.5%, and an accuracy of 98.24% for expert 2. The model thus performed better than one expert and worse than the other; overall, the models performed at expert level, although a well-trained expert with good knowledge of FSs could probably do better in some cases. The models and datasets have been included in the Fetal Heart Rate Morphological Analysis open-source MATLAB toolbox and can be used freely for research purposes.
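The evaluation above reports three per-sample metrics on binary false-signal masks: sensitivity, positive predictive value (PPV), and accuracy. As a minimal illustrative sketch (not the paper's own MATLAB evaluation code, and `fs_detection_metrics` is a hypothetical helper name), these can be computed from a ground-truth mask and a predicted mask as follows:

```python
def fs_detection_metrics(truth, pred):
    """Per-sample sensitivity, PPV, and accuracy for binary
    false-signal masks (1 = false signal, 0 = reliable signal).
    Illustrative sketch only, assuming equal-length sample masks."""
    tp = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 0)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0  # recall of FS samples
    ppv = tp / (tp + fp) if tp + fp else 0.0          # precision of FS calls
    accuracy = (tp + tn) / len(truth)                 # overall agreement
    return sensitivity, ppv, accuracy

# Toy example: 10 samples, 3 truly false-signal, 2 of them detected.
truth = [0, 0, 1, 1, 1, 0, 0, 0, 0, 0]
pred  = [0, 0, 1, 1, 0, 0, 1, 0, 0, 0]
sens, ppv, acc = fs_detection_metrics(truth, pred)
```

Because FS samples are rare relative to the full recording, accuracy is close to 100% even for weak detectors, which is why sensitivity and PPV are the more informative figures in the abstract.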

Supplementary and Associated Material

https://github.com/utsb-fmm/FHRMA: Source code and datasets

Keywords

fetal heart rate; maternal heart rate; cardiotocogram; gated recurrent unit; deep learning

Subject

Computer Science and Mathematics, Computer Science
