Submitted: 03 July 2025
Posted: 03 July 2025
Abstract
Keywords:
1. Introduction
2. Materials and Methods
2.1. Datasets
2.2. Filtering Techniques
2.3. Dataset Splitting
2.4. DL Implementation
2.5. Training Loss Across Scenarios
2.6. Testing Methods
3. Results
3.1. Performance Across Scenarios: Training Phase
3.2. Performance Across Scenarios: Testing Phase
3.3. Detailed Performance of Scenario 3 (S3)
3.4. Inference Time per Video (Real-Time Detection)
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A



Appendix B



Appendix C



References
| Healthy Patients (H) | Non-healthy Patients (NH) | Training Set (% / no.) | Validation Set (% / no.) | Testing Set, Unseen (% / no.) | Total Videos/Patients |
|---|---|---|---|---|---|
| 33 | 39 | ≈78% / 56 (26H + 26NH) | ≈6% / 4 (2H + 2NH) | ≈17% / 12 (3H + 9NH) | 72 |
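The video counts above correspond to a ≈78%/6%/17% train/validation/test partition of the 72 videos. A minimal sketch of such a split is shown below; note this only reproduces the reported counts (56/4/12), while the paper's actual split is additionally stratified by health status, which this sketch does not implement.

```python
import random

def split_videos(video_ids, train_n=56, val_n=4, seed=0):
    """Shuffle video IDs and slice them into train/validation/test sets.

    Defaults reproduce the 56/4/12 (~78%/6%/17%) split of 72 videos
    reported in the table above. Stratification by patient health
    status is NOT implemented here.
    """
    ids = list(video_ids)
    random.Random(seed).shuffle(ids)  # deterministic shuffle for reproducibility
    train = ids[:train_n]
    val = ids[train_n:train_n + val_n]
    test = ids[train_n + val_n:]
    return train, val, test
```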
| Scenario | Mean Accuracy | 95% Confidence Interval | Compared To | Mean Difference | p-value | Cohen's d | Effect Size Interpretation |
|---|---|---|---|---|---|---|---|
| Scenario 1 | 0.577 | [0.569, 0.584] | S2 | 0.127 | *** | 9.69 | Extremely large |
| Scenario 2 | 0.704 | [0.693, 0.715] | S3 | 0.104 | *** | 8.54 | Extremely large |
| Scenario 3 | 0.808 | [0.802, 0.814] | S1 | 0.231 | *** | 24.10 | Extremely large |
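Statistics of the kind tabulated above (a 95% confidence interval on mean accuracy and a pooled-standard-deviation Cohen's d between scenarios) can be computed from per-run accuracies with textbook formulas. A minimal standard-library sketch follows; the normal-approximation (1.96 × SE) interval is an assumption on our part, as the paper may use a t-based interval instead.

```python
import math
import statistics

def cohens_d(a, b):
    """Cohen's d between two samples, using the pooled standard deviation."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

def mean_ci95(x):
    """Mean with a normal-approximation 95% CI (mean +/- 1.96 * standard error)."""
    m = statistics.mean(x)
    se = statistics.stdev(x) / math.sqrt(len(x))
    return m, (m - 1.96 * se, m + 1.96 * se)
```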
| Performance metrics | Accuracy | Specificity | Precision | Recall | F1-score |
|---|---|---|---|---|---|
| Scenario 1 (S1) | 50% | 20% | 56% | 71% | 63% |
| Scenario 2 (S2) | 92% | 100% | 100% | 90% | 95% |
| Scenario 3 (S3) | 92% | 100% | 100% | 90% | 95% |
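The metrics in the table above follow the standard confusion-matrix definitions. A minimal sketch is given below; the example counts in the usage note are illustrative assumptions consistent with a 12-video test set (3 healthy, 9 non-healthy), not values taken from the paper.

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard binary classification metrics from confusion-matrix counts.

    tp/fp/tn/fn: true/false positive and true/false negative counts.
    Recall is equivalently the sensitivity.
    """
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, specificity, precision, recall, f1
```

For instance, with the non-healthy class as positive, illustrative counts of TP = 8, FP = 0, TN = 3, FN = 1 on a 12-video test set yield accuracy ≈ 92%, specificity and precision of 100%, recall ≈ 89%, and F1 ≈ 94%, close to the Scenario 2/3 row above after rounding.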
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).