REVIEW | doi:10.20944/preprints202105.0056.v2
Subject: Medicine & Pharmacology, General Medical Research Keywords: artificial intelligence; machine learning; deep learning; neural networks; biomedicine; healthcare; medicine; literature; PubMed; Embase
Online: 9 June 2021 (11:23:48 CEST)
Artificial Intelligence (AI) refers to the simulation of human intelligence in machines, using machine learning, deep learning and neural networks. AI enables machines to learn from experience and perform human-like tasks. The field of AI research has developed rapidly over the past five to ten years, driven by the rise of 'big data' and increasing computing power. In medicine, AI can be used to improve diagnosis, prognosis, treatment, surgery and drug discovery, among other applications. Both academia and industry are therefore investing heavily in AI. This review investigates the biomedical literature (in the PubMed and Embase databases) by examining bibliographical data, observing trends over time and occurrences of keywords. Several observations emerge: the number of AI publications has grown exponentially over the past few years; AI is used mostly for diagnosis; COVID-19 is already among the top five diseases studied using AI; the United States, China, the United Kingdom, South Korea and Canada publish the most articles in AI research; MIT is the world's leading university in AI research; and convolutional neural networks are by far the most popular deep learning algorithms at present. These trends could be studied in more detail by including additional literature databases or patent databases, and more advanced analyses could be used to predict in which directions AI will develop over the coming years. The expectation is that AI will keep growing, despite stricter privacy laws, a greater need for standardization, bias in the data, and the need to build trust.
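The trend analysis described above can be approximated with NCBI's public E-utilities interface, which exposes PubMed hit counts per query (Embase has no comparable free API). The sketch below only builds the date-restricted `esearch` URLs, one per publication year; fetching each URL and parsing the returned `<Count>` element would yield the yearly publication counts. The query string used here is an illustrative assumption, not the review's actual search strategy.

```python
from urllib.parse import urlencode

# Base endpoint of NCBI's E-utilities esearch service.
EUTILS_ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def yearly_count_url(term: str, year: int) -> str:
    """Build an esearch URL that returns only the PubMed hit count
    for `term`, restricted to a single publication year."""
    params = {
        "db": "pubmed",
        "term": term,
        "rettype": "count",   # return the count only, not the ID list
        "datetype": "pdat",   # filter on publication date
        "mindate": str(year),
        "maxdate": str(year),
    }
    return f"{EUTILS_ESEARCH}?{urlencode(params)}"

# One URL per year; fetching each and reading <Count> gives the trend line.
urls = [yearly_count_url('"artificial intelligence"[Title/Abstract]', y)
        for y in range(2011, 2021)]
print(urls[0])
```

Plotting the resulting counts against year is what reveals the exponential growth the review reports.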
REVIEW | doi:10.20944/preprints202003.0141.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: data sharing; data management; data science; big data; healthcare
Online: 8 March 2020 (16:46:20 CET)
In recent years, more and more health data are being generated. These data come not only from professional health systems but also from wearable devices. Combined, these data form 'big data' that can be used to optimize treatment for each individual patient ('precision medicine'). Achieving precision medicine requires hospitals, academia and industry to work together to bridge the 'valley of death' of translational medicine. However, hospitals and academic institutions often have difficulty sharing their data, even though patients are the actual owners of their own health data and data sharing is associated with higher citation rates. Academic hospitals typically invest considerable time in setting up clinical trials and collecting data, and want to be the first to publish papers on those data. The idea that society benefits most when patient data are shared as soon as possible, so that other researchers can work with them, has not yet taken root. Some datasets are publicly available, but these are usually shared only after studies are finished and/or publications based on the data have been written, delaying reuse by months or even years. One solution is to incentivize hospitals to share their data with other academic institutes and with industry. Here we discuss several aspects of data sharing in the medical domain: publisher requirements, data ownership, support for data sharing, data-sharing initiatives, and how federated data might offer a solution. We also discuss potential future developments around data sharing.