Artificial Intelligence (AI) and Machine Learning

Reminder: Please bear in mind that these are early-stage research papers that have not gone through a rigorous peer-review process, and they should not be regarded as conclusive clinical guidance or reported in the news media as established fact.

Explore the Artificial Intelligence (AI) and Machine Learning collection on Preprints.org for a snapshot of the latest research, spanning from novel algorithms to transformative applications, showcasing visionary research shaping the future of AI.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Yusuff Giwa, Taiwo Akinmuyisitan

Abstract: Digital twins are virtual representations of physical assets that remain continuously synchronized with their real-world counterparts. This paper focuses on adaptive digital twins—models that incorporate online learning mechanisms to update their states and parameters in real time. We present a rigorous mathematical framework for adaptive digital twins, including formal definitions and data-driven update mechanisms. A theoretical convergence analysis establishes conditions under which the twin's state asymptotically tracks the physical system. We further explore theoretical performance limits, analyzing the effects of model uncertainties, noise, and computational constraints on accuracy. The paper follows academic best practices, providing detailed pseudocode for the adaptation algorithm, convergence proofs via Lyapunov stability theory, and supporting figures and tables. Our results show that, with proper design, adaptive digital twins can achieve stable and provably convergent tracking, ensuring reliable deployment in complex cyber-physical systems.
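The predict-and-correct synchronization the abstract describes can be sketched in a few lines. The observer-style rule below is an illustrative stand-in, not the paper's actual algorithm; the dynamics `f`, measurement map `h`, and gain value are hypothetical toy choices.

```python
# Illustrative observer-style update for an adaptive digital twin.
# f, h, and the gain are hypothetical; the paper's algorithm and its
# Lyapunov-based convergence conditions are more general.

def update_twin(x_hat, y_meas, f, h, gain):
    """One synchronization step: predict from the model, then correct
    the twin state toward the physical measurement y_meas."""
    x_pred = f(x_hat)                  # model-based prediction
    innovation = y_meas - h(x_pred)    # mismatch with the real asset
    return x_pred + gain * innovation  # correction shrinks tracking error

# Toy scalar system: the twin tracks a decaying physical state.
f = lambda x: 0.9 * x
h = lambda x: x
x_true, x_hat = 5.0, 0.0
for _ in range(50):
    x_true = f(x_true)                           # physical system evolves
    x_hat = update_twin(x_hat, x_true, f, h, gain=0.5)
print(abs(x_true - x_hat) < 1e-3)  # tracking error has converged
```

With this choice of gain the tracking error contracts geometrically each step, the scalar analogue of the asymptotic tracking result the abstract refers to.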
Review
Medicine and Pharmacology
Other

Quoc-Viet Pham, Dinh C. Nguyen, Thien Huynh-The, Won-Joo Hwang, Pubudu N. Pathirana

Abstract: The first case of the novel coronavirus disease (COVID-19) was identified in Hubei, China, in December 2019. The COVID-19 pandemic has spread to over 215 countries and territories worldwide and has significantly affected every aspect of our daily lives. At the time of writing this article, the numbers of infected cases and deaths still increase significantly and show no sign of a well-controlled situation; e.g., as of 14 April 2020, a cumulative total of 1,853,265 infected (118,854 dead) COVID-19 cases were reported worldwide. Motivated by recent advances and applications of artificial intelligence (AI) and big data in various areas, this paper aims at emphasizing their importance in responding to the COVID-19 outbreak and preventing its severe effects. We first present an overview of AI and big data, then identify their applications in fighting COVID-19, next highlight challenges and issues associated with state-of-the-art solutions, and finally offer recommendations for the community to effectively control the COVID-19 situation. It is expected that this paper provides researchers and communities with new insights into the ways AI and big data can improve the COVID-19 situation, and that it drives further studies on stopping the outbreak.
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Matthew T.J. Halma, Cristof Plothe, Theresa Lawrie

Abstract: In the wake of the Covid-19 crisis, a need has arisen to prevent and treat two related conditions, Covid vaccine injury and long Covid, both of which have a significant vascular component. The management of these conditions therefore requires strategies to prevent or dissolve blood clots and restore circulatory health. This review summarizes the evidence on strategies that can be applied to treat both long Covid and vaccine injuries based on their similar mechanisms of action.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Shadman Sakib, Nazib Ahmed, Ahmed Jawad Kabir, Hridon Ahmed

Abstract: With the rise of the Artificial Neural Network (ANN), machine learning has taken a forceful turn in recent times. One of the most notable ANN architectures is the Convolutional Neural Network (CNN), a technology that combines artificial neural networks with up-to-date deep learning strategies, and it sits at the center of spectacular advances in deep learning. This artificial neural network has been applied to several image recognition tasks for decades and has attracted the attention of researchers in many countries in recent years, as the CNN has shown promising performance in several computer vision and machine learning tasks. This paper describes the underlying architecture and various applications of the Convolutional Neural Network.
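The operation that gives the CNN its name can be shown concretely. Below is a minimal, pure-Python 2-D convolution (strictly, cross-correlation, as most deep learning frameworks implement it) of the kind each CNN layer applies; real networks stack many learned kernels with nonlinearities and pooling, which this sketch omits.

```python
# Minimal 2-D convolution: the core operation of a CNN layer.

def conv2d(image, kernel):
    """Valid (no-padding) 2-D cross-correlation of image with kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge kernel responds strongly where intensity jumps.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge_kernel = [[-1, 1],
               [-1, 1]]
print(conv2d(image, edge_kernel))  # peaks at the 0 -> 1 boundary
```

In a trained CNN the kernel weights are not hand-written like this edge detector; they are learned by gradient descent, which is what lets the network discover useful image features on its own.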
Brief Report
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Anis Koubaa

Abstract: As the echoes of ChatGPT's remarkable success continue to permeate the AI community, its formidable successor, GPT-4, has emerged, showing off a wealth of novel features. In this concise paper, we elucidate the capabilities of GPT-4 and conduct a comparative analysis with its predecessor, ChatGPT, offering insights into their relative strengths and advancements in the rapidly evolving field of generative AI. We also present a comprehensive summary of the major performance results reported by OpenAI on GPT-4 across various Natural Language Processing (NLP) tasks. We place great emphasis on the innovative advancements offered by GPT-4 in comparison to its predecessors. Our focus is on highlighting its remarkable performance while also mentioning its limitations. The purpose of this paper is to deliver a succinct understanding of the new features and performance benchmarks of GPT-4.
Article
Engineering
Energy and Fuel Technology

Amir Mosavi, Abdullah Bahmani

Abstract: Machine learning (ML) methods have recently contributed greatly to the advancement of the prediction models used for energy consumption. Such models substantially improve the accuracy, robustness, precision, and generalization ability of conventional time-series forecasting tools. This article reviews the state of the art of machine learning models used in the general application of energy consumption. Through a novel search and taxonomy, the most relevant literature in the field is classified according to the ML modeling technique, energy type, prediction type, and application area. A comprehensive review of the literature identifies the major ML methods and their applications, with a discussion evaluating their effectiveness in energy consumption prediction. This paper further draws conclusions about the trend and effectiveness of the ML models. As a result, this research reports a marked rise in accuracy and the ever-increasing performance of prediction technologies using novel hybrid and ensemble prediction models.
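As a concrete baseline for the forecasting family the review surveys, here is the simplest data-driven predictor of that kind: an ordinary-least-squares fit on a one-step lag of a consumption series. The series is synthetic and the model is illustrative, not one drawn from the reviewed literature; the hybrid and ensemble models the review highlights replace this single linear map with many combined learners.

```python
# Lag-1 autoregressive baseline for energy-consumption forecasting.

def fit_ar1(series):
    """Least-squares fit of y_t = a * y_(t-1) + b."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

consumption = [100, 110, 121, 133.1, 146.41]  # synthetic 10% growth per period
a, b = fit_ar1(consumption)
print(round(a * consumption[-1] + b, 1))      # one-step-ahead forecast
```

On this perfectly geometric toy series the fit recovers the growth factor exactly (a = 1.1, b = 0); real consumption data is noisy and seasonal, which is precisely why the review's more expressive ML models outperform such linear baselines.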
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Konstantine Arkoudas

Abstract: GPT-4 was released in March 2023 to wide acclaim, marking a very substantial improvement across the board over GPT-3.5 (OpenAI's previously best model, which had powered the initial release of ChatGPT). Despite the genuinely impressive improvement, however, there are good reasons to be highly skeptical of GPT-4's ability to reason. This position paper discusses the nature of reasoning; criticizes the current formulation of reasoning problems in the NLP community and the way in which the reasoning performance of LLMs is currently evaluated; introduces a collection of 21 diverse reasoning problems; and performs a detailed qualitative analysis of GPT-4's performance on these problems. Based on the results of this analysis, the paper argues that, despite the occasional flashes of analytical brilliance, GPT-4 at present is utterly incapable of reasoning.
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Shagan Sah

Abstract: In this paper, various machine learning techniques are discussed. These algorithms are used for many applications which include data classification, prediction, or pattern recognition. The primary goal of machine learning is to automate human assistance by training an algorithm on relevant data. This paper should also serve as a collection of various machine learning terminology for easy reference.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Aditi Singh, Abul Ehtesham, Saket Kumar, Tala Talaei Khoei

Abstract: The Model Context Protocol (MCP), recently introduced by Anthropic, proposes a standardized framework for integrating large language models (LLMs) with dynamic external systems. This survey reviews the foundational architecture of MCP, including its client-server model, standardized messaging protocols, dynamic tool discovery, and security mechanisms, against the backdrop of current, fragmented API solutions. Although the protocol promises enhanced interoperability and scalability for Agentic AI systems, data supporting its long-term performance remains limited. MCP’s design is critically evaluated, potential applications in domains such as finance, healthcare, and customer service are discussed, and the key challenges are outlined. This work aims to inform researchers and practitioners of MCP’s potential benefits and its present limitations in the evolving AI integration landscape. The GitHub link for this survey is: Model-Context-Protocol-Survey.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Anurag Sinha

Abstract: This is brief guidance on oxygen sources and delivery strategies for COVID-19 treatment. It has been adapted from WHO and UNICEF's technical specifications and guidance for oxygen therapy devices, which is part of the WHO medical device technical series, and is based on current knowledge of the situation in China and other countries where cases have been identified. This guidance is intended for health facility administrators, clinical decision-makers, procurement officers, planning officers, biomedical engineers, infrastructure engineers, and policy-makers. It describes how to quantify oxygen demand, identify the oxygen sources that are available, and select appropriate surge sources to best respond to COVID-19 patients' needs, particularly in low- and middle-income countries. WHO will update these recommendations as new information becomes available. The COVID-19 pandemic spurred artificial demand for oxygen gas cylinders for medical use, both at hospitals and, curiously, for home use by patients. Some patients, and even healthy people, explore the possibilities and likely benefits of using oxygen from cylinders for private use. However, this is not always safe, and adequate precautions must be taken, failing which there can be fatalities. This paper examines the importance of maintaining oxygen levels adequate for human consumption. It describes the clinical use of oxygen and the advantages and disadvantages of storing oxygen cylinders at home. The study also addresses legal and regulatory aspects. Its findings can help individuals make an informed decision on the safe use of oxygen gas. Further, it warns of the increased importance of regulation and of restricting access and use. This paper aims at designing an oxygen level monitoring technique for an oxygen cylinder.
The amount of oxygen present inside the oxygen cylinder is vital information when the cylinder is in use to supply oxygen to a critical patient. The amount of oxygen is measured via the pressure at the outlet nozzle, using a high-precision MEMS pressure sensor whose output is a voltage on the order of millivolts. An amplifier amplifies this millivolt signal, and a microcontroller in cascade processes the signal and displays the oxygen cylinder's pressure.
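The described signal chain (MEMS sensor millivolts, amplifier, ADC, microcontroller display) amounts to a unit conversion that can be sketched directly. All constants below (sensor sensitivity, amplifier gain, ADC range) are hypothetical placeholders, not values from the paper.

```python
# Hypothetical signal chain: MEMS sensor mV -> amplifier -> ADC counts,
# then inverted in software to recover cylinder pressure.
# All constants are illustrative assumptions.

SENSOR_MV_PER_BAR = 0.2   # assumed sensor sensitivity: 0.2 mV per bar
AMP_GAIN = 1000           # amplifier gain bringing mV into the ADC's V range
ADC_VREF_V = 5.0          # ADC reference voltage
ADC_MAX_COUNTS = 1023     # 10-bit ADC full scale

def counts_to_pressure_bar(adc_counts):
    """Invert the chain: ADC counts -> amplified volts -> sensor mV -> bar."""
    v_amplified = adc_counts * ADC_VREF_V / ADC_MAX_COUNTS
    v_sensor_mv = v_amplified * 1000 / AMP_GAIN
    return v_sensor_mv / SENSOR_MV_PER_BAR

# e.g. a mid-scale ADC reading
print(round(counts_to_pressure_bar(512), 1))
```

In a real device the microcontroller would also calibrate out sensor offset and temperature drift; this sketch shows only the core scaling step.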
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Prafull Sharma, Yingbo Li

Abstract: In this paper we propose a novel self-supervised approach to keyword and keyphrase retrieval and extraction via an end-to-end deep learning model trained on a contextually self-labelled corpus. Our proposed approach is novel in its use of contextual and semantic features to extract keywords, and it has outperformed the state of the art. Experiments show the proposed approach to be better in both semantic meaning and quality than the existing popular keyword-extraction algorithms. In addition, we propose using contextual features from bidirectional transformers to automatically label a short-sentence corpus with keywords and keyphrases to build the ground truth. This process avoids human labelling time and does not need any prior knowledge. To the best of our knowledge, the dataset published in this paper is a fine domain-independent corpus of short sentences with labelled keywords and keyphrases in the NLP community.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Dr. Robert Campbell, Dr. Whitfield Diffie, Charles Robinson

Abstract: The rapid advancements in and merging of hybrid quantum-classical computing, artificial intelligence (AI), machine learning (ML), and deep learning (DL) pose a potentially unseen and significant threat to encryption that may impact Post-Quantum Cryptography (PQC) transition timelines. There is a direct hybrid quantum-classical computing threat to cryptography, and there is also a direct AI/ML threat to cryptography. However, the synergistic combination of these technologies presents known and unknown threats that need attention, focused action, and research. This paper reviews Grover's Adaptive Search (GAS), which combines Grover's Algorithm with adaptive techniques to optimize search further, potentially making it even more efficient for attacking encryption. This work also examines the quantum-accelerated Harrow-Hassidim-Lloyd (HHL) Algorithm, designed to solve systems of linear equations exponentially faster than classical algorithms under certain conditions. The HHL algorithm can solve some lattice-based problems, which has implications for lattice-based encryption. This technological confluence and its potential impact on cryptography and encryption necessitate a proactive and coordinated approach to developing and implementing quantum-resistant AI/ML cryptographic solutions. This paper reviews the technological confluence and its potential implications for classical cryptography and PQC transition timelines, and it calls for further research.
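To put the Grover-based threat in rough numbers: Grover's algorithm searches N possibilities in about (π/4)·√N oracle queries, so a brute-force attack on a k-bit key costs on the order of 2^(k/2) quantum queries rather than 2^k classical trials. A back-of-envelope sketch (not a calculation from the paper, and ignoring the adaptive refinements of GAS):

```python
# Grover query-count estimate: ~(pi/4) * sqrt(N) oracle queries to
# search N = 2**key_bits possibilities.

import math

def grover_queries(key_bits):
    """Approximate oracle queries for Grover search over 2**key_bits keys."""
    return (math.pi / 4) * math.sqrt(2 ** key_bits)

for k in (64, 128, 256):
    print(f"{k}-bit key: ~{grover_queries(k):.2e} queries")
```

This square-root scaling is why symmetric key sizes are commonly doubled (e.g. AES-128 to AES-256) as a quantum countermeasure, whereas the lattice-based schemes the abstract mentions target the separate threat from algorithms like HHL and Shor's.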
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Anis Koubaa, Wadii Boulila, Lahouari Ghouti, Ayyub Alzahem, Shahid Latif

Abstract: ChatGPT, a groundbreaking natural language processing technology released just three months ago, has attracted significant attention due to its remarkable capabilities. This AI milestone has urged researchers, industry, decision-makers, and governments to examine this technology, including its implications, threats, and benefits. Despite the short period since its release, several researchers have examined ChatGPT from different perspectives. This paper presents a comprehensive review of ChatGPT, highlighting its technical novelties compared to previous models and analyzing existing research from various perspectives. We followed a rigorous methodology to conduct a critical review of existing research on ChatGPT and developed a taxonomy for the different areas of study. Additionally, we identify future challenges and research trends associated with ChatGPT. Our paper is the first critical review of ChatGPT literature, providing valuable insights for practitioners and policymakers. This paper serves as a reference for researchers seeking to advance research on ChatGPT, including its applications and development.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Weiwei Cai, Zhanguo Wei

Abstract: The latest methods based on deep learning have achieved amazing results regarding the complex work of inpainting large missing areas in an image. This type of method generally attempts to generate one single "optimal" inpainting result, ignoring many other plausible results. However, considering the uncertainty of the inpainting task, one sole result can hardly be regarded as a desired regeneration of the missing area. In view of this weakness, which is related to the design of the previous algorithms, we propose a novel deep generative model equipped with a brand new style extractor which can extract the style noise (a latent vector) from the ground truth image. Once obtained, the extracted style noise and the ground truth image are both input into the generator. We also craft a consistency loss that guides the generated image to approximate the ground truth. Meanwhile, the same extractor captures the style noise from the generated image, which is forced to approach the input noise according to the consistency loss. After iterations, our generator is able to learn the styles corresponding to multiple sets of noise. The proposed model can generate a (sufficiently large) number of inpainting results consistent with the context semantics of the image. Moreover, we check the effectiveness of our model on three databases, i.e., CelebA, Agricultural Disease, and MauFlex. Compared to state-of-the-art inpainting methods, this model is able to offer desirable inpainting results with both a better quality and higher diversity. The code and model will be made available on https://github.com/vivitsai/SEGAN.
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Abhijit Dasgupta, Abhisek Bakshi, Srijani Mukherjee, Kuntal Das, Soumyajeet Talukdar, Pratyayee Chatterjee, Sagnik Mondal, Puspita Das, Subhrojit Ghosh, Archisman Som, +11 authors

Abstract: The world is now experiencing a major health calamity due to the coronavirus disease (COVID-19) pandemic, caused by the severe acute respiratory syndrome coronavirus clade 2 (SARS-CoV-2). The foremost challenge facing the scientific community is to explore the growth and transmission capability of the virus. The use of artificial intelligence (AI), such as deep learning, in (i) rapid disease detection from x-ray/computerized tomography (CT)/high-resolution computed tomography (HRCT) images, (ii) accurate prediction of the epidemic patterns and their saturation throughout the globe, (iii) identification of the epicenter in each country/state and forecasting the disease from social networking data, (iv) prediction of drug-protein interactions for repurposing drugs, and (v) socio-economic impact and prediction of future relapses, has attracted much attention. In the present manuscript, we describe the role of various AI-based technologies for rapid and efficient detection from CT images, complementing quantitative real-time polymerase chain reaction (qRT-PCR) and immunodiagnostic assays. AI-based technologies to anticipate the current pandemic pattern, the possibility of future relapses, and the socio-economic impact are also discussed. We inspect how the virus transmits depending on different factors, such as population density and mobility, among others. We depict how an AI-based mobile app for contact tracing and surveys can prevent transmission. A modified deep learning technique can assess the affinity of the most probable drugs to treat COVID-19. Here a few effective antiviral drugs, such as Geneticin, Avermectin B1, and Ancriviroc, among others, have been reported with appropriate validation from previous investigations.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Ubio Obu, Shruti Thakur

Abstract: Covid-19, also known as the Coronavirus, is a viral disease caused by SARS-CoV-2. In December 2019 the first case of infection was identified in Wuhan, China; this seemingly isolated case soon became a global pandemic whose effects were felt worldwide, with colossal consequences for health, the economy, and politics. As of the time of this research, about 4.5 million people have died of the Coronavirus and over 215 million people have been infected. This pandemic stood out not just for its scale but for how social media was a major contributor both to its spread and to curbing it: the power of social media was used to spread misinformation as well as awareness, with both having a massive impact on people. In this paper we run a sentiment analysis on Twitter under the keywords "Covid-19" and "Coronavirus"; Twitter is a powerful social media tool known for its ability to capture trends in the form of tweets. We draw correlations between peaks of tweets and peaks of infection, and we analyze the impact these tweets have on the rate of infection and vice versa. We also analyze what people tweet most about and what the talking points are, comparing both real-time and past tweets with real-time infection and death rates, using different deep learning methods to assess what information can be derived from them.
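The tweet-peak-versus-infection-peak comparison the authors describe reduces to correlating two daily time series. A minimal sketch of that step, with made-up placeholder numbers rather than data from the study:

```python
# Pearson correlation between a daily tweet-volume series and a daily
# case series; the numbers are hypothetical placeholders.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

tweets = [120, 340, 560, 900, 860, 610]  # hypothetical daily tweet counts
cases  = [15,  40,  70, 110, 105,  80]   # hypothetical daily new cases
print(round(pearson(tweets, cases), 3))  # close to 1: the series co-peak
```

A fuller analysis of the kind described would also test lagged correlations (tweets leading or trailing cases by some days), since a raw same-day correlation cannot say which series drives the other.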
Article
Biology and Life Sciences
Biochemistry and Molecular Biology

Jean Claude Perez

Abstract: The discovery of a simple numerical formula for the projection of the atomic masses of all the life-sustaining CONHSP bioatoms leads to the emergence of a set of Nested CODES unifying all the biological, genetic, and genomic components, from bioatoms up to whole genomes. In particular, we demonstrate the existence of a digital meta-code common to the three languages of biology: RNA, DNA, and amino acid sequences. Through this meta-code, genomic and proteomic images appear almost analogous and correlated. The analysis of the textures of these images then reveals a binary code as well as an undulatory code whose analysis on the human genome makes it possible to predict the alternating bands constituting the karyotypes of the chromosomes. The application of these codes to perspectives in astrobiology and basic cancer research, and the emergence of binary codes and regions of local stability (voting process), whose fractal nature we demonstrate, are illustrated.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Konstantinos I. Roumeliotis, Nikolaos D. Tselikas, Dimitrios K. Nasiopoulos

Abstract: The rapidly evolving field of artificial intelligence (AI) continues to witness the introduction of innovative open-source pre-trained models, fostering advancements in various applications. One such model is Llama 2, an open-source pre-trained model released by Meta, which has garnered significant attention among early adopters. In addition to exploring the foundational elements of the Llama v2 model, this paper investigates how these early adopters leverage the capabilities of Llama 2 in their AI projects. Through a qualitative study, we delve into the perspectives, experiences, and strategies employed by early adopters to leverage Llama 2's capabilities. For the purpose of data analysis, the capabilities inherent in the Llama 2 model were employed to conduct keyword extraction from the context of the early adopters' case studies. The findings shed light on the model's strengths, weaknesses, and areas of improvement, offering valuable insights for the AI community and Meta to enhance future model iterations. Additionally, we discuss the implications of Llama 2's adoption on the broader open-source AI landscape, addressing challenges and opportunities for developers and researchers in the pursuit of cutting-edge AI solutions. The present study constitutes an early exploration of the Llama 2 pre-trained model, holding promise as a foundational basis for forthcoming research investigations.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Robert L. Fry

Abstract: This paper proposes that intelligent processes can be completely explained by thermodynamic principles. They can equally be described by information-theoretic principles that, from the standpoint of the required optimizations, are functionally equivalent. The underlying theory arises from two axioms regarding distinguishability and causality. Their consequence is a theory of computation that applies to the only two kinds of physical processes possible—those that reconstruct the past and those that control the future. Dissipative physical processes fall into the first class, whereas intelligent ones comprise the second. The first kind of process is exothermic and the latter is endothermic. Similarly, the first process dumps entropy and energy to its environment, whereas the second reduces entropy while requiring energy to operate. It is shown that high intelligence efficiency and high energy efficiency are synonymous. The theory suggests the usefulness of developing a new computing paradigm called Thermodynamic Computing to engineer intelligent processes. The described engineering formalism for the design of thermodynamic computers is a hybrid combination of information theory and thermodynamics. Elements of the engineering formalism are introduced through the reverse-engineering of a cortical neuron. The cortical neuron provides perhaps the simplest and most insightful example of a thermodynamic computer possible. It can be seen as a basic building block for constructing more intelligent thermodynamic circuits.
Article
Social Sciences
Government

David Mhlanga

Abstract: Though the share of the world population living in extreme poverty declined to 10 percent in 2015, from 16 percent in 2010 and 36 percent in 1990, data show that the world is not on track to achieve the target of less than 3 percent of people living in extreme poverty by 2030. Hence the study sought to investigate the influence of AI on poverty reduction. Using content analysis, one of the unobtrusive research techniques, the study found that the availability of relevant data is enabling AI to deliver value to humanity, and that AI has a strong influence on poverty in the areas of relevant data collection through poverty maps and in its ability to revolutionize agriculture, education, and the financial sector through digital financial inclusion. The study also discovered that many countries, especially developing nations, are not collecting enough data to identify the number of poor people and the regions where these people are located. However, the existence of AI is helping to change this; for instance, the study discovered that a research team at Stanford University is using satellite images to provide an alternative way to map poverty and to identify the regions where poverty is most concentrated. Also, various robotics and AI programs, such as Google's and Stanford University's Sustainability and Artificial Intelligence Lab, are coming forth with AI programs in agriculture that are doing much to improve farming through the identification of diseases, the prediction of crop yields, and the location of areas prone to scarcity, among several other notable signs of progress, including in education. Therefore, the study recommends that governments, development institutions, and other organizations striving to fight poverty invest more in AI and adopt and scale up its use, as it presents benefits in the quest to ensure that poverty is reduced.


Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2025 MDPI (Basel, Switzerland) unless otherwise stated