Preprint
Article

This version is not peer-reviewed.

Self-Evolving Machine Learning Models via Meta-Learning and Neural Architecture Search

Submitted: 27 April 2026
Posted: 28 April 2026


Abstract
Despite recent advances in artificial intelligence, static deep learning models still struggle in non-stationary real-world environments because of concept drift. This paper presents a framework for Self-Evolving Machine Learning Models (SE-MLM) that combines the rapid adaptability of meta-learning with the structural flexibility of Neural Architecture Search (NAS). Unlike train-once approaches that require manual retraining after drift, our framework enables the model to update itself through a bi-level optimization process: an inner loop adapts weights using meta-gradients, and an outer loop refines the architecture through a continuous relaxation of the search space. Experiments on CIFAR-10, CIFAR-100, and Rotated-MNIST show that SE-MLM recovers up to 98% of baseline performance within minutes of a drift event and consistently outperforms static baselines. We also discuss practical applications in healthcare monitoring and high-frequency trading, along with future directions in “Green AI” and explainability.
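As a rough illustration of the bi-level scheme the abstract describes — an inner loop adapting weights via gradient steps, and an outer loop refining a continuously relaxed architecture choice (a softmax mixture over candidate operations, in the spirit of DARTS-style NAS) — the following toy sketch may be helpful. It is not the authors' implementation; the toy task (fitting y = 2x), the two candidate operations, and all learning rates are illustrative assumptions.

```python
import math

# Toy task: fit y = 2x. Two candidate "operations" share a weight w:
#   op0(x) = w * x      (linear)
#   op1(x) = w * x**2   (quadratic)
# Architecture parameters (a0, a1) mix the two ops via a softmax —
# a continuous relaxation of the discrete architecture choice.

def softmax2(a0, a1):
    m = max(a0, a1)
    e0, e1 = math.exp(a0 - m), math.exp(a1 - m)
    return e0 / (e0 + e1), e1 / (e0 + e1)

def predict(x, w, a0, a1):
    p0, p1 = softmax2(a0, a1)
    return p0 * (w * x) + p1 * (w * x * x)

def loss(data, w, a0, a1):
    return sum((predict(x, w, a0, a1) - y) ** 2 for x, y in data) / len(data)

def grad(f, v, eps=1e-5):
    # Central finite difference, standing in for autograd in this sketch.
    return (f(v + eps) - f(v - eps)) / (2 * eps)

data = [(x / 4.0, 2 * x / 4.0) for x in range(1, 9)]  # samples of y = 2x

w, a0, a1 = 0.5, 0.0, 0.0
for _ in range(300):
    # Inner loop: adapt weights on the current (relaxed) architecture.
    for _ in range(5):
        w -= 0.1 * grad(lambda v: loss(data, v, a0, a1), w)
    # Outer loop: refine the architecture parameters on the adapted weights.
    a0 -= 0.1 * grad(lambda v: loss(data, w, v, a1), a0)
    a1 -= 0.1 * grad(lambda v: loss(data, w, a0, v), a1)

p0, p1 = softmax2(a0, a1)
# The outer loop should come to favor the linear op (p0 > p1),
# since y = 2x is exactly representable by op0 with w = 2.
```

The same two-timescale structure scales up directly: in practice the inner loop would be a few SGD steps on network weights after a detected drift event, and the outer loop a gradient step on architecture mixing parameters.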
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated