Preprint
Article

This version is not peer-reviewed.

Bayesian PASA: Provably Stable Adaptive Activation with Uncertainty Quantification

Submitted: 07 March 2026

Posted: 10 March 2026


Abstract
The choice of activation function is a fundamental design decision in deep learning, yet most popular options like ReLU, GELU, or Swish are static and treat all inputs uniformly. This one-size-fits-all approach breaks down in the presence of noisy or corrupted data, where the optimal non-linearity should depend on the input’s statistical context. In this paper, we introduce Bayesian Probabilistic Adaptive Sigmoidal Activation (Bayesian PASA), a novel activation function that dynamically adapts its behavior based on the input’s uncertainty. Bayesian PASA is not just a new function, but a new paradigm. It frames activation selection as a Bayesian model averaging problem, adaptively mixing sigmoidal, linear, and noise-aware behaviors. The mixing weights are derived from a principled variational evidence lower bound (ELBO), regularized by a stable ψ-function that guarantees bounded influence from noise estimates. We provide three formal theorems proving its Lipschitz continuity, gradient stability, and convergence under standard training assumptions. On the challenging CIFAR-100 benchmark, Bayesian PASA achieves a state-of-the-art test accuracy of 76.38%, outperforming ReLU (75.68%), GELU (75.98%), and the original PASA (75.53%). On the corrupted CIFAR-10-C dataset, the full Bayesian PASA model combined with Bayesian R-LayerNorm achieves an average accuracy of 53.91%, a +1.87% improvement over the ReLU+LayerNorm baseline. This work provides a drop-in replacement for existing activations, offering not only improved performance but also built-in uncertainty quantification for more robust deep learning systems.
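The abstract describes an activation that mixes sigmoidal, linear, and noise-aware branches with weights modulated by a ψ-bounded noise estimate. The paper's exact formulation (the ELBO-derived weights, the specific ψ-function, the branch definitions) is not given in this excerpt, so the following is a hypothetical NumPy sketch of the general idea: a tanh-based ψ keeps the noise signal bounded, and a softmax over noise-driven logits mixes a Swish-like branch, an identity branch, and a noise-damped branch. All names and parameterizations here are illustrative assumptions, not the authors' definitions.

```python
import numpy as np

def psi(v, c=1.0):
    # Hypothetical bounded regularizer: tanh confines the influence of the
    # noise estimate v to (-c, c), so the mixing logits cannot blow up.
    return c * np.tanh(v)

def bayesian_pasa(x, noise_var, alpha=1.0):
    # Illustrative PASA-style adaptive activation (not the paper's exact form):
    # mix sigmoidal, linear, and noise-damped branches with softmax weights
    # driven by the psi-bounded noise estimate.
    s = psi(noise_var)                        # bounded noise signal in [0, c)
    logits = np.stack([alpha * (1.0 - s),     # favor sigmoidal branch at low noise
                       np.zeros_like(x),      # neutral linear branch
                       alpha * s])            # favor damping at high noise
    w = np.exp(logits - logits.max(axis=0))   # numerically stable softmax
    w = w / w.sum(axis=0)
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    branches = np.stack([x * sigmoid,         # Swish-like sigmoidal branch
                         x,                   # identity (linear) branch
                         x / (1.0 + s)])      # noise-damped branch
    return (w * branches).sum(axis=0)
```

With `noise_var = 0` the sigmoidal branch dominates and the function behaves like Swish; as the noise estimate grows, weight shifts to the damped branch, shrinking the response, which is the qualitative behavior the abstract attributes to Bayesian PASA.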
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.


© 2026 MDPI (Basel, Switzerland) unless otherwise stated