Preprint
Article

This version is not peer-reviewed.

Bayesian PASA: Adaptive Activation with Uncertainty Quantification under Computational Constraints – Revised Version

Submitted: 10 March 2026
Posted: 11 March 2026


Abstract
The choice of activation function is a fundamental design decision in deep learning, yet most popular options, such as ReLU, GELU, or Swish, are static and treat all inputs uniformly. This one-size-fits-all approach can be suboptimal when data contains heterogeneous noise, where the ideal non-linearity may depend on the input’s statistical context. In this paper, we introduce Bayesian Probabilistic Adaptive Sigmoidal Activation (Bayesian PASA), an activation function that adapts its behavior based on input uncertainty. The method frames activation selection as a Bayesian model averaging problem, adaptively mixing sigmoidal, linear, and noise-aware behaviors. Mixing weights are derived from a variational evidence lower bound (ELBO) and regularized by a ψ-function that bounds the influence of local noise estimates. We provide theoretical analysis showing Lipschitz continuity, gradient bounds, and convergence under standard assumptions. Due to computational constraints (Google Colab, limited epochs), we evaluate Bayesian PASA on CIFAR-100 (50 epochs, 3 seeds) and CIFAR-10-C (100 epochs, 3 seeds). Despite these limitations, Bayesian PASA achieves 76.38% accuracy on CIFAR-100, slightly outperforming ReLU (75.68%) and GELU (75.98%) under the same constrained conditions. On corrupted CIFAR-10-C, Bayesian PASA combined with Bayesian R-LayerNorm achieves an average accuracy of 53.91%, a +1.87% improvement over the ReLU+LayerNorm baseline. These results, though modest, are consistent across seeds and suggest that Bayesian PASA offers a promising direction for uncertainty-aware activation functions, particularly when training resources are limited. Code is available at: https://github.com/BayesianPASA/BayesianPASA/
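The abstract describes mixing sigmoidal, linear, and noise-aware behaviors with weights bounded by a ψ-function. The sketch below is an illustrative NumPy toy, not the paper's implementation: the function names (`psi`, `bayesian_pasa`), the fixed placeholder logits, and the specific noise-attenuated branch are all assumptions; in the full method the mixing weights would come from a variational ELBO rather than hand-set logits.

```python
import numpy as np

def psi(s, c=1.0):
    # Hypothetical psi-regularizer: squashes the local noise estimate s
    # into (-c, c) so extreme estimates cannot dominate the mixture.
    return c * np.tanh(s / c)

def bayesian_pasa(x, noise_est, logits=(0.0, 0.0, 0.0)):
    """Illustrative mixture activation (a sketch, not the authors' exact form).

    Mixes three candidate behaviors with softmax weights. The logits here
    are fixed placeholders shifted by the bounded noise signal; more noise
    nudges weight toward the smoother sigmoidal branch.
    """
    s = psi(noise_est)
    z = np.asarray(logits, dtype=float) + np.array([s, -s, 0.0])
    w = np.exp(z - z.max())          # numerically stable softmax
    w /= w.sum()
    branches = np.stack([
        1.0 / (1.0 + np.exp(-x)),    # sigmoidal branch
        np.asarray(x, dtype=float),  # linear (identity) branch
        x * (1.0 - s),               # noise-attenuated branch (assumed form)
    ])
    # Weighted average over the three candidate activations.
    return np.tensordot(w, branches, axes=1)
```

With a zero noise estimate and uniform logits the three branches are averaged equally, so `bayesian_pasa(0.0, 0.0)` returns sigmoid(0)/3 ≈ 0.1667; as `noise_est` grows, ψ saturates and the weight shift stays bounded.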
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

