Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

Trainable Activations for Image Classification

Version 1 : Received: 22 January 2023 / Approved: 26 January 2023 / Online: 26 January 2023 (02:59:29 CET)

How to cite: Pishchik, E. Trainable Activations for Image Classification. Preprints 2023, 2023010463.


Non-linear activation functions are one of the main components of deep neural network architectures. The choice of activation function can affect model speed, performance, and convergence. Most popular activation functions have no trainable parameters and do not change during training. We propose several activation functions, with and without trainable parameters, each with its own advantages and disadvantages. We test the performance of these activation functions and compare the results against the widely known ReLU activation function. We expect activation functions with trainable parameters to outperform parameter-free ones, because trainable parameters allow the model to "select" the shape of each activation function itself; however, this strongly depends on the architecture of the deep neural network and on the activation function itself.
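To make the idea of a trainable activation concrete, below is a minimal NumPy sketch of a ShiLU-style function of the form α·ReLU(x) + β, where α and β are parameters learned alongside the network's weights. The exact formulations are defined in the paper; this particular form and the class name are assumptions for illustration only.

```python
import numpy as np

class ShiLU:
    """Illustrative trainable activation: alpha * ReLU(x) + beta.

    alpha and beta are scalar parameters updated by gradient descent
    together with the rest of the network. (Form assumed for
    illustration; see the preprint for the exact definitions.)
    """

    def __init__(self, alpha=1.0, beta=0.0):
        self.alpha = alpha
        self.beta = beta

    def forward(self, x):
        # Cache ReLU(x) for the backward pass.
        self.relu = np.maximum(x, 0.0)
        return self.alpha * self.relu + self.beta

    def backward(self, grad_out):
        # Gradients w.r.t. the trainable parameters.
        self.grad_alpha = np.sum(grad_out * self.relu)
        self.grad_beta = np.sum(grad_out)
        # Gradient w.r.t. the input, passed to earlier layers.
        return grad_out * self.alpha * (self.relu > 0)

act = ShiLU(alpha=2.0, beta=0.5)
y = act.forward(np.array([-1.0, 3.0]))  # -> [0.5, 6.5]
```

Because α and β receive gradients, the optimizer can reshape the activation per layer, e.g. scaling the positive slope or shifting the output, which is the intuition behind the "select the type of activation" claim in the abstract.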


Keywords: Trainable Activations, Trainable Activation Functions, CosLU, DELU, LinComb, NormLinComb, ReLUN, ScaledSoftSign, ShiLU


Subject: Computer Science and Mathematics; Computer Vision and Graphics

