Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Comparison of Different Convolutional Neural Network Activation Functions and Methods for Building Ensembles

Version 1 : Received: 5 March 2021 / Approved: 5 March 2021 / Online: 5 March 2021 (10:05:38 CET)

How to cite: Nanni, L.; Brahnam, S.; Paci, M.; Maguolo, G. Comparison of Different Convolutional Neural Network Activation Functions and Methods for Building Ensembles. Preprints 2021, 2021030180. https://doi.org/10.20944/preprints202103.0180.v1

Abstract

Recently, much attention has been devoted to finding highly efficient and powerful activation functions for CNN layers. Because activation functions inject different nonlinearities between layers that affect performance, varying them is one method for building robust ensembles of CNNs. The objective of this study is to examine the performance of CNN ensembles built with different activation functions, including six new ones presented here: 2D Mexican ReLU, TanELU, MeLU+GaLU, Symmetric MeLU, Symmetric GaLU, and Flexible MeLU. The highest-performing ensemble was built from CNNs whose standard ReLU layers were randomly replaced with different activation functions. A comprehensive evaluation of the proposed approach was conducted across fifteen biomedical data sets representing various classification tasks. The proposed method was tested on two basic CNN architectures: Vgg16 and ResNet50. Results demonstrate the superior performance of this approach. The MATLAB source code for this study will be available at https://github.com/LorisNanni.
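The core idea, an ensemble whose members differ only in which activation function replaces each standard ReLU, can be sketched as follows. This is an illustrative sketch in PyTorch, not the authors' MATLAB implementation; the names ACTIVATION_POOL and replace_relus_randomly are hypothetical, stock activations (LeakyReLU, ELU, GELU, SiLU) stand in for the MeLU/GaLU variants proposed in the paper, and the ensemble size of five and the averaging fusion rule are assumptions not taken from the abstract.

import copy
import random

import torch
import torch.nn as nn
from torchvision import models

# Candidate activations: the paper's MeLU, TanELU, Symmetric MeLU/GaLU, etc.
# would be custom nn.Module subclasses; stock activations stand in here.
ACTIVATION_POOL = [nn.ReLU, nn.LeakyReLU, nn.ELU, nn.GELU, nn.SiLU]

def replace_relus_randomly(module: nn.Module) -> None:
    """Recursively swap every nn.ReLU in `module` for an activation drawn
    at random from ACTIVATION_POOL (modification is in place)."""
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, random.choice(ACTIVATION_POOL)())
        else:
            replace_relus_randomly(child)

# Build an ensemble of five ResNet50 variants, each with its own random
# mix of activation layers (requires torchvision >= 0.13 for weights=None).
base = models.resnet50(weights=None)
ensemble = []
for _ in range(5):
    net = copy.deepcopy(base)
    replace_relus_randomly(net)
    ensemble.append(net)

def ensemble_predict(nets, x: torch.Tensor) -> torch.Tensor:
    """Fuse member predictions by averaging their softmax outputs
    (a common sum-rule fusion; the exact rule is not stated in the abstract)."""
    with torch.no_grad():
        return torch.stack([net(x).softmax(dim=1) for net in nets]).mean(dim=0)

In practice each member network would be trained independently on the same data before its predictions are fused.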

Keywords

convolutional neural networks; activation functions; biomedical classification; ensembles; MeLU variants

Subject

Computer Science and Mathematics, Algebra and Number Theory
