Preprint Article Version 1 This version is not peer-reviewed

Stochastic Activation Function Layers for Convolutional Neural Networks

Version 1 : Received: 14 February 2020 / Approved: 17 February 2020 / Online: 17 February 2020 (01:50:08 CET)

A peer-reviewed version of this preprint also exists.

Nanni, L.; Lumini, A.; Ghidoni, S.; Maguolo, G. Stochastic Selection of Activation Layers for Convolutional Neural Networks. Sensors 2020, 20, 1626.

Journal reference: Sensors 2020, 20, 1626
DOI: 10.3390/s20061626

Abstract

In recent years, deep learning has achieved considerable success in pattern recognition, image segmentation, and many other classification tasks, with numerous studies and practical applications covering image, video, and text classification. In this study, we suggest a method for changing the architecture of the best-performing CNN models with the aim of designing new models to be used as stand-alone networks or as components of an ensemble. We propose to replace each activation layer of a CNN (usually a ReLU layer) with an activation function stochastically drawn from a set of candidate activation functions: in this way, the resulting CNN has a different set of activation function layers.
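As a minimal sketch of the idea described above (not the authors' implementation: the pool of candidate activations, the seeding scheme, and all function names here are assumptions for illustration), the stochastic selection of one activation per layer can be expressed in plain Python:

```python
import math
import random

# A hypothetical pool of candidate activation functions; the paper's
# actual set of activations may differ.
def relu(x):
    return max(0.0, x)

def leaky_relu(x, slope=0.01):
    return x if x > 0 else slope * x

def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

ACTIVATION_POOL = [relu, leaky_relu, elu]

def build_stochastic_activations(n_layers, seed=None):
    """Draw one activation per layer, replacing the usual all-ReLU stack."""
    rng = random.Random(seed)
    return [rng.choice(ACTIVATION_POOL) for _ in range(n_layers)]

# Two draws with different seeds yield two network variants whose
# predictions could then be combined into an ensemble.
net_a = build_stochastic_activations(5, seed=1)
net_b = build_stochastic_activations(5, seed=2)
```

Each draw produces a distinct per-layer assignment, so repeating the procedure gives a family of architecturally diverse networks suitable either as stand-alone models or as ensemble members.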

Subject Areas

Convolutional Neural Networks; ensemble of classifiers; activation functions; image classification; skin detection


