Version 1
: Received: 14 May 2023 / Approved: 15 May 2023 / Online: 15 May 2023 (10:13:00 CEST)
How to cite:
Kiaei, A.A.; Boush, M.; Safaei, D.; Abadijou, S.; Baselizadeh, N.; Salari, N.; Mohammadi, M. Active Identity Function as Activation Function. Preprints 2023, 2023051018. https://doi.org/10.20944/preprints202305.1018.v1
APA Style
Kiaei, A.A., Boush, M., Safaei, D., Abadijou, S., Baselizadeh, N., Salari, N., & Mohammadi, M. (2023). Active Identity Function as Activation Function. Preprints. https://doi.org/10.20944/preprints202305.1018.v1
Chicago/Turabian Style
Kiaei, A.A., M. Boush, D. Safaei, S. Abadijou, N. Baselizadeh, N. Salari, and M. Mohammadi. 2023. "Active Identity Function as Activation Function." Preprints. https://doi.org/10.20944/preprints202305.1018.v1
Abstract
Selecting the optimal activation function for training deep neural networks has always been challenging because it significantly impacts a network's performance and training speed. At present, RELU is still the most widely used activation function. Many activation functions have since been proposed to improve on RELU, but none managed to surpass it. SWISH, however, outperformed RELU in several challenging tasks such as classification, object detection, and tracking, and simply replacing RELU units with SWISH improves performance on many of them. This paper proposes a new activation function, which we name AIF, that surpasses Google Brain's SWISH function. Experiments indicate that AIF outperforms SWISH across a variety of tasks.
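For reference, the two baseline activations the abstract compares against have standard definitions: RELU is max(0, x), and SWISH is x · sigmoid(βx). A minimal NumPy sketch of both is below; the proposed AIF is not shown, since its formula is not stated in this excerpt.

```python
import numpy as np

def relu(x):
    # RELU: max(0, x), the common baseline activation
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # SWISH (Google Brain): x * sigmoid(beta * x)
    # equivalently x / (1 + exp(-beta * x))
    return x / (1.0 + np.exp(-beta * x))
```

Note that for large positive inputs SWISH approaches the identity (like RELU), while for negative inputs it is smooth and non-monotonic rather than hard-clipped at zero.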
Keywords
Activation Function; Neural Network; Deep Learning; RELU; SWISH
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.