
Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark

About

Neural networks have shown tremendous growth in recent years in solving numerous problems. Various types of neural networks have been introduced to deal with different types of problems. However, the main goal of any neural network is to transform non-linearly separable input data into more linearly separable abstract features using a hierarchy of layers. These layers are combinations of linear and non-linear functions. The most popular and common non-linearity layers are activation functions (AFs), such as Logistic Sigmoid, Tanh, ReLU, ELU, Swish, and Mish. This paper presents a comprehensive overview and survey of AFs in neural networks for deep learning. Different classes of AFs are covered, including Logistic Sigmoid and Tanh based, ReLU based, ELU based, and learning based. Several characteristics of AFs, such as output range, monotonicity, and smoothness, are also pointed out. A performance comparison is carried out among 18 state-of-the-art AFs with different networks on different types of data. The insights presented are intended to help researchers pursue further work on AFs and help practitioners choose among the different options. The code used for the experimental comparison is released at: https://github.com/shivram1987/ActivationFunctions.
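For reference, the sketch below defines the six AFs named in the abstract using NumPy. This is a minimal illustrative implementation assumed for this page, not code taken from the paper's released repository.

```python
import numpy as np

def sigmoid(x):
    # Logistic Sigmoid: squashes input to the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tanh: squashes input to (-1, 1), zero-centered
    return np.tanh(x)

def relu(x):
    # ReLU: zero for negative input, identity for positive input
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential branch for negative input, identity otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x), smooth and non-monotonic
    return x * sigmoid(beta * x)

def mish(x):
    # Mish: x * tanh(softplus(x)), smooth and non-monotonic
    return x * np.tanh(np.log1p(np.exp(x)))

if __name__ == "__main__":
    x = np.linspace(-3.0, 3.0, 7)
    print(relu(x))  # [0. 0. 0. 0. 1. 2. 3.]
```

These simple forms make the characteristics discussed in the survey (output range, monotonicity, smoothness) easy to inspect numerically; for instance, ReLU is monotonic but not smooth at zero, while Swish and Mish are smooth but non-monotonic.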

Shiv Ram Dubey, Satish Kumar Singh, Bidyut Baran Chaudhuri • 2021

Related benchmarks

Task            Dataset       Result                     Rank
Classification  kc1           Balanced Accuracy: 63.21   18
Classification  Thyroid       F1 Score: 95.2             17
Classification  ILPD          F1 Score: 61.2             10
Classification  Page-blocks   F1 Score: 83.8             10
Classification  Synthetic     F1 Score: 94.71            10
Classification  Pen Digits    F1 Score: 99.3             10
