
Searching for Activation Functions

About

The choice of activation functions in deep networks has a significant effect on the training dynamics and task performance. Currently, the most successful and widely-used activation function is the Rectified Linear Unit (ReLU). Although various hand-designed alternatives to ReLU have been proposed, none have managed to replace it due to inconsistent gains. In this work, we propose to leverage automatic search techniques to discover new activation functions. Using a combination of exhaustive and reinforcement learning-based search, we discover multiple novel activation functions. We verify the effectiveness of the searches by conducting an empirical evaluation with the best discovered activation function. Our experiments show that the best discovered activation function, $f(x) = x \cdot \text{sigmoid}(\beta x)$, which we name Swish, tends to work better than ReLU on deeper models across a number of challenging datasets. For example, simply replacing ReLUs with Swish units improves top-1 classification accuracy on ImageNet by 0.9\% for Mobile NASNet-A and 0.6\% for Inception-ResNet-v2. The simplicity of Swish and its similarity to ReLU make it easy for practitioners to replace ReLUs with Swish units in any neural network.
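The activation defined in the abstract, $f(x) = x \cdot \text{sigmoid}(\beta x)$, is straightforward to implement. Below is a minimal NumPy sketch (the function and parameter names are illustrative; in practice $\beta$ is either fixed to 1 or treated as a learnable parameter):

```python
import numpy as np

def swish(x, beta=1.0):
    """Swish activation: f(x) = x * sigmoid(beta * x).

    With beta = 1 this reduces to x * sigmoid(x) (also known as SiLU);
    with beta -> infinity it approaches ReLU, and with beta = 0 it is
    the linear function x / 2.
    """
    return x / (1.0 + np.exp(-beta * x))
```

Like ReLU, Swish is unbounded above and bounded below, but unlike ReLU it is smooth and non-monotonic, which the paper identifies as a distinguishing property.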

Prajit Ramachandran, Barret Zoph, Quoc V. Le • 2017

Related benchmarks

| Task                 | Dataset             | Result                 | Rank |
|----------------------|---------------------|------------------------|------|
| Image Classification | CIFAR-100 (test)    | --                     | 3518 |
| Graph Classification | PROTEINS            | Accuracy 76.2          | 742  |
| Graph Classification | MUTAG               | Accuracy 90.4          | 697  |
| Image Classification | CIFAR10 (test)      | --                     | 585  |
| Graph Classification | NCI1                | Accuracy 83.4          | 460  |
| Graph Classification | NCI109              | Accuracy 82.9          | 223  |
| Image Classification | FashionMNIST (test) | --                     | 218  |
| Graph Classification | PTC                 | Accuracy 65.1          | 167  |
| Graph Classification | MOLTOX21            | ROC-AUC 0.7331         | 38   |
| Image Classification | MNIST (train)       | Train Error Rate 0.146 | 37   |

Showing 10 of 40 rows
