
Gaussian Error Linear Units (GELUs)

About

We propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function. The GELU activation function is $x\Phi(x)$, where $\Phi(x)$ is the standard Gaussian cumulative distribution function. The GELU nonlinearity weights inputs by their value, rather than gating inputs by their sign as in ReLUs ($x\mathbf{1}_{x>0}$). We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all considered computer vision, natural language processing, and speech tasks.
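The definition $\text{GELU}(x) = x\Phi(x)$ can be computed exactly via the error function, since $\Phi(x) = \tfrac{1}{2}\left(1 + \mathrm{erf}(x/\sqrt{2})\right)$. Below is a minimal sketch using only the Python standard library; the function name `gelu` is our own choice, not from the paper:

```python
import math

def gelu(x: float) -> float:
    """Gaussian Error Linear Unit: x * Phi(x), with Phi the
    standard normal CDF, computed exactly via erf."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# For large positive x, Phi(x) -> 1, so gelu(x) -> x (like ReLU);
# for large negative x, Phi(x) -> 0, so gelu(x) -> 0.
print(gelu(1.0))   # Phi(1) ~ 0.8413, so roughly 0.8413
print(gelu(0.0))   # exactly 0
print(gelu(-5.0))  # near 0
```

Unlike ReLU's hard gate, the curve is smooth and slightly negative for small negative inputs, which is the "weighting by value" the abstract describes.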

Dan Hendrycks, Kevin Gimpel • 2016

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Graph Classification | PROTEINS | Accuracy | 76.6 | 994 |
| Graph Classification | MUTAG | Accuracy | 90.9 | 862 |
| Image Classification | Fashion MNIST (test) | Accuracy | 89.84 | 592 |
| Language Modeling | WikiText-103 (test) | Perplexity | 15.82 | 579 |
| Graph Classification | NCI1 | Accuracy | 83.5 | 501 |
| Image Classification | SVHN (test) | -- | -- | 401 |
| Graph Classification | NCI109 | Accuracy | 82.9 | 223 |
| Image Classification | MNIST (test) | -- | -- | 196 |
| Graph Classification | PTC | Accuracy | 65.4 | 167 |
| Image Classification | CIFAR100-LT (test) | -- | -- | 45 |

Showing 10 of 30 rows
