
Scalable Training of Artificial Neural Networks with Adaptive Sparse Connectivity inspired by Network Science

About

Through the success of deep learning in various domains, artificial neural networks are currently among the most used artificial intelligence methods. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that (contrary to general practice) artificial neural networks, too, should not have fully-connected layers. Here we propose sparse evolutionary training of artificial neural networks, an algorithm which evolves an initial sparse topology (Erdős–Rényi random graph) of two consecutive layers of neurons into a scale-free topology during learning. Our method replaces artificial neural networks' fully-connected layers with sparse ones before training, quadratically reducing the number of parameters, with no decrease in accuracy. We demonstrate our claims on restricted Boltzmann machines, multi-layer perceptrons, and convolutional neural networks for unsupervised and supervised learning on 15 datasets. Our approach has the potential to enable artificial neural networks to scale up beyond what is currently possible.

Decebal Constantin Mocanu, Elena Mocanu, Peter Stone, Phuong H. Nguyen, Madeleine Gibescu, Antonio Liotta • 2017
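
To make the procedure concrete, here is a minimal NumPy sketch of the two ingredients the abstract describes: the Erdős–Rényi sparse initialization and the per-epoch prune-and-regrow step. This is an illustration, not the authors' released implementation; the function names, the default ε = 20 and ζ = 0.3, and the 0.01 re-initialization scale are assumptions, and pruning by smallest magnitude is a common simplification of the paper's rule of removing the smallest positive and the largest negative weights.

```python
import numpy as np


def erdos_renyi_mask(n_in, n_out, epsilon=20.0, rng=None):
    """Boolean connectivity mask for a sparse layer: each of the
    n_in * n_out possible connections exists with probability
    epsilon * (n_in + n_out) / (n_in * n_out)."""
    if rng is None:
        rng = np.random.default_rng()
    p = epsilon * (n_in + n_out) / (n_in * n_out)
    return rng.random((n_in, n_out)) < p


def set_evolution_step(weights, mask, zeta=0.3, rng=None):
    """One topology update in the spirit of SET, applied after each
    training epoch: remove the fraction `zeta` of active connections
    whose weights are closest to zero, then add the same number of
    connections at randomly chosen empty positions with small fresh
    weights. Modifies `mask` in place and returns both arrays."""
    if rng is None:
        rng = np.random.default_rng()
    w = np.where(mask, weights, 0.0)       # zero out inactive entries
    active = np.flatnonzero(mask)          # flat indices of live connections
    n_change = int(zeta * active.size)

    # Prune: the n_change live connections with the smallest |weight|.
    weakest = active[np.argsort(np.abs(w.flat[active]))[:n_change]]
    np.put(mask, weakest, False)
    np.put(w, weakest, 0.0)

    # Regrow: n_change new connections at random empty positions.
    empty = np.flatnonzero(~mask)
    grown = rng.choice(empty, size=n_change, replace=False)
    np.put(mask, grown, True)
    np.put(w, grown, rng.normal(0.0, 0.01, size=n_change))  # assumed init scale
    return w, mask


# Example: a sparse 784 -> 1000 layer keeps ~4.6% of the dense connections.
rng = np.random.default_rng(0)
mask = erdos_renyi_mask(784, 1000, rng=rng)
weights = rng.normal(0.0, 0.01, size=(784, 1000))
# ... train the masked weights for one epoch, then evolve the topology:
weights, mask = set_evolution_step(weights, mask, rng=rng)
```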

Related benchmarks

| Task | Dataset | Metric | Value | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | CIFAR-100 (test) | Accuracy | 76.14 | 3518 |
| Image Classification | CIFAR-10 (test) | Accuracy | 94.65 | 3381 |
| Image Classification | ImageNet-1k (val) | Top-1 Accuracy | 72.6 | 1453 |
| Image Generation | CIFAR-10 (test) | -- | -- | 471 |
| Image Generation | CIFAR-10 | Inception Score | 8.98 | 178 |
| Image Synthesis | CIFAR-10 | FID | 8.01 | 79 |
| Image Generation | STL-10 | FID | 30.8 | 66 |
| Generative Image Synthesis | CIFAR-10 BigGAN | FID | 8.01 | 62 |
| Generative Image Synthesis | CIFAR-10 SNGAN | FID | 10.68 | 62 |
| Generative Image Synthesis | STL-10 SNGAN | FID | 30.37 | 62 |
(Showing 10 of 22 rows.)
