
Learning to Remember More with Less Memorization

About

Memory-augmented neural networks, consisting of a neural controller and an external memory, have shown potential for long-term sequential learning. Current RAM-like memory models access memory at every timestep, so they do not effectively leverage the short-term memory already held in the controller. We hypothesize that this writing scheme is suboptimal in memory utilization and introduces redundant computation. To validate our hypothesis, we derive a theoretical bound on the amount of information stored in a RAM-like system and formulate an optimization problem that maximizes this bound. The proposed solution, dubbed Uniform Writing, is proved optimal under the assumption that all timesteps contribute equally. To relax this assumption, we introduce modifications to the original solution, resulting in a method termed Cached Uniform Writing, which balances maximizing memorization against forgetting via an overwriting mechanism. Through an extensive set of experiments, we empirically demonstrate the advantages of our solutions over other recurrent architectures, achieving state-of-the-art results on a variety of sequential modeling tasks.
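The core idea behind Uniform Writing can be illustrated with a short sketch: instead of writing to external memory at every timestep, writes are spaced evenly across the sequence. The helper below is a hypothetical illustration (the function name and interface are not from the paper), assuming a sequence of length T and D available memory write events.

```python
import math

def uniform_write_schedule(T, D):
    """Hypothetical sketch of the Uniform Writing intuition:
    spread D memory writes evenly over a sequence of length T,
    instead of writing at every one of the T timesteps."""
    # Write once every `interval` timesteps (uniform spacing).
    interval = math.ceil(T / (D + 1))
    # Keep at most D write events within the sequence.
    return [t for t in range(1, T + 1) if t % interval == 0][:D]
```

For example, with T=50 and D=4 the schedule places writes at timesteps 10, 20, 30, and 40, letting the controller's short-term memory cover the gaps between writes; Cached Uniform Writing additionally buffers the intervening timesteps in a local cache before each write.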

Hung Le, Truyen Tran, Svetha Venkatesh · 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Image Classification | MNIST (test) | Accuracy | 99.1 | 882
Text Classification | AGNews | Accuracy | 93.9 | 119
Text Classification | IMDB | Accuracy | 91.4 | 107
Image Classification | permuted MNIST (pMNIST) (test) | Accuracy | 96.3 | 63
Document Classification | Yelp Polarity | Accuracy | 96.4 | 25
Document Classification | Yahoo Answers | Accuracy | 74.3 | 23
Image Classification | MNIST non-permutation (test) | Accuracy | 99.1 | 8
Image Classification | PMNIST (test) | Accuracy | 96.3 | 7
Synthetic Copy | Synthetic Copy L=50 (test) | Test Accuracy | 97.7 | 6
Synthetic Copy | Synthetic Copy L=100 (test) | Test Accuracy | 69.3 | 6

Showing 10 of 18 rows
