
DeepSITH: Efficient Learning via Decomposition of What and When Across Time Scales

About

Extracting temporal relationships over a range of scales is a hallmark of human perception and cognition, and thus it is a critical feature of machine learning applied to real-world problems. Existing neural networks struggle here: recurrent neural networks (RNNs) are plagued by the exploding/vanishing gradient problem, while gated architectures such as LSTMs must adjust their parameters to learn the relevant time scales. This paper introduces DeepSITH, a network comprising biologically-inspired Scale-Invariant Temporal History (SITH) modules in series with dense connections between layers. SITH modules respond to their inputs with a geometrically-spaced set of time constants, enabling the DeepSITH network to learn problems along a continuum of time scales. We compare DeepSITH to LSTMs and other recent RNNs on several time series prediction and decoding tasks, where DeepSITH achieves state-of-the-art performance.
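The core idea of a SITH module, a bank of geometrically-spaced time constants that jointly represent input history at multiple scales, can be illustrated with a simplified sketch. Note this is a hypothetical leaky-integrator filter bank written for illustration, not the paper's actual SITH implementation (which builds on a Laplace-transform representation of history); all function names and parameter values below are assumptions.

```python
import numpy as np

def geometric_time_constants(tau_min=1.0, n_taus=8, ratio=1.5):
    # Geometrically spaced time constants: tau_k = tau_min * ratio**k.
    # Geometric spacing is what gives the representation its
    # scale-invariant coverage of past time.
    return tau_min * ratio ** np.arange(n_taus)

def exp_filter_bank(signal, taus):
    # Each channel is a leaky integrator that decays with its own time
    # constant (Euler step of dh/dt = (x - h) / tau), producing a
    # compressed, multi-scale trace of the input's history.
    out = np.zeros((len(taus), len(signal)))
    state = np.zeros(len(taus))
    for t, x in enumerate(signal):
        state += (x - state) / taus
        out[:, t] = state
    return out

signal = np.sin(np.linspace(0, 10, 200))
taus = geometric_time_constants()
history = exp_filter_bank(signal, taus)
print(history.shape)  # one row per time constant: (8, 200)
```

In DeepSITH, outputs of such a multi-scale history layer are passed through dense connections to the next SITH layer, stacking "what happened when" representations across increasingly long time scales.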

Brandon Jacques, Zoran Tiganj, Marc W. Howard, Per B. Sederberg • 2021

Related benchmarks

Task                             Dataset          Result            Rank
Sequential Image Classification  S-MNIST (test)   Accuracy 99.32    70
Sequence Classification         PS-MNIST (test)   Accuracy 97.36     4
