Generating Sequences With Recurrent Neural Networks
About
This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
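The core idea of the paper, autoregressive generation, is to sample each data point from the network's predictive distribution over the next step, then feed the sample back in as input. The toy sketch below illustrates that sampling loop with a made-up linear "model" in place of the paper's LSTM (the weights `W` and vocabulary are illustrative assumptions, not from the paper):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy stand-in for the paper's LSTM: the previous symbol indexes a row of
# logits over the next symbol, giving Pr(x_{t+1} | x_t). The real model
# conditions on the whole history via its recurrent hidden state.
rng = np.random.default_rng(0)
vocab = list("ab ")                            # hypothetical 3-symbol alphabet
W = rng.normal(size=(len(vocab), len(vocab)))  # hypothetical weights

def generate(length, start=0):
    """Generate a sequence one symbol at a time, feeding each sample back in."""
    seq = [start]
    for _ in range(length - 1):
        probs = softmax(W[seq[-1]])            # predictive distribution
        seq.append(rng.choice(len(vocab), p=probs))
    return "".join(vocab[i] for i in seq)

print(generate(20))
```

The same loop applies to the handwriting case, except that the predictive distribution is a mixture of Gaussians over real-valued pen offsets rather than a softmax over discrete symbols.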
Alex Graves • 2013
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Language Modeling | Penn Treebank (test) | Perplexity | 115 | 411 |
| Character-level Language Modeling | enwik8 (test) | BPC | 1.67 | 195 |
| Character-level Language Modeling | Penn Treebank (test) | BPC | 1.24 | 113 |
| Time Series Forecasting | VISUELLE (test) | WAPE | 58 | 29 |
| Character-level Language Modeling | Hutter Prize Wikipedia (test) | Bits/Char | 1.33 | 28 |
| Hand Gesture Recognition | EgoGesture (test) | Accuracy | 74.7 | 21 |
| Unconditional Audio Generation | Blizzard (test) | NLL (bits) | 1.434 | 4 |
| Unconditional Audio Generation | Onomatopoeia (test) | Test NLL (bits) | 2.034 | 4 |
| Unconditional Audio Generation | MUSIC (test) | Test NLL (bits) | 1.41 | 4 |