
Generating Sequences With Recurrent Neural Networks

About

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.

Alex Graves · 2013
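The core idea described above (predict one data point at a time, then feed each sample back in as the next input) can be sketched with a single LSTM cell. This is a minimal, illustrative NumPy version, not the paper's implementation: the alphabet, layer sizes, and weight initialization are all assumptions, and the weights are random, so the output is gibberish until trained.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = list("abc ")          # toy alphabet (illustrative assumption)
V, H = len(vocab), 8          # vocab size, hidden size

# LSTM parameters: one stacked matrix for the four gates,
# plus a projection from hidden state to output logits.
Wx = rng.normal(0, 0.1, (4 * H, V))
Wh = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
Wy = rng.normal(0, 0.1, (V, H))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    """One LSTM step: compute gates, update cell state, emit hidden state."""
    z = Wx @ x + Wh @ h + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)
    h = o * np.tanh(c)
    return h, c

def sample(n_chars, temperature=1.0):
    """Generate n_chars by repeatedly sampling the predictive distribution
    and feeding each sampled character back in as the next input."""
    h, c = np.zeros(H), np.zeros(H)
    x = np.zeros(V)
    x[0] = 1.0                # start from an arbitrary first symbol
    out = []
    for _ in range(n_chars):
        h, c = lstm_step(x, h, c)
        logits = Wy @ h / temperature
        p = np.exp(logits - logits.max())
        p /= p.sum()
        k = rng.choice(V, p=p)
        out.append(vocab[k])
        x = np.zeros(V)       # one-hot encode the sample for the next step
        x[k] = 1.0
    return "".join(out)

print(sample(20))
```

The handwriting-synthesis extension in the paper adds a soft attention window over the text string as extra conditioning input at each step; this sketch omits that and shows only the unconditional autoregressive loop.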

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Language Modeling | Penn Treebank (test) | Perplexity | 115 | 411 |
| Character-level Language Modeling | enwik8 (test) | BPC | 1.67 | 195 |
| Character-level Language Modeling | Penn Treebank (test) | BPC | 1.24 | 113 |
| Time Series Forecasting | VISUELLE (test) | WAPE | 58 | 29 |
| Character-level Language Modeling | Hutter Prize Wikipedia (test) | Bits/Char | 1.33 | 28 |
| Hand Gesture Recognition | EgoGesture (test) | Accuracy | 74.7 | 21 |
| Unconditional Audio Generation | Blizzard (test) | NLL (bits) | 1.434 | 4 |
| Unconditional Audio Generation | Onomatopoeia (test) | Test NLL (bits) | 2.034 | 4 |
| Unconditional Audio Generation | MUSIC (test) | Test NLL (bits) | 1.41 | 4 |
