
Professor Forcing: A New Algorithm for Training Recurrent Networks

About

The Teacher Forcing algorithm trains recurrent networks by supplying observed sequence values as inputs during training and using the network's own one-step-ahead predictions to do multi-step sampling. We introduce the Professor Forcing algorithm, which uses adversarial domain adaptation to encourage the dynamics of the recurrent network to be the same when training the network and when sampling from the network over multiple time steps. We apply Professor Forcing to language modeling, vocal synthesis on raw waveforms, handwriting generation, and image generation. Empirically we find that Professor Forcing acts as a regularizer, improving test likelihood on character level Penn Treebank and sequential MNIST. We also find that the model qualitatively improves samples, especially when sampling for a large number of time steps. This is supported by human evaluation of sample quality. Trade-offs between Professor Forcing and Scheduled Sampling are discussed. We produce T-SNEs showing that Professor Forcing successfully makes the dynamics of the network during training and sampling more similar.
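The distinction between training-time and sampling-time dynamics can be made concrete with a minimal sketch. The code below is illustrative only (a tiny numpy RNN with random weights, not the authors' implementation): under teacher forcing, the observed token is fed in at each step, while in free-running mode the network's own one-step-ahead prediction is fed back in, which is the mismatch Professor Forcing targets.

```python
import numpy as np

# Illustrative character-level RNN; all names and shapes are assumptions.
rng = np.random.default_rng(0)
V, H = 5, 8                      # vocab size, hidden size
Wx = rng.normal(0, 0.1, (H, V))  # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))  # hidden-to-hidden weights
Wo = rng.normal(0, 0.1, (V, H))  # hidden-to-output weights

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def step(h, x):
    """One RNN transition: new hidden state and output logits."""
    h = np.tanh(Wx @ x + Wh @ h)
    return h, Wo @ h

def teacher_forced(seq):
    """Training-time behaviour: feed the *observed* token at every step."""
    h, logits_all = np.zeros(H), []
    for tok in seq[:-1]:
        h, logits = step(h, one_hot(tok))
        logits_all.append(logits)
    return logits_all

def free_running(start, steps):
    """Sampling-time behaviour: feed the model's *own* prediction back in."""
    h, tok, out = np.zeros(H), start, []
    for _ in range(steps):
        h, logits = step(h, one_hot(tok))
        tok = int(np.argmax(logits))  # greedy one-step-ahead prediction
        out.append(tok)
    return out
```

Professor Forcing adds a discriminator over the hidden-state trajectories produced by these two modes, pushing the generator to make them indistinguishable.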

Alex Lamb, Anirudh Goyal, Ying Zhang, Saizheng Zhang, Aaron Courville, Yoshua Bengio • 2016

Related benchmarks

| Task                   | Dataset    | Metric                 | Result | Rank |
|------------------------|------------|------------------------|--------|------|
| Time-series generation | Sines      | Discriminative Score   | 0.495  | 21   |
| Time-series generation | Energy     | Discriminative Score   | 0.553  | 21   |
| Time-series generation | Stocks     | Discriminative Score   | 0.257  | 21   |
| Time-series generation | Chickenpox | MAE                    | 0.319  | 12   |
| Time-series generation | AIR        | Predictive Score (MAE) | 0.19   | 12   |
| Time-series generation | GAS        | Predictive Score       | 0.037  | 7    |
| Time-series generation | Metro      | Predictive Score       | 0.241  | 7    |
| Time-series generation | MIMIC-III  | Predictive Score       | 0.023  | 7    |
