
Bringing Old Films Back to Life

About

We present a learning-based framework, the recurrent transformer network (RTN), to restore heavily degraded old films. Instead of performing frame-wise restoration, our method builds on hidden knowledge learned from adjacent frames, which carry abundant occlusion information; this helps restore challenging artifacts in each frame while ensuring temporal coherency. Moreover, contrasting the representation of the current frame with the hidden knowledge makes it possible to infer scratch positions in an unsupervised manner, and such defect localization generalizes well to real-world degradations. To better resolve mixed degradation and compensate for flow estimation errors during frame alignment, we propose to leverage more expressive transformer blocks for spatial restoration. Experiments on both a synthetic dataset and real-world old films demonstrate the significant superiority of the proposed RTN over existing solutions. In addition, the same framework can effectively propagate color from keyframes to the whole video, ultimately yielding compelling restored films. The implementation and model will be released at https://github.com/raywzy/Bringing-Old-Films-Back-to-Life.
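The recurrent structure described above can be sketched as a loop that carries hidden knowledge from frame to frame, fuses it with the current degraded frame, and contrasts the two to localize defects. The sketch below is illustrative only: the function names (`align`, `fuse`, `defect_map`, `restore_video`) are hypothetical placeholders, and the real RTN replaces them with learned flow-based alignment and transformer blocks.

```python
import numpy as np

def align(hidden, frame):
    # Placeholder for flow-based alignment of the hidden state to the
    # current frame; RTN compensates flow errors with transformer blocks.
    return hidden

def fuse(hidden, frame):
    # Placeholder spatial restoration: blend the aligned hidden state with
    # the degraded frame. RTN uses transformer blocks for this step.
    return 0.5 * (hidden + frame)

def defect_map(hidden, frame):
    # Contrasting the current frame with the aligned hidden knowledge:
    # large residuals hint at scratches (the unsupervised-localization idea).
    return np.abs(frame - hidden)

def restore_video(frames):
    # frames: list of (H, W) arrays. Hidden knowledge is propagated frame
    # to frame, which is what encourages temporal coherency.
    hidden = np.zeros_like(frames[0])
    restored = []
    for frame in frames:
        aligned = align(hidden, frame)
        hidden = fuse(aligned, frame)
        restored.append(hidden)
    return restored

video = [np.random.rand(4, 4) for _ in range(3)]
out = restore_video(video)
```

The key design point mirrored here is that each output frame depends on the accumulated hidden state rather than on the current frame alone.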

Ziyu Wan, Bo Zhang, Dongdong Chen, Jing Liao • 2022

Related benchmarks

| Task                | Dataset                  | Metric  | Result  | Rank |
|---------------------|--------------------------|---------|---------|------|
| Video Restoration   | Synthetic DAVIS (test)   | PSNR    | 24.465  | 10   |
| Video Restoration   | Real-world old films     | NIQE    | 15.4254 | 7    |
| Video Restoration   | Real-world dataset       | BRISQUE | 53.27   | 7    |
| Video Restoration   | Synthetic Dataset (test) | PSNR    | 31.46   | 7    |
| Video Colorization  | REDS                     | PSNR    | 32.838  | 5    |
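The PSNR entries in the benchmarks above are full-reference scores (higher is better), computable only on synthetic data where ground truth exists; NIQE and BRISQUE are no-reference quality metrics (lower is better) used for real-world films without ground truth. A minimal sketch of the PSNR computation, assuming images normalized to [0, 1]:

```python
import numpy as np

def psnr(ref, test, max_val=1.0):
    # Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE).
    mse = np.mean((ref - test) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

ref = np.zeros((8, 8))
test = np.full((8, 8), 0.1)  # uniform error of 0.1 -> MSE = 0.01
val = psnr(ref, test)        # 10 * log10(1 / 0.01) = 20 dB
```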
