
Analyzing the Training Dynamics of Image Restoration Transformers: A Revisit to Layer Normalization

About

This work analyzes the training dynamics of Image Restoration (IR) Transformers and uncovers a critical yet overlooked issue: conventional LayerNorm (LN) drives feature magnitudes to diverge to a million scale and collapses channel-wise entropy. We interpret this behavior as the network attempting to bypass LN's constraints, which conflict with IR tasks. Accordingly, we identify two misalignments between LN and IR: 1) per-token normalization disrupts spatial correlations, and 2) input-independent scaling discards input-specific statistics. To address these, we propose Image Restoration Transformer Tailored Layer Normalization (i-LN), a simple drop-in replacement that normalizes features holistically and adaptively rescales them per input. We provide theoretical insights and empirical evidence that this simple design improves training dynamics and, in turn, restoration performance, as validated by extensive experiments.
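The two design choices described above can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's implementation: it normalizes each sample holistically over all spatial positions and channels (rather than per token), and modulates the learned gain by the input's own standard deviation as a stand-in for the input-adaptive rescaling; the paper's exact rescaling module may differ.

```python
import numpy as np

def holistic_adaptive_ln(x, gamma, beta, eps=1e-6):
    """Hedged sketch of a holistic, input-adaptive LayerNorm.

    x:     (B, H, W, C) feature map
    gamma: (C,) learned per-channel gain
    beta:  (C,) learned per-channel bias

    1) Statistics are computed jointly over all tokens and channels of
       each sample, so relative spatial statistics are preserved.
    2) The gain is modulated by the input's per-sample std, so the
       output magnitude tracks input-specific statistics (hypothetical
       stand-in for the paper's adaptive rescaling).
    """
    b = x.shape[0]
    flat = x.reshape(b, -1)
    mu = flat.mean(axis=1).reshape(b, 1, 1, 1)
    std = flat.std(axis=1).reshape(b, 1, 1, 1)
    x_hat = (x - mu) / (std + eps)          # holistic normalization
    return x_hat * (gamma * std) + beta      # input-adaptive rescaling
```

With `gamma = 1` and `beta = 0`, the output keeps roughly the same per-sample magnitude as the input instead of being forced to unit scale, which is the intuition behind preserving input-specific statistics.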

MinKyu Lee, Sangeek Hyun, Woojin Jun, Hyunjun Kim, Jiwoo Chung, Jae-Pil Heo • 2025

Related benchmarks

| Task | Dataset | PSNR (dB) | Rank |
|---|---|---|---|
| Classical Image Super-Resolution | Set5 | 38.65 | 83 |
| Classical Image Super-Resolution | Set14 | 34.92 | 70 |
| Classical Image Super-Resolution | Urban100 | 34.6 | 11 |
| Classical Image Super-Resolution | BSD100 | 32.63 | 11 |
| Classical Image Super-Resolution | Manga109 | 40.38 | 4 |
