
All-to-key Attention for Arbitrary Style Transfer

About

Attention-based arbitrary style transfer studies have shown promising performance in synthesizing vivid local style details. They typically use the all-to-all attention mechanism -- each position of content features is fully matched to all positions of style features. However, all-to-all attention tends to generate distorted style patterns and has quadratic complexity, limiting the effectiveness and efficiency of arbitrary style transfer. In this paper, we propose a novel all-to-key attention mechanism -- each position of content features is matched only to stable key positions of style features -- that is more in line with the characteristics of style transfer. Specifically, it integrates two newly proposed attention forms: distributed and progressive attention. Distributed attention assigns attention to key style representations that depict the style distribution of local regions; progressive attention shifts attention from coarse-grained regions to fine-grained key positions. The resultant module, dubbed StyA2K, shows extraordinary performance in preserving the semantic structure and rendering consistent style patterns. Qualitative and quantitative comparisons with state-of-the-art methods demonstrate the superior performance of our approach.
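To make the contrast concrete, here is a minimal NumPy sketch of the two attention regimes the abstract describes: standard all-to-all attention, where every content position attends to every style position, versus a sparsified variant where each content position attends only to a small set of high-scoring key positions. The top-k selection below is an illustrative stand-in for the paper's distributed and progressive attention, not the authors' actual StyA2K implementation; all function names are assumptions for this sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def all_to_all_attention(q, k, v):
    # Every content position attends to every style position:
    # O(N * M) attention scores for N content and M style positions.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def all_to_key_attention(q, k, v, top_k=4):
    # Illustrative sparsification: each content position keeps only its
    # top_k highest-scoring style positions and masks out the rest.
    # (A stand-in for distributed/progressive attention, whose exact
    # forms in the paper differ.)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    idx = np.argpartition(scores, -top_k, axis=-1)[:, -top_k:]
    masked = np.full_like(scores, -np.inf)
    np.put_along_axis(masked, idx,
                      np.take_along_axis(scores, idx, axis=-1), axis=-1)
    # exp(-inf) -> 0, so masked positions get zero attention weight.
    return softmax(masked) @ v
```

Restricting each query to a few stable key positions is what removes the quadratic blow-up: only the score computation remains dense here, but in a full implementation the coarse-to-fine (progressive) stage would also avoid materializing all N x M scores.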

Mingrui Zhu, Xiao He, Nannan Wang, Xiaoyu Wang, Xinbo Gao • 2022

Related benchmarks

Task           | Dataset                                    | Result        | Rank
Style Transfer | MS-COCO (content) + WikiArt (style) (test) | LPIPS: 0.5674 | 31
