
Learning Modality Knowledge Alignment for Cross-Modality Transfer

About

Cross-modality transfer aims to leverage large pretrained models to complete tasks that may not belong to the modality of the pretraining data. Existing works achieve some success in extending classical finetuning to cross-modal scenarios, yet we still lack understanding of how the modality gap influences transfer. In this work, we conduct a series of experiments focusing on the quality of source representations during transfer, revealing a connection between a larger modality gap and less knowledge reuse, i.e., ineffective transfer. We then formalize the gap as the knowledge misalignment between modalities, expressed through the conditional distribution P(Y|X). To address this problem, we present Modality kNowledge Alignment (MoNA), a meta-learning approach that learns a target data transformation to reduce the modality knowledge discrepancy ahead of the transfer. Experiments show that our method enables better reuse of source-modality knowledge in cross-modality transfer, leading to improvements over existing finetuning methods.
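The abstract describes the core idea at a high level: before finetuning, learn a transformation of the target data that reduces the discrepancy between the modalities' conditional distributions P(Y|X), so the frozen source knowledge can be reused. The paper's actual meta-learning procedure is not given here, so the following is only a toy numpy sketch of that outer idea, with all shapes, names, and the finite-difference optimizer being illustrative assumptions rather than MoNA's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a frozen "source-pretrained" linear head and target-modality data.
# Dimensions and names are illustrative assumptions, not the paper's setup.
d, k, n = 8, 3, 64
W_src = rng.normal(size=(d, k))      # frozen source-modality head
X_tgt = rng.normal(size=(n, d))      # target-modality inputs
y_tgt = rng.integers(0, k, size=n)   # target labels

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def alignment_loss(A):
    """Cross-entropy of the frozen source head on transformed target data --
    a crude stand-in for the P(Y|X) knowledge discrepancy the paper formalizes."""
    p = softmax(X_tgt @ A @ W_src)
    return -np.log(p[np.arange(n), y_tgt] + 1e-12).mean()

# Learn the target transformation A while keeping the source head frozen.
# (The paper uses meta-learning; plain numerical gradient descent here.)
A = np.eye(d)
lr, eps = 0.05, 1e-4
for _ in range(300):
    g = np.zeros_like(A)
    for i in range(d):
        for j in range(d):
            E = np.zeros_like(A)
            E[i, j] = eps
            g[i, j] = (alignment_loss(A + E) - alignment_loss(A - E)) / (2 * eps)
    A -= lr * g

print("loss before:", alignment_loss(np.eye(d)), "after:", alignment_loss(A))
```

The point of the sketch is only the ordering: the input-side transformation is optimized against the frozen source model first, and ordinary finetuning would follow afterwards.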

Wenxuan Ma, Shuang Li, Lincan Cai, Jingxuan Kang • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| PDE solving | PDEBench Diff.Reac 1D (test) | nRMSE 0.0028 | 13 |
| Cross-modal adaptation | NAS-Bench-360 | Darcy (Relative L2) 0.0068 | 9 |
| PDE solving | PDEBench Diff.Sorp (test) | nRMSE 0.0016 | 9 |
| PDE solving | PDEBench Advection (test) | nRMSE 0.0088 | 9 |
| Diverse prediction tasks | NAS-Bench-360 (test) | Darcy Score 0.0068 | 9 |
| PDE solving | PDEBench Darcy (test) | nRMSE 0.079 | 8 |
| Darcy | PDEBench | nRMSE 0.079 | 5 |
| PDE solving | PDEBench Diffusion Sorption (1D) | nRMSE 0.0016 | 5 |
| PDE solving | PDEBench Diffusion Reaction (1D) | nRMSE 0.0028 | 5 |
| Aggregate performance ranking | PDEBench Multiple Tasks | Avg Rank 1.875 | 5 |

Showing 10 of 19 rows.
