
Invariant Learning via Probability of Sufficient and Necessary Causes

About

Out-of-distribution (OOD) generalization is indispensable for learning models in the wild, where the testing distribution is typically unknown and different from the training distribution. Recent methods derived from causality have shown great potential in achieving OOD generalization. However, existing methods mainly focus on the invariance property of causes, while largely overlooking the sufficiency and necessity conditions. Namely, a necessary but insufficient cause (feature) is invariant to distribution shift, yet it may not achieve the required accuracy. By contrast, a sufficient yet unnecessary cause (feature) tends to fit specific data well but risks failing to adapt to new domains. To capture the information of sufficient and necessary causes, we employ a classical concept, the probability of sufficiency and necessity (PNS), which indicates the probability that one variable is the necessary and sufficient cause of another. To associate PNS with OOD generalization, we propose a PNS risk and formulate an algorithm to learn representations with high PNS values. We theoretically analyze and prove the generalizability of the PNS risk. Experiments on both synthetic and real-world benchmarks demonstrate the effectiveness of the proposed method. The implementation details can be found at the GitHub repository: https://github.com/ymy4323460/CaSN.
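To make the PNS concept concrete, here is a minimal sketch (not the paper's implementation) of Pearl's classical bounds on the probability of necessity and sufficiency for a binary cause and outcome, assuming the cause is exogenous so that interventional probabilities coincide with observed conditionals:

```python
def pns_bounds(p_y_do_x1: float, p_y_do_x0: float) -> tuple[float, float]:
    """Return (lower, upper) bounds on PNS for binary X and Y.

    p_y_do_x1: P(Y=1 | do(X=1))
    p_y_do_x0: P(Y=1 | do(X=0))

    Pearl's tight bounds:
        max(0, P(y_x) - P(y_{x'})) <= PNS <= min(P(y_x), 1 - P(y_{x'}))
    Under monotonicity, PNS equals the lower bound exactly.
    """
    lower = max(0.0, p_y_do_x1 - p_y_do_x0)
    upper = min(p_y_do_x1, 1.0 - p_y_do_x0)
    return lower, upper

# Example: a feature that strongly raises the outcome probability
lo, hi = pns_bounds(0.9, 0.2)
print(lo, hi)  # lower ≈ 0.7, upper = 0.8
```

A feature with a high PNS lower bound is both sufficient (turning it on makes the outcome likely) and necessary (turning it off makes the outcome unlikely), which is the property the proposed PNS risk encourages representations to have.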

Mengyue Yang, Zhen Fang, Yonggang Zhang, Yali Du, Furui Liu, Jean-Francois Ton, Jianhong Wang, Jun Wang • 2023

Related benchmarks

Task                  | Dataset            | Metric                    | Result | Rank
Domain Generalization | PACS               | Accuracy (Art)            | 87.1   | 221
Domain Generalization | VLCS               | Accuracy (L)              | 65.9   | 27
Image Classification  | Colored MNIST      | Accuracy (+90% Threshold) | 72.6   | 9
Domain Generalization | SpuCoAnimal (test) | Avg Accuracy              | 84.34  | 5
