
Improving Compositional Generalization Using Iterated Learning and Simplicial Embeddings

About

Compositional generalization, the ability of an agent to generalize to unseen combinations of latent factors, is easy for humans but hard for deep neural networks. A line of research in cognitive science has hypothesized a process, "iterated learning," to help explain how human language developed this ability; the theory rests on simultaneous pressures towards compressibility (when an ignorant agent learns from an informed one) and expressivity (when it uses the representation for downstream tasks). Inspired by this process, we propose to improve the compositional generalization of deep networks by using iterated learning on models with simplicial embeddings, which can approximately discretize representations. This approach is further motivated by an analysis of compositionality based on Kolmogorov complexity. We show that this combination of changes improves compositional generalization over other approaches, demonstrating these improvements both on vision tasks with well-understood latent factors and on real molecular graph prediction tasks where the latent structure is unknown.
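The core mechanism, a simplicial embedding, splits a representation into groups and applies a softmax within each group, so that at low temperature each group approaches a one-hot "word." The sketch below is illustrative only (NumPy, hypothetical function name and signature, not the authors' implementation):

```python
import numpy as np

def simplicial_embedding(z, n_chunks, temperature=1.0):
    """Project a representation onto a product of simplices.

    Splits the last axis of z into n_chunks groups and applies a
    temperature-scaled softmax within each group, so each group
    approximately one-hot encodes a discrete "word" as temperature -> 0.
    """
    d = z.shape[-1]
    assert d % n_chunks == 0, "embedding dim must split evenly into chunks"
    groups = z.reshape(*z.shape[:-1], n_chunks, d // n_chunks)
    # numerically stabilized softmax within each chunk
    logits = groups / temperature
    logits = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(logits)
    probs = probs / probs.sum(axis=-1, keepdims=True)
    return probs.reshape(z.shape)

# Example: a 12-dim vector treated as 3 chunks of 4
rng = np.random.default_rng(0)
z = rng.normal(size=12)
e = simplicial_embedding(z, n_chunks=3, temperature=0.1)
print(e.reshape(3, 4).sum(axis=-1))  # each chunk sums to 1
```

At low temperature each chunk concentrates its mass on one entry, which is the approximate discretization the abstract refers to; iterated learning then alternates training a fresh "ignorant" model on these representations with fine-tuning on the downstream task.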

Yi Ren, Samuel Lavoie, Mikhail Galkin, Danica J. Sutherland, Aaron Courville • 2023

Related benchmarks

Task                          | Dataset                  | Metric  | Result | Rank
Graph Regression              | OGB-LSC PCQM4M v2 (val)  | MAE     | 0.098  | 81
Small molecule classification | OGBG-MOLHIV (test)       | ROC-AUC | 79.09  | 19
Binary Classification         | ogbg-molhiv full (val)   | AUROC   | 84.89  | 8
Binary Classification         | ogbg-molhiv half (val)   | AUROC   | 78.48  | 8
Binary Classification         | ogbg-molhiv half (test)  | AUROC   | 74.02  | 8
Binary Classification         | ogbg-molpcba full (val)  | AP      | 29.3   | 8
Binary Classification         | ogbg-molpcba full (test) | AP      | 28.02  | 8
Binary Classification         | ogbg-molpcba half (val)  | AP      | 24.41  | 8
Binary Classification         | ogbg-molpcba half (test) | AP      | 23.89  | 8
