Forget Less by Learning Together through Concept Consolidation
About
Custom Diffusion Models (CDMs) have gained significant attention due to their remarkable ability to personalize generative processes. However, existing CDMs suffer from catastrophic forgetting when continuously learning new concepts. Most prior works attempt to mitigate this issue under a sequential learning setting with a fixed order of concept inflow, neglecting inter-concept interactions. In this paper, we propose a novel framework, Forget Less by Learning Together (FL2T), that enables concurrent and order-agnostic concept learning while addressing catastrophic forgetting. Specifically, we introduce a set-invariant inter-concept learning module in which proxies guide feature selection across concepts, facilitating improved knowledge retention and transfer. By leveraging inter-concept guidance, our approach preserves old concepts while efficiently incorporating new ones. Extensive experiments across three datasets demonstrate that our method significantly improves concept retention and mitigates catastrophic forgetting, highlighting the effectiveness of inter-concept catalytic behavior in incremental concept learning over ten tasks, with a gain of at least 2% in average CLIP Image Alignment score.
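The abstract describes a set-invariant module where proxies guide feature selection across concepts, so the result does not depend on the order in which concepts arrive. As a rough illustration only (the function names, the dot-product attention form, and all shapes below are assumptions, not the authors' implementation), one permutation-invariant way to realize proxy-guided selection is:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def proxy_guided_pool(concept_feats, proxies):
    """Hypothetical sketch of proxy-guided, set-invariant selection.

    concept_feats: (n_concepts, d) array, one feature row per concept.
    proxies:       (k, d) array of proxy vectors guiding the selection.
    Returns a (k, d) array: each proxy attends over the whole concept
    set, so the output is invariant to the ordering of the concepts.
    """
    attn = softmax(proxies @ concept_feats.T, axis=-1)  # (k, n_concepts)
    return attn @ concept_feats                         # (k, d)
```

Because each proxy's attention weights sum over the entire concept set, permuting the rows of `concept_feats` leaves the pooled output unchanged, which is the order-agnostic property the abstract emphasizes.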
Related benchmarks
| Task | Dataset | Metric | Score | Rank |
|---|---|---|---|---|
| Multi-concept Customization | CIFC (test) | IMS | 89.5 | 22 |
| Multi-concept Customization | ImageNet (INet) (test) | IMS | 95.4 | 22 |
| Multi-concept Customization | CelebA (test) | IMS | 70.2 | 22 |
| Continual Image Personalization | CIFC (test) | IA (V1) | 84.4 | 9 |
| Continual Image Personalization | CelebA (test) | IA (V1) | 76.4 | 2 |
| Continual Image Personalization | ImageNet (test) | IA (V1) | 83.3 | 2 |
| Multi-concept Customization | CIFC V1 | IA | 84.4 | 2 |
| Multi-concept Customization | CIFC V2 | IA | 86.1 | 2 |
| Multi-concept Customization | CIFC V3 | IA | 84.4 | 2 |
| Multi-concept Customization | CIFC V4 | IA | 82.2 | 2 |