
PPSEBM: An Energy-Based Model with Progressive Parameter Selection for Continual Learning

About

Continual learning remains a fundamental challenge in machine learning, requiring models to learn from a stream of tasks without forgetting previously acquired knowledge. A major obstacle in this setting is catastrophic forgetting, where performance on earlier tasks degrades as new tasks are learned. In this paper, we introduce PPSEBM, a novel framework that integrates an Energy-Based Model (EBM) with Progressive Parameter Selection (PPS) to effectively address catastrophic forgetting in continual learning for natural language processing tasks. In PPSEBM, progressive parameter selection allocates distinct, task-specific parameters for each new task, while the EBM generates representative pseudo-samples from prior tasks. These generated samples actively inform and guide the parameter selection process, enhancing the model's ability to retain past knowledge while adapting to new tasks. Experimental results on diverse NLP benchmarks demonstrate that PPSEBM outperforms state-of-the-art continual learning methods, offering a promising and robust solution to mitigate catastrophic forgetting.
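To make the two components concrete, here is a minimal toy sketch of the mechanism the abstract describes: progressive parameter selection that claims a disjoint subset of parameters for each new task, plus an energy-based model used to draw pseudo-samples of past data. All names, shapes, and the quadratic energy function are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
# Toy sketch (assumed, not the paper's code): disjoint per-task parameter
# masks + pseudo-sample generation by descending a simple EBM energy.
import numpy as np

rng = np.random.default_rng(0)

N_PARAMS = 8                             # toy parameter-vector size
params = rng.normal(size=N_PARAMS)       # shared parameter vector
free = np.ones(N_PARAMS, dtype=bool)     # parameters not yet claimed by any task
task_masks = {}

def select_parameters(task_id, k):
    """Progressive Parameter Selection (toy): claim k still-free parameters
    for a new task, so earlier tasks' parameters are never overwritten."""
    idx = np.flatnonzero(free)[:k]
    mask = np.zeros(N_PARAMS, dtype=bool)
    mask[idx] = True
    free[idx] = False
    task_masks[task_id] = mask
    return mask

def energy(x, w):
    """Toy EBM energy: low energy for inputs resembling past-task data
    (here summarized by the parameter vector w)."""
    return float(np.sum((x - w) ** 2))

def pseudo_sample(w, steps=50, lr=0.1):
    """Generate a representative pseudo-sample by gradient descent on the
    energy, a crude stand-in for the EBM's sampling procedure."""
    x = rng.normal(size=w.shape)
    for _ in range(steps):
        x -= lr * 2 * (x - w)            # gradient of the quadratic energy
    return x

mask_t0 = select_parameters(0, 4)
mask_t1 = select_parameters(1, 4)
assert not np.any(mask_t0 & mask_t1)     # task masks are disjoint
replay = pseudo_sample(params)           # low-energy pseudo-sample for replay
```

In the full method, such pseudo-samples would inform which parameters the selection step allocates for a new task; the sketch only illustrates the disjoint-mask and energy-minimization ideas in isolation.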

Xiaodi Li, Dingcheng Li, Rujun Gao, Mahmoud Zamani, Feng Mi, Latifur Khan • 2025

Related benchmarks

Task | Dataset | Result | Rank
Text Classification | AGNews, Amazon, DBPedia, Yahoo, and Yelp (test) | Exact Match (EM): 80.8 | 55
Lifelong Learning | SST, QA-SRL, and WOZ Permuted Sequences GPT-2 models (test) | Accuracy (SRL/WOZ/SST): 82.2 | 28
Continual Learning | SelfRC, TweetQA, and SST sequence SQuAD format (test) | Average EM: 80.9 | 16
Multitask Natural Language Processing | DecaNLP SQuAD 2.0, WikiSQL, SST, QA-SRL, WOZ (test) | Average Score: 77.4 | 11
