
Learning without Forgetting

About

When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises when we add new capabilities to a Convolutional Neural Network (CNN) but the training data for its existing capabilities is unavailable. We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities. Our method performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques, and performs similarly to multitask learning that uses the original task data we assume unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning when the old and new task datasets are similar, yielding improved new task performance.
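The training objective the abstract describes combines a standard new-task loss with a knowledge-distillation term that keeps the network's old-task outputs close to responses recorded before training on the new task. The sketch below is a minimal NumPy illustration of that idea, not the paper's implementation; the function names and the default weight `lam` are assumptions (the paper softens outputs with a temperature, T = 2).

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def lwf_loss(old_logits_rec, old_logits_cur, new_logits, new_label,
             T=2.0, lam=1.0):
    """Learning-without-Forgetting-style loss for one example:
    new-task cross-entropy plus a distillation term penalizing drift
    of the old-task outputs from their pre-training recordings."""
    # New-task cross-entropy against the ground-truth label.
    p_new = softmax(new_logits)
    ce_new = -np.log(p_new[new_label])
    # Distillation: soften both the recorded and the current old-task
    # outputs with temperature T, then take their cross-entropy.
    q_rec = softmax(old_logits_rec, T)  # recorded before new-task training
    q_cur = softmax(old_logits_cur, T)  # current network's old-task outputs
    kd = -np.sum(q_rec * np.log(q_cur))
    return ce_new + lam * kd
```

If the current old-task outputs match the recorded ones, the distillation term reduces to the (constant) entropy of the recorded distribution; any drift strictly increases it, which is what preserves the original capabilities while only new-task data is used.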

Zhizhong Li, Derek Hoiem • 2016

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Person Re-Identification | Market 1501 | mAP: 65.1 | 1071 |
| Image Classification | CIFAR-100 | Accuracy: 75.67 | 691 |
| Person Re-Identification | MSMT17 | mAP: 0.094 | 514 |
| Depth Estimation | NYU v2 (test) | -- | 432 |
| Classification | Cars | Accuracy: 74 | 395 |
| Image Classification | CIFAR-100 | Accuracy: 68.22 | 302 |
| Person Re-Identification | CUHK03 | R1: 53.6 | 284 |
| Image Classification | CUB | Accuracy: 69.34 | 282 |
| Semantic Segmentation | NYU v2 (test) | mIoU: 38.06 | 282 |
| Surface Normal Estimation | NYU v2 (test) | Mean Angle Distance (MAD): 31.84 | 224 |

Showing 10 of 555 rows.
