Overcoming Catastrophic Forgetting With Unlabeled Data in the Wild
Lifelong learning with deep neural networks is well known to suffer from catastrophic forgetting: performance on previously learned tasks degrades drastically when a new task is learned. To alleviate this effect, we propose to leverage a large stream of unlabeled data that is easily obtainable in the wild. In particular, we design a novel …