Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting

In this paper we propose an approach to avoiding catastrophic forgetting in sequential task learning scenarios. Our technique is based on a network reparameterization that approximately diagonalizes the Fisher Information Matrix of the network parameters. This reparameterization takes the form of a factorized rotation of parameter space which, when used …
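For context on what "weight consolidation with a diagonal Fisher" means, here is a minimal sketch of the EWC-style baseline that such approaches build on: the diagonal of the empirical Fisher Information Matrix measures per-parameter importance, and a quadratic penalty anchored at the old-task weights discourages forgetting. This is an illustrative sketch of the standard diagonal-Fisher penalty, not the paper's rotation-based reparameterization; the logistic model and all function names here are assumptions for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fisher_diagonal(w, X, y):
    """Diagonal of the empirical Fisher for a logistic model:
    F_ii is the average squared per-sample gradient of the log-likelihood."""
    p = sigmoid(X @ w)
    grads = (y - p)[:, None] * X        # per-sample log-likelihood gradients
    return np.mean(grads ** 2, axis=0)  # square and average over the dataset

def ewc_penalty(w, w_star, fisher_diag, lam=1.0):
    """Quadratic consolidation penalty around the old-task weights w_star,
    weighting each parameter by its diagonal Fisher importance."""
    return 0.5 * lam * np.sum(fisher_diag * (w - w_star) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_star = np.array([1.0, -2.0, 0.5])          # weights learned on the old task
y = (sigmoid(X @ w_star) > 0.5).astype(float)
F = fisher_diagonal(w_star, X, y)
penalty = ewc_penalty(w_star + 0.1, w_star, F)
```

The diagonal approximation is exactly what the paper targets: when the true Fisher has strong off-diagonal structure, a per-parameter penalty misjudges importance, which motivates rotating parameter space so the Fisher becomes approximately diagonal before consolidating.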