Continual Learning of Recurrent Neural Networks by Locally Aligning Distributed Representations
Temporal models based on recurrent neural networks have proven to be quite powerful in a wide variety of applications, including language modeling and speech processing. However, training these models often relies on backpropagation through time (BPTT), which entails unfolding the network over many time steps, making the process of conducting …