Multi-Scale Alignment and Contextual History for Attention Mechanism in Sequence-to-Sequence Model
A sequence-to-sequence model is a neural network that maps an input sequence to an output sequence of a possibly different length. It has three core modules: an encoder, a decoder, and an attention mechanism. Attention is the bridge that connects the encoder and decoder modules, and it improves model performance on many tasks. In this paper, we propose two …
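To make the encoder-decoder-attention structure described above concrete, the following is a minimal sketch of a sequence-to-sequence model with a standard content-based (dot-product) attention bridge. It is not the paper's proposed architecture; all module names, layer sizes, and the single-layer GRU choice are illustrative assumptions.

```
import torch
import torch.nn as nn
import torch.nn.functional as F


class Seq2SeqWithAttention(nn.Module):
    """Illustrative encoder-decoder model with dot-product attention."""

    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # The decoder consumes the previous target embedding concatenated
        # with the attention context vector at every step.
        self.decoder = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the full source sequence: enc_out is (B, S, H).
        enc_out, h = self.encoder(self.src_emb(src))
        logits = []
        for t in range(tgt.size(1)):
            # Use the current decoder state as the attention query: (B, 1, H).
            query = h[-1].unsqueeze(1)
            # Dot-product alignment scores over all encoder states: (B, 1, S).
            scores = torch.bmm(query, enc_out.transpose(1, 2))
            weights = F.softmax(scores, dim=-1)
            # Context vector: attention-weighted sum of encoder states.
            context = torch.bmm(weights, enc_out)
            step_in = torch.cat([self.tgt_emb(tgt[:, t:t + 1]), context], dim=-1)
            dec_out, h = self.decoder(step_in, h)
            logits.append(self.out(dec_out))
        return torch.cat(logits, dim=1)  # (B, T, tgt_vocab)


# Usage example with hypothetical vocabulary sizes and sequence lengths.
model = Seq2SeqWithAttention(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (2, 7))   # batch of 2 source sequences
tgt = torch.randint(0, 1000, (2, 5))   # teacher-forced target inputs
print(model(src, tgt).shape)           # torch.Size([2, 5, 1000])
```

In this sketch the attention weights are recomputed from scratch at every decoding step, which is exactly the single-scale, history-free baseline that multi-scale alignment and contextual history aim to improve upon.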