Simplified minimal gated unit variations for recurrent neural networks

Recurrent neural networks with various types of hidden units have been used to solve a diverse range of problems involving sequence data. Two of the most recent proposals, the gated recurrent unit (GRU) and the minimal gated unit (MGU), have shown comparably promising results on public example datasets. In this paper, we …
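
For context, the standard MGU replaces the GRU's separate update and reset gates with a single forget gate. The sketch below shows one forward step of that baseline cell in plain NumPy; the function and parameter names (`mgu_step`, `W_f`, `U_f`, and so on) are illustrative assumptions, and this is the baseline MGU formulation, not the simplified variants this paper investigates.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mgu_step(x_t, h_prev, W_f, U_f, b_f, W_h, U_h, b_h):
    """One forward step of a standard minimal gated unit (MGU).

    The MGU uses a single forget gate f_t, in contrast to the GRU's
    separate update and reset gates.
    """
    # Forget gate (the MGU's only gate)
    f_t = sigmoid(W_f @ x_t + U_f @ h_prev + b_f)
    # Candidate state, with the gate applied to the previous hidden state
    h_tilde = np.tanh(W_h @ x_t + U_h @ (f_t * h_prev) + b_h)
    # Interpolate between the previous state and the candidate
    return (1.0 - f_t) * h_prev + f_t * h_tilde

# Example usage with random parameters (shapes are illustrative assumptions)
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
params = [rng.standard_normal(s) * 0.1 for s in
          [(n_hid, n_in), (n_hid, n_hid), (n_hid,),
           (n_hid, n_in), (n_hid, n_hid), (n_hid,)]]
h = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):  # a length-5 input sequence
    h = mgu_step(x, h, *params)
```

The simplified variants studied in work of this kind typically prune terms from the forget-gate equation (for example, dropping the input term or the recurrent term) to reduce parameter count while keeping the gated state update intact.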