Transformer Based Grapheme-to-Phoneme Conversion

The attention mechanism is one of the most successful techniques in deep learning based Natural Language Processing (NLP). The transformer network architecture is based entirely on attention mechanisms, and it outperforms sequence-to-sequence models in neural machine translation without recurrent or convolutional layers. Grapheme-to-phoneme (G2P) conversion is the task of converting letters (a grapheme sequence) …
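The building block the abstract refers to, scaled dot-product attention, can be illustrated with a minimal pure-Python sketch for a single query vector (a toy illustration only, not the paper's implementation; the function names here are our own):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query.

    query:  vector of dimension d (list[float])
    keys:   list of key vectors, each of dimension d
    values: list of value vectors, one per key
    Returns the attention-weighted sum of the value vectors.
    """
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted combination of the values
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(dim)]
```

In a transformer-based G2P model, stacks of such attention layers (with learned query, key, and value projections) map a grapheme sequence to a phoneme sequence without any recurrence, which is the architectural point the abstract highlights.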