A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned, and Perspectives

Modern natural language processing (NLP) methods employ self-supervised pretraining objectives such as masked language modeling to boost the performance of various downstream tasks. These pretraining methods are frequently extended with recurrence, adversarial training, or linguistic property masking. Recently, contrastive self-supervised training objectives have enabled successes in image representation pretraining by learning …
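
To make the contrastive idea concrete, the sketch below shows a generic in-batch contrastive (InfoNCE-style) loss of the kind used in image and text representation pretraining: each anchor embedding is pulled toward its matching "positive" view and pushed away from the other samples in the batch. This is an illustrative sketch, not the paper's specific method; the function name `info_nce_loss`, the temperature value, and the toy embeddings are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(anchor: torch.Tensor, positive: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """Contrast each anchor with its positive view; all other in-batch
    positives serve as negatives. Both inputs have shape (batch, dim)."""
    # L2-normalize so the dot product equals cosine similarity.
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    # Similarity matrix: entry (i, j) compares anchor i with positive j.
    logits = anchor @ positive.t() / temperature
    # The matching pair for anchor i sits on the diagonal (index i),
    # so the loss is cross-entropy against the diagonal targets.
    targets = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, targets)

# Toy usage: random embeddings and a lightly perturbed "augmented view".
a = torch.randn(8, 128)
b = a + 0.01 * torch.randn(8, 128)  # stand-in for an augmentation
print(info_nce_loss(a, b).item())
```

In practice the two views would come from data augmentation (e.g., cropped images or masked/perturbed text) passed through the same encoder, and the temperature is a tuned hyperparameter.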