MABEL: Attenuating Gender Bias using Textual Entailment Data

Pre-trained language models encode undesirable social biases, which are further exacerbated in downstream use. To address this, we propose MABEL (a Method for Attenuating Gender Bias using Entailment Labels), an intermediate pre-training approach for mitigating gender bias in contextualized representations. Key to our approach is the use of a contrastive …
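The abstract is cut off mid-sentence, but the objective it refers to is a contrastive loss computed over entailment pairs. The sketch below is only an illustrative assumption of what such an objective could look like (an InfoNCE-style loss with in-batch negatives); the encoder, temperature, and batch construction are placeholders, not the paper's exact formulation.

```python
# Minimal sketch (assumed, not the authors' implementation): an InfoNCE-style
# contrastive loss where each premise embedding should be closest to the
# embedding of its entailed hypothesis, with other hypotheses in the batch
# serving as negatives.
import torch
import torch.nn.functional as F

def entailment_contrastive_loss(premise_emb: torch.Tensor,
                                hypothesis_emb: torch.Tensor,
                                temperature: float = 0.05) -> torch.Tensor:
    # Normalize so the dot product equals cosine similarity.
    p = F.normalize(premise_emb, dim=-1)
    h = F.normalize(hypothesis_emb, dim=-1)
    # (batch, batch) similarity matrix; the diagonal holds the positive pairs.
    sim = p @ h.T / temperature
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)

# Example usage with random tensors standing in for encoder outputs.
premises = torch.randn(8, 768)
hypotheses = torch.randn(8, 768)
loss = entailment_contrastive_loss(premises, hypotheses)
```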