Attention-Guided Answer Distillation for Machine Reading Comprehension


Although current reading comprehension systems have achieved significant advances, their promising performance often comes at the cost of ensembling numerous models. Moreover, existing approaches are vulnerable to adversarial attacks. This paper tackles these problems by leveraging knowledge distillation, which aims to transfer knowledge from …
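The abstract is truncated here, but the core mechanism it names, knowledge distillation, can be sketched in a few lines. The following is a minimal illustration of the standard soft-target distillation loss (cross-entropy between temperature-softened teacher and student distributions), not the paper's attention-guided variant; the logit values and temperature are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the distribution by dividing logits by the temperature.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution (the
    # "soft targets") and the student's softened distribution, scaled
    # by T^2 so gradient magnitudes stay comparable to a hard-label loss.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    ce = -sum(t * math.log(s) for t, s in zip(p_teacher, p_student))
    return temperature ** 2 * ce

# Hypothetical example: a teacher (e.g. averaged ensemble logits)
# supervising a single student model.
teacher = [3.0, 1.0, 0.2]
student = [2.5, 1.2, 0.1]
loss = distillation_loss(student, teacher)
```

Minimizing this loss pushes the single student model toward the ensemble teacher's output distribution, which is how distillation trades an ensemble's accuracy for a single model's inference cost.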