To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks

Type: Article

Publication Date: 2019-01-01

Citations: 385

DOI: https://doi.org/10.18653/v1/w19-4302

Locations

  • arXiv (Cornell University)

Similar Works

  • To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks (2019). Matthew E. Peters, Sebastian Ruder, Noah A. Smith
  • Parameter-Efficient Transfer Learning for NLP (2019). Neil Houlsby, Andrei Giurgiu, Stanisław Jastrzȩbski, Bruna Morrone, Quentin de Laroussilhe, Andréa Gesmundo, Mona Attariyan, Sylvain Gelly
  • When to Use Multi-Task Learning vs Intermediate Fine-Tuning for Pre-Trained Encoder Transfer Learning (2022). Orion Weller, Kevin Seppi, Matt Gardner
  • On the Domain Adaptation and Generalization of Pretrained Language Models: A Survey (2022). Xu Guo, Han Yu
  • On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation (2021). Ruidan He, Linlin Liu, Hai Ye, Qingyu Tan, Bosheng Ding, Liying Cheng, Jia-Wei Low, Lidong Bing, Luo Si
  • Conditionally Adaptive Multi-Task Learning: Improving Transfer Learning in NLP Using Fewer Parameters & Less Data (2020). Jonathan Pilault, Amine Elhattami, Christopher Pal
  • AdapterFusion: Non-Destructive Task Composition for Transfer Learning (2020). Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, Kyunghyun Cho, Iryna Gurevych
  • Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks (2020). Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith
  • Parameter-Efficient Transfer Learning with Diff Pruning (2020). Demi Guo, Alexander M. Rush, Yoon Kim
  • Neural Unsupervised Domain Adaptation in NLP—A Survey (2020). Alan Ramponi, Barbara Plank
  • AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks (2022). Chin-Lun Fu, Zih-Ching Chen, Yun-Ru Lee, Hung-yi Lee