Lifelong Pretraining: Continually Adapting Language Models to Emerging Corpora
Xisen Jin, Dejiao Zhang, Henghui Zhu, Wei Xiao, Shang-Wen Li, Xiaokai Wei, Andrew Arnold, Xiang Ren. Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2022.