Linting Xue



All published works
- PaLM 2 Technical Report (2023). Rohan Anil, Andrew M. Dai, Orhan Fırat, Melvin Johnson, Dmitry Lepikhin, Alexandre Passos, Siamak Shakeri, Emanuel Taropa, Paige Bailey, Zhifeng Chen
- MaXM: Towards Multilingual Visual Question Answering (2023). Soravit Changpinyo, Linting Xue, Michal Yarom, Ashish V. Thapliyal, Idan Szpektor, Julien Amelot, Xi Chen, Radu Soricut
- Gemini: A Family of Highly Capable Multimodal Models (2023). Gemini Team, Rohan Anil, Sebastian Borgeaud, Jean-Baptiste Alayrac, Jiahui Yu, Radu Soricut, Johan Schalkwyk, Andrew M. Dai, Anja Hauth, Katie Millican
- ByT5: Towards a Token-Free Future with Pre-trained Byte-to-Byte Models (2022). Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel
- PaLI: A Jointly-Scaled Multilingual Language-Image Model (2022). Xi Chen, Xiao Wang, Soravit Changpinyo, AJ Piergiovanni, Piotr Padlewski, Daniel Salz, Sebastian Goodman, Adam Grycner, Basil Mustafa, Lucas Beyer
- MaXM: Towards Multilingual Visual Question Answering (2022). Soravit Changpinyo, Linting Xue, Idan Szpektor, Ashish V. Thapliyal, Julien Amelot, Xi Chen, Radu Soricut
- nmT5 - Is parallel data still relevant for pre-training massively multilingual language models? (2021). Mihir Kale, Aditya Siddhant, Noah Constant, Melvin Johnson, Rami Al-Rfou, Linting Xue
- mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer (2021). Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel
- Towards Zero-Shot Multilingual Synthetic Question and Answer Generation for Cross-Lingual Reading Comprehension (2021). Siamak Shakeri, Noah Constant, Mihir Kale, Linting Xue
- nmT5 - Is parallel data still relevant for pre-training massively multilingual language models? (2021). Mihir Kale, Aditya Siddhant, Rami Al-Rfou, Linting Xue, Noah Constant, Melvin Johnson
- ByT5: Towards a token-free future with pre-trained byte-to-byte models (2021). Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel
- Multilingual Synthetic Question and Answer Generation for Cross-Lingual Reading Comprehension (2020). Siamak Shakeri, Noah Constant, Mihir Kale, Linting Xue
- mT5: A massively multilingual pre-trained text-to-text transformer (2020). Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel
- A Social Network Analysis on Blended Courses (2017). Niki Gitinabard, Linting Xue, Collin F. Lynch, Sarah Heckman, Tiffany Barnes
Commonly Cited References
- SQuAD: 100,000+ Questions for Machine Comprehension of Text (2016). Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang. Referenced 6 times.
- Multilingual Denoising Pre-training for Neural Machine Translation (2020). Yinhan Liu, Jiatao Gu, Naman Goyal, Xian Li, Sergey Edunov, Marjan Ghazvininejad, Mike Lewis, Luke Zettlemoyer. Referenced 5 times.
- On the Cross-lingual Transferability of Monolingual Representations (2020). Mikel Artetxe, Sebastian Ruder, Dani Yogatama. Referenced 5 times.
- FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding (2021). Yuwei Fang, Shuohang Wang, Zhe Gan, Siqi Sun, Jingjing Liu. Referenced 4 times.
- Unsupervised Cross-lingual Representation Learning at Scale (2020). Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Édouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov. Referenced 4 times.
- Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks (2020). Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith. Referenced 4 times.
- XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization (2020). Junjie Hu, Sebastian Ruder, Aditya Siddhant, Graham Neubig, Orhan Fırat, Melvin Johnson. Referenced 4 times.
- VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation (2021). Fuli Luo, Wei Wang, Jiahao Liu, Yijia Liu, Bin Bi, Songfang Huang, Fei Huang, Luo Si. Referenced 4 times.
- PAWS-X: A Cross-lingual Adversarial Dataset for Paraphrase Identification (2019). Yinfei Yang, Yuan Zhang, Chris Tar, Jason Baldridge. Referenced 3 times.
- Synthetic QA Corpora Generation with Roundtrip Consistency (2019). Chris Alberti, Daniel Andor, Emily Pitler, Jacob Devlin, Michael Collins. Referenced 3 times.
- SentencePiece: A simple and language independent subword tokenizer and detokenizer for Neural Text Processing (2018). Taku Kudo, John Richardson. Referenced 3 times.
- Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning (2020). Zihan Liu, Genta Indra Winata, Andrea Madotto, Pascale Fung. Referenced 3 times.
- XNLI: Evaluating Cross-lingual Sentence Representations (2018). Alexis Conneau, Ruty Rinott, Guillaume Lample, Adina Williams, Samuel Bowman, Holger Schwenk, Veselin Stoyanov. Referenced 3 times.
- Attention is All you Need (2017). Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin. Referenced 3 times.
- Subword Regularization: Improving Neural Network Translation Models with Multiple Subword Candidates (2018). Taku Kudo. Referenced 3 times.
- mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer (2021). Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel. Referenced 3 times.
- Cross-lingual Language Model Pretraining (2019). Guillaume Lample, Alexis Conneau. Referenced 3 times.
- MLQA: Evaluating Cross-lingual Extractive Question Answering (2020). Patrick Lewis, Barlas Oğuz, Ruty Rinott, Sebastian Riedel, Holger Schwenk. Referenced 3 times.
- How Much Knowledge Can You Pack Into the Parameters of a Language Model? (2020). Adam Roberts, Colin Raffel, Noam Shazeer. Referenced 2 times.
- Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation (2017). Melvin Johnson, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, Fernanda Viégas, Martin Wattenberg, Greg S. Corrado. Referenced 2 times.
- PhoBERT: Pre-trained language models for Vietnamese (2020). Dat Quoc Nguyen, Anh Tuan Nguyen. Referenced 2 times.
- mT5: A massively multilingual pre-trained text-to-text transformer (2020). Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel. Referenced 2 times.
- Playing with Words at the National Library of Sweden: Making a Swedish BERT (2021). Martin Malmsten, Love Börjeson, Chris Haffenden. Referenced 2 times.
- Unifying Question Answering and Text Classification via Span Extraction (2019). Nitish Shirish Keskar, Bryan McCann, Caiming Xiong, Richard Socher. Referenced 2 times.
- PTT5: Pretraining and validating the T5 model on Brazilian Portuguese data (2020). Diedre Carmo, Marcos Piau, Israel Campiotti, Rodrigo Nogueira, Roberto Lotufo. Referenced 2 times.
- Leveraging Passage Retrieval with Generative Models for Open Domain Question Answering (2020). Gautier Izacard, Édouard Grave. Referenced 2 times.
- InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training (2020). Zewen Chi, Li Dong, Furu Wei, Nan Yang, Saksham Singhal, Wenhui Wang, Xia Song, Xian-Ling Mao, Heyan Huang, Ming Zhou. Referenced 2 times.
- ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora (2021). Xuan Ouyang, Shuohuan Wang, Chao Pang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang. Referenced 2 times.
- UNIFIEDQA: Crossing Format Boundaries with a Single QA System (2020). Daniel Khashabi, Sewon Min, Tushar Khot, Ashish Sabharwal, Oyvind Tafjord, Peter Clark, Hannaneh Hajishirzi. Referenced 2 times.
- Document Ranking with a Pretrained Sequence-to-Sequence Model (2020). Rodrigo Nogueira, Zhiying Jiang, Ronak Pradeep, Jimmy Lin. Referenced 2 times.
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (2020). Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Veselin Stoyanov, Luke Zettlemoyer. Referenced 2 times.
- Improving Massively Multilingual Neural Machine Translation and Zero-Shot Translation (2020). Biao Zhang, Philip Williams, Ivan Titov, Rico Sennrich. Referenced 2 times.
- WT5?! Training Text-to-Text Models to Explain their Predictions (2020). Sharan Narang, Colin Raffel, Katherine Lee, Adam Roberts, Noah Fiedel, Karishma Malkan. Referenced 2 times.
- Universal Language Model Fine-tuning for Text Classification (2018). Jeremy Howard, Sebastian Ruder. Referenced 2 times.
- GLU Variants Improve Transformer (2020). Noam Shazeer. Referenced 2 times.
- FlauBERT: Unsupervised Language Model Pre-training for French (2019). Hang Le, Loïc Vial, Jibril Frej, Vincent Segonne, Maximin Coavoux, Benjamin Lecouteux, Alexandre Allauzen, Benoît Crabbé, Laurent Besacier, Didier Schwab. Referenced 2 times.
- BERTje: A Dutch BERT Model (2019). Wietse de Vries, Andreas van Cranenburgh, Arianna Bisazza, Tommaso Caselli, Gertjan van Noord, Malvina Nissim. Referenced 2 times.
- CamemBERT: a Tasty French Language Model (2020). Louis Martin, Benjamin Müller, Pedro Ortiz Suarez, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Djamé Seddah, Benoît Sagot. Referenced 2 times.
- Evaluating the Cross-Lingual Effectiveness of Massively Multilingual Neural Machine Translation (2020). Aditya Siddhant, Melvin Johnson, Henry Tsai, Naveen Arivazhagan, Jason Riesa, Ankur Bapna, Orhan Fırat, Karthik Raman. Referenced 2 times.
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (2019). Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer. Referenced 2 times.
- The Natural Language Decathlon: Multitask Learning as Question Answering (2018). Bryan McCann, Nitish Shirish Keskar, Caiming Xiong, Richard Socher. Referenced 2 times.
- Learning to Answer by Learning to Ask: Getting the Best of GPT-2 and BERT Worlds (2019). Tassilo Klein, Moin Nabi. Referenced 2 times.
- RobBERT: a Dutch RoBERTa-based Language Model (2020). Pieter Delobelle, Thomas Winters, Bettina Berendt. Referenced 2 times.
- RoBERTa: A Robustly Optimized BERT Pretraining Approach (2019). Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov. Referenced 2 times.
- Well-Read Students Learn Better: On the Importance of Pre-training Compact Models (2019). Iulia Turc, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. Referenced 2 times.
- Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges (2019). Naveen Arivazhagan, Ankur Bapna, Orhan Fırat, Dmitry Lepikhin, Melvin Johnson, Maxim Krikun, Mia Xu Chen, Yuan Cao, George Foster, Colin Cherry. Referenced 2 times.
- MLQA: Evaluating Cross-lingual Extractive Question Answering (2019). Patrick Lewis, Barlas Oğuz, Ruty Rinott, Sebastian Riedel, Holger Schwenk. Referenced 2 times.
- Multilingual Denoising Pre-training for Neural Machine Translation (2020). Yinhan Liu, Jiatao Gu, Naman Goyal, Xian Li, Sergey Edunov, Marjan Ghazvininejad, Mike Lewis, Luke Zettlemoyer. Referenced 2 times.
- MASS: Masked Sequence to Sequence Pre-training for Language Generation (2019). Kaitao Song, Xu Tan, Tao Qin, Jianfeng Lu, Tie-Yan Liu. Referenced 2 times.
- Pre-training via Paraphrasing (2020). Mike Lewis, Marjan Ghazvininejad, Gargi Ghosh, Armen Aghajanyan, Sida Wang, Luke Zettlemoyer. Referenced 2 times.