Keming Chen
All published works
What kind of surface is required for algal adhesion: impact of surface properties on microalgal cell–solid substrate interactions (2021). Weida Zeng, Keming Chen, Yun Huang, Ao Xia, Xianqing Zhu, Qiang Liao.

Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese (2021). Zhuosheng Zhang, Hanqing Zhang, Keming Chen, Yuhang Guo, Jingyun Hua, Yulong Wang, Ming Zhou.
Common Coauthors
Coauthor (papers together): Xianqing Zhu (1), Ao Xia (1), Zhuosheng Zhang (1), Jingyun Hua (1), Qiang Liao (1), Yulong Wang (1), Ming Zhou (1), Hanqing Zhang (1), Yuhang Guo (1), Yun Huang (1), Weida Zeng (1).
Commonly Cited References
A large annotated corpus for learning natural language inference (2015). Samuel R. Bowman, Gabor Angeli, Christopher Potts, Christopher D. Manning. Referenced 1 time.

Large-Scale Datasets for Going Deeper in Image Understanding (2019). Jiahong Wu, Zheng He, Bo Zhao, Yixin Li, Baoming Yan, Rui Liang, Wenjia Wang, Shipei Zhou, Guosen Lin, Yanwei Fu. Referenced 1 time.

XNLI: Evaluating Cross-lingual Sentence Representations (2018). Alexis Conneau, Ruty Rinott, Guillaume Lample, Adina Williams, Samuel R. Bowman, Holger Schwenk, Veselin Stoyanov. Referenced 1 time.

Probing Prior Knowledge Needed in Challenging Chinese Machine Reading Comprehension (2019). Kai Sun, Dian Yu, Dong Yu, Claire Cardie. Referenced 1 time.

ChID: A Large-scale Chinese IDiom Dataset for Cloze Test (2019). Chujie Zheng, Minlie Huang, Aixin Sun. Referenced 1 time.
Deep Contextualized Word Representations (2018). Matthew E. Peters, Mark E Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer. Referenced 1 time.

The RepEval 2017 Shared Task: Multi-Genre Natural Language Inference with Sentence Representations (2017). Nikita Nangia, Adina Williams, Angeliki Lazaridou, Samuel Bowman. Referenced 1 time.

A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference (2018). Adina Williams, Nikita Nangia, Samuel Bowman. Referenced 1 time.

RoBERTa: A Robustly Optimized BERT Pretraining Approach (2019). Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov. Referenced 1 time.

XLNet: Generalized Autoregressive Pretraining for Language Understanding (2019). Zhilin Yang, Zihang Dai, Yiming Yang, Jaime Carbonell, Ruslan Salakhutdinov, Quoc V. Le. Referenced 1 time.

NEZHA: Neural Contextualized Representation for Chinese Language Understanding (2019). Victor Junqiu Wei, Xiaozhe Ren, Xiaoguang Li, Wenyong Huang, Yi Liao, Yasheng Wang, Jiashu Lin, Xin Jiang, Xiao Chen, Qun Liu. Referenced 1 time.

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (2019). Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf. Referenced 1 time.

Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension (2019). Kai Sun, Dian Yu, Dong Yu, Claire Cardie. Referenced 1 time.

Large Batch Optimization for Deep Learning: Training BERT in 76 minutes (2019). Yang You, Jing Li, Sashank J. Reddi, Jonathan Hseu, Sanjiv Kumar, Srinadh Bhojanapalli, Xiaodan Song, James Demmel, Kurt Keutzer, Cho-Jui Hsieh. Referenced 1 time.

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (2019). Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut. Referenced 1 time.

Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT (2020). Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Zhewei Yao, Amir Gholami, Michael W. Mahoney, Kurt Keutzer. Referenced 1 time.

What BERT Is Not: Lessons from a New Suite of Psycholinguistic Diagnostics for Language Models (2020). Allyson Ettinger. Referenced 1 time.

MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers (2020). Wenhui Wang, Furu Wei, Li Dong, Hangbo Bao, Nan Yang, Ming Zhou. Referenced 1 time.

SpanBERT: Improving Pre-training by Representing and Predicting Spans (2020). Mandar Joshi, Danqi Chen, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy. Referenced 1 time.

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (2020). Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Veselin Stoyanov, Luke Zettlemoyer. Referenced 1 time.

SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization (2020). Haoming Jiang, Pengcheng He, Weizhu Chen, Xiaodong Liu, Jianfeng Gao, Tuo Zhao. Referenced 1 time.

Compressing BERT: Studying the Effects of Weight Pruning on Transfer Learning (2020). Mitchell Gordon, Kevin Duh, Nicholas Andrews. Referenced 1 time.

OCNLI: Original Chinese Natural Language Inference (2020). Hai Hu, Kyle Richardson, Liang Xu, Lu Li, Sandra Kübler, Lawrence S. Moss. Referenced 1 time.
BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (2020). Canwen Xu, Wangchunshu Zhou, Tao Ge, Furu Wei, Ming Zhou. Referenced 1 time.

Revisiting Pre-Trained Models for Chinese Natural Language Processing (2020). Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Shijin Wang, Guoping Hu. Referenced 1 time.

TinyBERT: Distilling BERT for Natural Language Understanding (2020). Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Dong Chen, Linlin Li, Fang Wang, Qun Liu. Referenced 1 time.

A Span-Extraction Dataset for Chinese Machine Reading Comprehension (2019). Yiming Cui, Ting Liu, Wanxiang Che, Xiao Li, Zhipeng Chen, Wentao Ma, Shijin Wang, Guoping Hu. Referenced 1 time.

SG-Net: Syntax Guided Transformer for Language Representation (2020). Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang. Referenced 1 time.

A Primer in BERTology: What We Know About How BERT Works (2020). Anna Rogers, Olga Kovaleva, Anna Rumshisky. Referenced 1 time.

PanGu-α: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation (2021). Wei Zeng, Xiaozhe Ren, Teng Su, Hui Wang, Yi Liao, Zhiwei Wang, Xin Jiang, ZhenZhang Yang, Kaisheng Wang, Xiaoda Zhang. Referenced 1 time.

VinVL: Revisiting Visual Representations in Vision-Language Models (2021). Pengchuan Zhang, Xiujun Li, Xiaowei Hu, Jianwei Yang, Lei Zhang, Lijuan Wang, Yejin Choi, Jianfeng Gao. Referenced 1 time.

Syntax-Enhanced Pre-trained Model (2021). Zenan Xu, Daya Guo, Duyu Tang, Qinliang Su, Linjun Shou, Ming Gong, Wanjun Zhong, Xiaojun Quan, Daxin Jiang, Nan Duan. Referenced 1 time.

Retrospective Reader for Machine Reading Comprehension (2020). Zhuosheng Zhang, Junjie Yang, Hai Zhao. Referenced 1 time.

CPM: A large-scale generative Chinese Pre-trained language model (2021). Zhengyan Zhang, Xu Han, Hao Zhou, Pei Ke, Yuxian Gu, Deming Ye, Yujia Qin, Yusheng Su, Haozhe Ji, Jian Guan. Referenced 1 time.