Haiqi Jiang
All published works

- Sci-CoT: Leveraging Large Language Models for Enhanced Knowledge Distillation in Small Models for Scientific QA (2023). Yuhan Ma, Chenyou Fan, Haiqi Jiang.
Common Coauthors

Coauthor      Papers together
Yuhan Ma      2
Chenyou Fan   2
Commonly Cited References

- Patient Knowledge Distillation for BERT Model Compression (2019). Siqi Sun, Yu Cheng, Zhe Gan, Jun Liu. Referenced 1 time.
- TinyBERT: Distilling BERT for Natural Language Understanding (2020). Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Dong Chen, Linlin Li, Fang Wang, Qun Liu. Referenced 1 time.
- Distilling Step-by-Step! Outperforming Larger Language Models with Less Training Data and Smaller Model Sizes (2023). Cheng-Yu Hsieh, Chun-Liang Li, Chih-Kuan Yeh, Hootan Nakhost, Yasuhisa Fujii, Alex Ratner, Ranjay Krishna, Chen-Yu Lee, Tomas Pfister. Referenced 1 time.
- Large Language Models Are Reasoning Teachers (2023). Namgyu Ho, Laura Schmid, Se-Young Yun. Referenced 1 time.
- Teaching Small Language Models to Reason (2023). Lucie Charlotte Magister, Jonathan Mallinson, Jakub Adámek, Eric Malmi, Aliaksei Severyn. Referenced 1 time.
- Federated Prompting and Chain-of-Thought Reasoning for Improving LLMs Answering (2023). Xiangyang Liu, Tianqi Pang, Chenyou Fan. Referenced 1 time.