Anti-Distillation: Improving reproducibility of deep networks

Type: Preprint

Publication Date: 2020-01-01

Citations: 4

DOI: https://doi.org/10.48550/arxiv.2010.09923

Locations

  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Anti-Distillation: Improving Reproducibility of Deep Networks (2021) - Gil I. Shamir, Lorenzo Coviello
  • Hydra: Preserving Ensemble Diversity for Model Distillation (2020) - Linh Tran, Bastiaan S. Veeling, Kevin A. Roth, Jakub Świątkowski, Joshua V. Dillon, Jasper Snoek, Stephan Mandt, Tim Salimans, Sebastian Nowozin, Rodolphe Jenatton
  • Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning (2020) - Zeyuan Allen-Zhu, Yuanzhi Li
  • Functional Ensemble Distillation (2022) - Coby Penso, Idan Achituve, Ethan Fetaya
  • Boosting the Cross-Architecture Generalization of Dataset Distillation through an Empirical Study (2023) - Lirui Zhao, Yuxin Zhang, Mingbao Lin, Fei Chao, Rongrong Ji
  • On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective (2020) - SeongUk Park, KiYoon Yoo, Nojun Kwak
  • Churn Reduction via Distillation (2021) - Heinrich Jiang, Harikrishna Narasimhan, Dara Bahri, Andrew Cotter, Afshin Rostamizadeh
  • Towards Mitigating Architecture Overfitting in Dataset Distillation (2023) - Xuyang Zhong, Chen Liu
  • Ensemble Distribution Distillation (2020) - Andrey Malinin, Bruno Mlodozeniec, Mark Gales
  • Ensemble Distribution Distillation (2019) - Andrey Malinin, Bruno Mlodozeniec, Mark Gales
  • A General Framework for Ensemble Distribution Distillation (2020) - Jakob Lindqvist, Amanda Olmin, Fredrik Lindsten, Lennart Svensson
  • On the Reproducibility of Neural Network Predictions (2021) - Srinadh Bhojanapalli, Michael J. Wilber, Andreas Veit, Ankit Singh Rawat, Seung‐Yeon Kim, Aditya Krishna Menon, Sanjiv Kumar
  • DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation (2021) - Alexandre Ramé, Matthieu Cord
  • Using Early Readouts to Mediate Featural Bias in Distillation (2023) - Rishabh Tiwari, Durga Sivasubramanian, Anmol Mekala, Ganesh Ramakrishnan, Pradeep Shenoy
  • Using Early Readouts to Mediate Featural Bias in Distillation (2024) - Rishabh Tiwari, Durga Sivasubramanian, Anmol Mekala, Ganesh Ramakrishnan, Pradeep Shenoy
  • Simple Regularisation for Uncertainty-Aware Knowledge Distillation (2022) - Martin Ferianc, Miguel Tréfaut Rodrigues
  • Deep Ensembles Work, But Are They Necessary? (2022) - Taiga Abe, E. Kelly Buchanan, Geoff Pleiss, Richard S. Zemel, John P. Cunningham
  • EnsembleNet: End-to-End Optimization of Multi-headed Models (2019) - Hanhan Li, Joe Yue-Hei Ng, Apostol Natsev
  • Self-Distribution Distillation: Efficient Uncertainty Estimation (2022) - Yassir Fathullah, Mark Gales

Works That Cite This (0)


Works Cited by This (0)
