Hyperparameter Optimization Is Deceiving Us, and How to Stop It

Type: Preprint

Publication Date: 2021-01-01

Citations: 7

DOI: https://doi.org/10.48550/arxiv.2102.03034

Locations

  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Hyperparameter Optimization Is Deceiving Us, and How to Stop It (2021): A. Feder Cooper, Yucheng Lu, Christopher De Sa
  • Hyperparameter Optimization: Foundations, Algorithms, Best Practices, and Open Challenges (2023): Bernd Bischl, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, Theresa Ullmann, Marc Becker, Anne‐Laure Boulesteix
  • Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges (2021): Bernd Bischl, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, Theresa Ullmann, Marc Becker, Anne‐Laure Boulesteix
  • Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers (2021): Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl
  • Automated Benchmark-Driven Design and Explanation of Hyperparameter Optimizers (2022): Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl
  • Expanding search in the space of empirical ML (2018): Bronwyn Woods
  • Automatic Termination for Hyperparameter Optimization (2021): Anastasia Makarova, Huibin Shen, Valerio Perrone, Aaron Klein, Jean Baptiste Faddoul, Andreas Krause, Matthias Seeger, Cédric Archambeau
  • Weighted Sampling for Combined Model Selection and Hyperparameter Tuning (2020): Dimitrios Sarigiannis, Thomas Parnell, Haralampos Pozidis
  • Weighted Sampling for Combined Model Selection and Hyperparameter Tuning (2019): Dimitrios Sarigiannis, Thomas Parnell, Haralampos Pozidis
  • Learning multiple defaults for machine learning algorithms (2021): Florian Pfisterer, Jan N. van Rijn, Philipp Probst, Andreas Müller, Bernd Bischl
  • SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization (2021): Marius Lindauer, Katharina Eggensperger, Matthias Feurer, André Biedenkapp, Difan Deng, Carolin Benjamins, Tim Ruhopf, René Sass, Frank Hutter
  • Practitioner Motives to Select Hyperparameter Optimization Methods (2022): Niklas Hasebrook, Felix Morsbach, Niclas Kannengießer, Jörg Franke, Frank Hutter, Ali Sunyaev
  • PyHopper -- Hyperparameter optimization (2022): Mathias Lechner, Ramin Hasani, Philipp Neubauer, Sophie Neubauer, Daniela Rus
  • FLO: Fast and Lightweight Hyperparameter Optimization for AutoML (2019): Chi Wang, Qingyun Wu
  • MANGO: A Python Library for Parallel Hyperparameter Tuning (2020): Sandeep Singh Sandha, Mohit Aggarwal, I. A. Fedorov, Mani Srivastava
  • HPO-B: A Large-Scale Reproducible Benchmark for Black-Box HPO based on OpenML (2021): Sebastian Pineda Arango, Hadi S. Jomaa, Martin Wistuba, Josif Grabocka

Works Cited by This (0)
