Improving Multi-fidelity Optimization with a Recurring Learning Rate for Hyperparameter Tuning

Despite the evolution of Convolutional Neural Networks (CNNs), their performance remains surprisingly dependent on the choice of hyperparameters. However, it is still challenging to explore large hyperparameter search spaces efficiently due to the long training times of modern CNNs. Multi-fidelity optimization enables the exploration of more hyperparameter configurations within a given budget by …
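For intuition, here is a minimal sketch of a multi-fidelity scheme in the spirit of successive halving: many configurations are evaluated cheaply at a small budget, and only the most promising survivors are retrained with more budget. This is an illustrative assumption, not the paper's specific method; the `train_and_eval` objective, budget values, and reduction factor `eta` are placeholders.

```python
import random

def train_and_eval(config, budget):
    """Placeholder objective: in practice, train a CNN with `config`
    for `budget` epochs and return its validation accuracy."""
    return random.random() * budget / (budget + config["lr"])

def successive_halving(configs, min_budget=1, max_budget=27, eta=3):
    """Evaluate all configs at a small budget, keep the top 1/eta,
    then repeat with eta-times more budget until one config remains."""
    budget = min_budget
    while len(configs) > 1 and budget <= max_budget:
        scored = [(train_and_eval(c, budget), c) for c in configs]
        scored.sort(key=lambda s: s[0], reverse=True)  # higher score = better
        configs = [c for _, c in scored[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

if __name__ == "__main__":
    # Sample 27 candidate learning rates on a log scale (illustrative only).
    candidates = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(27)]
    print("best config:", successive_halving(candidates))
```

With a reduction factor of eta = 3, each round discards roughly two thirds of the candidates, so far more configurations can be screened than if every one were trained to full budget.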