Small Errors in Random Zeroth-Order Optimization Are Imaginary

The vast majority of zeroth-order optimization methods try to imitate first-order methods via some smooth approximation of the gradient. Here, the smaller the smoothing parameter, the smaller the gradient approximation error. We show that for the majority of zeroth-order methods this smoothing parameter cannot, however, be …
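For context, here is a minimal sketch of the kind of estimator the abstract refers to: the classical randomized (Gaussian-smoothing) two-point gradient estimator. The function name `smoothed_grad_estimate` and its parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

def smoothed_grad_estimate(f, x, mu=1e-3, num_samples=100, rng=None):
    """Monte Carlo estimate of the Gaussian-smoothed gradient of f at x.

    Averages (f(x + mu*u) - f(x)) / mu * u over random Gaussian
    directions u; the average approximates grad f(x) as mu -> 0.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    g = np.zeros(d)
    fx = f(x)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - fx) / mu * u
    return g / num_samples

# Example: for f(x) = x^T x, the true gradient is 2x.
f = lambda x: float(x @ x)
x = np.array([1.0, -2.0, 0.5])
print(smoothed_grad_estimate(f, x, mu=1e-3, num_samples=5000))  # roughly [2., -4., 1.]
```

The tension the abstract points to is visible in the difference quotient: shrinking `mu` reduces the smoothing bias, but in finite-precision arithmetic the subtraction `f(x + mu*u) - f(x)` eventually loses accuracy to cancellation, so `mu` cannot be driven arbitrarily small in such real-valued estimators.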