Type: Article
Publication Date: 2022-01-01
Citations: 1
DOI: https://doi.org/10.1109/tsp.2022.3144939
We consider the problem of Bayesian optimization of a one-dimensional Brownian motion in which the $T$ adaptively chosen observations are corrupted by Gaussian noise. We show that the smallest possible expected cumulative regret and the smallest possible expected simple regret scale as $\Omega(\sigma \sqrt{T / \log T}) \cap \mathcal{O}(\sigma \sqrt{T} \cdot \log T)$ and $\Omega(\sigma / \sqrt{T \log T}) \cap \mathcal{O}(\sigma \log T / \sqrt{T})$ respectively, where $\sigma^2$ is the noise variance. Thus, our upper and lower bounds are tight up to a factor of $\mathcal{O}((\log T)^{1.5})$. The upper bound uses an algorithm based on confidence bounds and the Markov property of Brownian motion (among other useful properties), and the lower bound is based on a reduction to binary hypothesis testing.
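As a rough illustration of the confidence-bound idea in this setting (not the authors' algorithm, which additionally exploits the Markov property of Brownian motion to refine intervals adaptively), the following sketch runs a generic UCB-style sampler on a discretized simulated Brownian motion path with Gaussian observation noise. The grid size, budget, and confidence-width formula are illustrative assumptions, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1]; W is one sample path of standard Brownian motion.
n_grid = 256
dt = 1.0 / n_grid
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_grid))])

sigma = 0.5   # noise standard deviation (sigma^2 is the noise variance)
T = 2000      # observation budget

# Hypothetical UCB-style loop: keep running means and counts per grid point,
# and sample the point with the highest upper confidence bound.
sums = np.zeros(n_grid + 1)
counts = np.zeros(n_grid + 1)
cum_regret = 0.0
f_max = W.max()

for t in range(1, T + 1):
    means = np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
    # Width ~ sigma * sqrt(2 log T / n) for visited points; unvisited points
    # get infinite width so each point is tried at least once.
    width = np.where(counts > 0,
                     sigma * np.sqrt(2.0 * np.log(T) / np.maximum(counts, 1)),
                     np.inf)
    i = int(np.argmax(means + width))
    y = W[i] + rng.normal(0.0, sigma)   # noisy observation of the path
    sums[i] += y
    counts[i] += 1
    cum_regret += f_max - W[i]

best = int(np.argmax(np.where(counts > 0, sums / np.maximum(counts, 1), -np.inf)))
print(f"cumulative regret: {cum_regret:.2f}")
print(f"simple regret:     {f_max - W[best]:.4f}")
```

A fixed-grid UCB sampler like this does not attain the rates stated above; the paper's upper bound relies on the Markov property to localize the search, which is what yields the $\mathcal{O}(\sigma \sqrt{T} \cdot \log T)$ cumulative-regret guarantee.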