Minimax Rates of Entropy Estimation on Large Alphabets via Best Polynomial Approximation
Consider the problem of estimating the Shannon entropy of a distribution over k elements from n independent samples. We show that the minimax mean-square error is within universal multiplicative constant factors of $\left(\frac{k}{n \log k}\right)^2 + \frac{\log^2 k}{n}$ if n exceeds a constant …
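For context, the natural baseline is the plug-in estimator, which substitutes empirical frequencies into the entropy formula; the large-alphabet regime studied here is precisely where this baseline is known to be suboptimal, motivating the polynomial-approximation approach. A minimal sketch of the plug-in estimator (an illustration, not the paper's estimator):

```python
from collections import Counter
import math

def plugin_entropy(samples):
    """Plug-in (empirical) Shannon entropy estimate, in nats.

    Replaces each true probability p_i with its empirical
    frequency c_i / n in H(P) = -sum_i p_i log p_i.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Two equiprobable symbols: the estimate recovers log 2 nats.
print(plugin_entropy(["H", "T", "H", "T"]))  # → 0.6931...
```

The plug-in estimator is consistent for fixed alphabet size k, but its bias is on the order of k/n, which dominates when k grows comparably to n; the minimax rate above shows that better estimators remain consistent as long as n grows faster than k/log k.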