Entropy and Information jump for log-concave vectors

We extend the result of Ball and Nguyen on the jump of entropy under convolution for log-concave random vectors. We show that the result holds for any pair of vectors (not necessarily identically distributed) and that a similar inequality holds for the Fisher information, thus providing a quantitative Blachman–Stam inequality.
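For context, the classical Blachman–Stam inequality mentioned above can be stated as follows (a standard formulation; the notation I(·) for Fisher information is assumed here, not taken from the abstract):

```latex
% For independent random vectors X, Y with finite Fisher information
% and any \lambda \in [0,1]:
\[
  I\!\left(\sqrt{\lambda}\,X + \sqrt{1-\lambda}\,Y\right)
  \;\le\; \lambda\, I(X) + (1-\lambda)\, I(Y).
\]
% Equivalently, in reciprocal (superadditive) form:
\[
  \frac{1}{I(X+Y)} \;\ge\; \frac{1}{I(X)} + \frac{1}{I(Y)}.
\]
```

A quantitative version, as described in the abstract, strengthens such an inequality by adding an explicit deficit term.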