Information inequalities and concentration of measure

We derive inequalities of the form $\Delta(P, Q) \leq H(P|R) + H(Q|R)$ which hold for every choice of probability measures $P$, $Q$, $R$, where $H(P|R)$ denotes the relative entropy of $P$ with respect to $R$ and $\Delta(P, Q)$ stands for a coupling-type "distance" between $P$ and $Q$. …
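As one illustrative instance of this form (an assumption for exposition; the paper's actual $\Delta$ may differ), take $\Delta(P, Q) = \|P - Q\|_{TV}^2$, the squared total variation distance. Pinsker's inequality $\|P - R\|_{TV} \leq \sqrt{H(P|R)/2}$, combined with the triangle inequality for $\|\cdot\|_{TV}$ and the elementary bound $(a + b)^2 \leq 2a^2 + 2b^2$, gives

$$\|P - Q\|_{TV}^2 \leq \big(\|P - R\|_{TV} + \|R - Q\|_{TV}\big)^2 \leq 2 \cdot \frac{H(P|R)}{2} + 2 \cdot \frac{H(Q|R)}{2} = H(P|R) + H(Q|R),$$

so squared total variation satisfies an inequality of exactly the stated shape for every choice of $P$, $Q$, $R$.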