Expansion of the Kullback-Leibler Divergence, and a New Class of Information Metrics

Inferring and comparing complex, multivariable probability density functions is fundamental to problems in several fields, including probabilistic learning, network theory, and data analysis. Classification and prediction are the two faces of this class of problems. This study takes an approach that simplifies many aspects of these problems by presenting a …
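For context, the Kullback-Leibler divergence named in the title is, for discrete distributions, D_KL(p ∥ q) = Σᵢ pᵢ log(pᵢ/qᵢ). A minimal sketch of this standard definition (an illustration only, not the expansion or metrics the study proposes):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) for discrete distributions.

    p and q are sequences of probabilities over the same support;
    q must be nonzero wherever p is nonzero. Terms with p_i = 0
    contribute nothing (0 * log 0 is taken as 0 by convention).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: a biased coin measured against a fair coin.
p = [0.7, 0.3]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # small positive value (in nats)
```

Note that D_KL is not symmetric in its arguments, which is one motivation for constructing genuine information *metrics* from it.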