Entropy and Divergence Associated with Power Function and the Statistical Application

In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of many phenomena. In statistics, this entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy with the expected log-likelihood function. Maximum likelihood estimation is supported for its optimal performance, …
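The link stated above between entropy, the Kullback-Leibler divergence, and the expected log-likelihood can be sketched numerically via the standard identity D(p‖q) = −E_p[log q] − H(p), so that maximizing the expected log-likelihood of a model q is equivalent to minimizing its KL divergence from the true distribution p. The discrete distributions below are illustrative only:

```python
import math

def entropy(p):
    """Boltzmann-Shannon entropy H(p) = -sum_i p_i log p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """-E_p[log q]: the negative expected log-likelihood of model q under p."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (not from the paper).
p = [0.5, 0.3, 0.2]   # "true" data-generating distribution
q = [0.4, 0.4, 0.2]   # candidate model distribution

# D(p||q) = -E_p[log q] - H(p): raising the expected log-likelihood
# of q lowers its KL divergence from p, and D(p||q) >= 0 with
# equality only when q = p.
print(kl_divergence(p, q))
print(cross_entropy(p, q) - entropy(p))
```

Running this shows the two printed values agree, confirming the decomposition for this toy case.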