Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group

We investigate the analogy between the renormalization group (RG) and deep neural networks, wherein subsequent layers of neurons are analogous to successive steps along the RG. In particular, we quantify the flow of information by explicitly computing the relative entropy or Kullback-Leibler divergence in both the one- and two-dimensional Ising …
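As a toy illustration of the kind of quantity involved, the sketch below evaluates the relative entropy D_KL(p‖q) = Σ_x p(x) log[p(x)/q(x)] between exact Boltzmann distributions of a small periodic one-dimensional Ising chain at successive couplings along the standard decimation flow K' = arctanh(tanh²K). The chain size, the starting coupling, and the choice to compare full fine-lattice distributions are illustrative assumptions, not the construction used in the paper itself.

```python
import itertools
import numpy as np

def ising_chain_probs(K, n_spins):
    """Exact Boltzmann distribution of a periodic 1D Ising chain at coupling K."""
    configs = np.array(list(itertools.product([-1, 1], repeat=n_spins)))
    # nearest-neighbour interaction energy with periodic boundary conditions
    energies = -(configs * np.roll(configs, -1, axis=1)).sum(axis=1)
    weights = np.exp(-K * energies)
    return weights / weights.sum()

def kl_divergence(p, q):
    """Relative entropy D_KL(p || q) = sum_x p(x) log(p(x)/q(x))."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy run (hypothetical parameters): follow the 1D decimation flow
# K' = arctanh(tanh(K)^2) and track the relative entropy between
# the distributions at successive couplings.
K = 1.0
n_spins = 8
for step in range(4):
    K_next = np.arctanh(np.tanh(K) ** 2)
    p = ising_chain_probs(K, n_spins)
    q = ising_chain_probs(K_next, n_spins)
    print(f"step {step}: K={K:.4f} -> K'={K_next:.4f}, "
          f"D_KL={kl_divergence(p, q):.6f}")
    K = K_next
```

Exhaustive enumeration keeps the example exact but limits it to short chains (2^8 configurations here); it is meant only to show how a relative entropy can be attached to each step of an RG-like flow.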