Training Neural Samplers with Reverse Diffusive KL Divergence

Training generative models to sample from unnormalized density functions is an important and challenging task in machine learning. Traditional training methods often rely on the reverse Kullback-Leibler (KL) divergence due to its tractability. However, the mode-seeking behavior of reverse KL hinders effective approximation of multi-modal target distributions. To address this, …
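For reference, the reverse KL objective the abstract alludes to has a standard form (a sketch in generic notation, not necessarily the paper's; here $\tilde p$ denotes the unnormalized target density, $Z$ its unknown normalizer, and $q_\theta$ the neural sampler):

\[
D_{\mathrm{KL}}\bigl(q_\theta \,\|\, p\bigr)
= \mathbb{E}_{x \sim q_\theta}\bigl[\log q_\theta(x) - \log p(x)\bigr]
= \mathbb{E}_{x \sim q_\theta}\bigl[\log q_\theta(x) - \log \tilde p(x)\bigr] + \log Z .
\]

Because $\log Z$ is constant in $\theta$, the objective can be minimized given only $\tilde p$, which is the tractability the abstract mentions. At the same time, since the expectation is taken under $q_\theta$, placing mass where $p$ is near zero is heavily penalized while dropping entire modes of $p$ is not, which is the mode-seeking behavior that motivates the paper's diffusive variant.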