Minimax Optimal Estimation of KL Divergence for Continuous Distributions

Type: Preprint

Publication Date: 2020-01-01

Citations: 0

DOI: https://doi.org/10.48550/arxiv.2002.11599

Locations

  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Minimax Optimal Estimation of KL Divergence for Continuous Distributions (2020) by Puning Zhao, Lifeng Lai
  • Analysis of K Nearest Neighbor KL Divergence Estimation for Continuous Distributions (2020) by Puning Zhao, Lifeng Lai
  • Estimation of KL Divergence: Optimal Minimax Rate (2018) by Yuheng Bu, Shaofeng Zou, Yingbin Liang, Venugopal V. Veeravalli
  • Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence (2014) by Yung‐Kyun Noh, Masashi Sugiyama, Song Liu, Marthinus Christoffel du Plessis, Frank C. Park, Daniel D. Lee
  • Kullback-Leibler Divergence Estimation of Continuous Distributions (2008) by Fernando Pérez‐Cruz
  • Estimation of KL Divergence: Optimal Minimax Rate (2016) by Yuheng Bu, Shaofeng Zou, Yingbin Liang, Venugopal V. Veeravalli
  • Exact Expressions for Kullback–Leibler Divergence for Univariate Distributions (2024) by Victor Mooto Nawa, Saralees Nadarajah
  • Minimax Rate-optimal Estimation of KL Divergence between Discrete Distributions (2016) by Yanjun Han, Jiantao Jiao, Tsachy Weissman
  • Analysis of k-Nearest Neighbor Distances with Application to Entropy Estimation (2016) by Shashank Singh, Barnabás Póczos
  • Analysis of KNN Information Estimators for Smooth Distributions (2019) by Puning Zhao, Lifeng Lai
  • Analysis of KNN Information Estimators for Smooth Distributions (2018) by Puning Zhao, Lifeng Lai
  • Minimax Estimation of KL Divergence between Discrete Distributions (2016) by Yanjun Han, Jiantao Jiao, Tsachy Weissman
  • Estimation of Kullback–Leibler Divergence by Local Likelihood (2006) by Young Lee, Byeong U. Park
  • Generalized Kullback-Leibler Divergence (2017) by Josip Pečarić, Dora Pokaz
  • Non-Asymptotic Performance Guarantees for Neural Estimation of $\mathsf{f}$-Divergences (2021) by Sreejith Sreekumar, Zhengxin Zhang, Ziv Goldfeld
  • On the Properties of Kullback-Leibler Divergence Between Multivariate Gaussian Distributions (2021) by Yufeng Zhang, Wanwei Liu, Zhenbang Chen, Ji Wang, Kenli Li
  • Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence (2018) by Yung‐Kyun Noh, Masashi Sugiyama, Song Liu, Marthinus Christoffel du Plessis, Frank C. Park, Daniel D. Lee
  • On the Properties of Kullback-Leibler Divergence Between Gaussians (2021) by Yufeng Zhang, Wanwei Liu, Zhenbang Chen, Kenli Li, Ji Wang

Works That Cite This (0)
