Type: Article
Publication Date: 2021-01-01
Citations: 1
DOI: https://doi.org/10.1214/21-ejs1916
Edge-exchangeable probabilistic network models generate edges as an i.i.d. sequence from a discrete measure, providing a simple means for statistical inference of latent network properties. The measure is often constructed using the self-product of a realization from a Bayesian nonparametric (BNP) discrete prior; but unlike in standard BNP models, the self-product measure prior is not conjugate to the likelihood, hindering the development of exact simulation and inference algorithms. Approximation via finite truncation of the discrete measure is a straightforward alternative, but incurs an unknown approximation error. In this paper, we develop methods for forward simulation and posterior inference in random self-product-measure models based on truncation, and provide theoretical guarantees on the quality of the results as a function of the truncation level. The techniques we present are general and extend to the broader class of discrete Bayesian nonparametric models.
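To make the construction concrete, the following is a minimal sketch of forward simulation from a truncated self-product measure. It assumes a Dirichlet process stick-breaking prior truncated at level `K` as the underlying BNP discrete prior; the paper treats more general discrete BNP priors and quantifies the truncation error, which this sketch does not. The function names `truncated_stick_breaking` and `simulate_edges` and the parameters `alpha`, `K`, and `n_edges` are illustrative, not from the paper.

```python
import numpy as np

def truncated_stick_breaking(alpha, K, rng):
    """Draw K weights from a truncated stick-breaking (Dirichlet process) prior.

    Illustrative truncation: the last weight absorbs the leftover stick mass
    so the truncated weights sum to one.
    """
    betas = rng.beta(1.0, alpha, size=K)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    w = betas * remaining
    w[-1] = 1.0 - w[:-1].sum()  # absorb the truncated tail mass
    return w

def simulate_edges(w, n_edges, rng):
    """Generate edges i.i.d. from the normalized self-product measure,
    i.e. edge (i, j) has probability proportional to w[i] * w[j]."""
    K = len(w)
    probs = np.outer(w, w).ravel()
    probs /= probs.sum()
    idx = rng.choice(K * K, size=n_edges, p=probs)
    return np.column_stack(np.unravel_index(idx, (K, K)))

rng = np.random.default_rng(0)
w = truncated_stick_breaking(alpha=2.0, K=50, rng=rng)
edges = simulate_edges(w, n_edges=200, rng=rng)
print(edges[:5])  # first five sampled (sender, receiver) vertex pairs
```

In this sketch the truncation level `K` caps the number of distinct vertices the model can produce; the paper's contribution is to bound the resulting approximation error as a function of `K`, which is not reflected in the code above.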
Title | Year | Authors
---|---|---
Posterior Sampling From Truncated Ferguson-Klass Representation of Normalised Completely Random Measure Mixtures | 2024 | Junyi Zhang, Angelos Dassios