Honest Exploration of Intractable Probability Distributions via Markov Chain Monte Carlo
Two important questions that must be answered whenever a Markov chain Monte Carlo (MCMC) algorithm is used are (Q1) What is an appropriate burn-in? and (Q2) How long should the sampling continue after burn-in? Developing rigorous answers to these questions presently requires a detailed study of the convergence properties of the …
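To make questions (Q1) and (Q2) concrete, the following is a minimal sketch of how both choices enter a plain random-walk Metropolis sampler targeting a standard normal distribution. The function name, target, and the specific `burn_in` and `n_samples` values are illustrative assumptions, not the paper's method; the paper concerns rigorous, theoretically justified answers rather than the ad hoc cutoffs hard-coded here.

```python
import math
import random

def metropolis_standard_normal(n_samples, burn_in, seed=0):
    """Random-walk Metropolis chain targeting N(0, 1).

    burn_in  -- answer to (Q1): how many initial draws to discard
    n_samples -- answer to (Q2): how many post-burn-in draws to keep
    """
    rng = random.Random(seed)
    x = 5.0  # deliberately poor starting point, far from the mode
    kept = []
    for t in range(burn_in + n_samples):
        proposal = x + rng.uniform(-1.0, 1.0)
        # Metropolis acceptance ratio for the unnormalized N(0,1) density
        log_ratio = -0.5 * (proposal * proposal - x * x)
        if math.log(rng.random()) < log_ratio:
            x = proposal
        if t >= burn_in:  # (Q1): discard the first burn_in iterations
            kept.append(x)
    return kept

samples = metropolis_standard_normal(n_samples=20000, burn_in=2000)
post_burn_in_mean = sum(samples) / len(samples)
```

In practice the burn-in and run length above are guesses; the point of the abstract's two questions is that such guesses are not honest answers, and justifying them requires studying the convergence properties of the chain.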