λ-continuous Markov chains

0. Introduction and summary. We call a Markov operator P which has a representation P(x, A) = ∫_A p(x, y) λ(dy), with p(x, y) bivariate measurable, a λ-continuous Markov operator. It is a special kind of λ-measurable Markov operator of E. Hopf. If the state space is discrete, every Markov operator is λ-continuous, where λ …
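
The following is a minimal numerical sketch, not taken from the paper, of a λ-continuous Markov operator on the real line: λ is taken to be Lebesgue measure and p(x, y) a Gaussian transition density centred at x. The function names `p` and `P` and the choice of density are illustrative assumptions only; the sketch just evaluates P(x, A) = ∫_A p(x, y) λ(dy) for an interval A.

```python
# Illustrative sketch only: a Markov kernel given by a density p(x, y)
# with respect to Lebesgue measure, i.e. a λ-continuous Markov operator.
import numpy as np
from scipy.integrate import quad


def p(x: float, y: float) -> float:
    """Transition density p(x, y) w.r.t. Lebesgue measure (Gaussian centred at x)."""
    return np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)


def P(x: float, a: float, b: float) -> float:
    """P(x, A) = ∫_A p(x, y) λ(dy) for the interval A = [a, b]."""
    value, _ = quad(lambda y: p(x, y), a, b)
    return value


if __name__ == "__main__":
    # P(x, ·) is a probability measure: the density integrates to 1 over R.
    total, _ = quad(lambda y: p(0.0, y), -np.inf, np.inf)
    print(f"P(0, R)       ≈ {total:.6f}")          # ≈ 1.000000
    print(f"P(0, [-1, 1]) ≈ {P(0.0, -1.0, 1.0):.6f}")  # ≈ 0.682689
```

In the discrete case mentioned above, the same representation holds trivially with λ a counting-type measure and p(x, y) the entries of the transition matrix, which is why every Markov operator on a discrete state space is of this kind.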