λ-continuous Markov chains
0. Introduction and summary. We call a Markov operator P which has a representation $P(x,A) = \int_A p(x,y)\,\lambda(dy)$ with $p(x,y)$ bivariate measurable a λ-continuous Markov operator. It is a special kind of λ-measurable Markov operator in the sense of E. Hopf. If the state space is discrete, every Markov operator is λ-continuous, where λ …
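The discrete-state remark can be made concrete. A minimal sketch, assuming λ is the counting measure on the state space (the truncated sentence above does not specify this, so it is an assumption), showing that the transition kernel itself serves as the density $p(x,y)$:

```latex
% Sketch under the assumption that \lambda is counting measure on S:
% take p(x,y) := P(x,\{y\}); then integration against \lambda is summation,
% so every Markov operator on a discrete space is \lambda-continuous.
\[
  P(x,A) \;=\; \sum_{y \in A} P(x,\{y\})
         \;=\; \int_A p(x,y)\,\lambda(dy),
  \qquad p(x,y) := P(x,\{y\}).
\]
```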