A Markov chain on a measurable state space is a discrete-time homogeneous Markov chain with a measurable space as state space.
The definition of Markov chains has evolved during the 20th century.
In 1953 the term Markov chain was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob.[2] Since the late 20th century it has become more popular to consider a Markov chain as a stochastic process with a discrete index set, living on a measurable state space.
Denote with $(E, \Sigma)$ a measurable space and with $p$ a Markov kernel with source and target $(E, \Sigma)$. A stochastic process $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ is called a time-homogeneous Markov chain with Markov kernel $p$ and start distribution $\mu$ if

$$\mathbb{P}[X_0 \in A_0, X_1 \in A_1, \dots, X_n \in A_n] = \int_{A_0} \dots \int_{A_{n-1}} p(y_{n-1}, A_n) \, p(y_{n-2}, dy_{n-1}) \dots p(y_0, dy_1) \, \mu(dy_0)$$

holds for any $n \in \mathbb{N}$ and $A_0, \dots, A_n \in \Sigma$. For any Markov kernel $p$ and any probability measure $\mu$ one can construct an associated Markov chain.
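The construction can be made concrete by sampling: given a way to draw from the start distribution and a way to draw from the kernel $p(x, \cdot)$, a path of the chain is generated one step at a time. A minimal sketch (the function names and the Gaussian random-walk kernel are illustrative choices, not from the article):

```python
import random

def simulate_chain(sample_start, sample_kernel, n_steps, rng):
    """Draw one path X_0, ..., X_n of the Markov chain: X_0 from the
    start distribution, then X_{k+1} from the kernel p(X_k, .)."""
    path = [sample_start(rng)]
    for _ in range(n_steps):
        path.append(sample_kernel(path[-1], rng))
    return path

# Illustrative example: a Gaussian random walk on E = R, i.e.
# p(x, .) = N(x, 1), started from the Dirac measure in 0.
path = simulate_chain(
    sample_start=lambda rng: 0.0,
    sample_kernel=lambda x, rng: x + rng.gauss(0.0, 1.0),
    n_steps=100,
    rng=random.Random(0),
)
```

Representing the kernel as a sampling function rather than a set function is the usual trade-off in simulation: it gives up the measure-theoretic generality but makes the step-by-step construction executable.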
For a measure $\mu \colon \Sigma \to [0, \infty]$ and a $\mu$-integrable function $f \colon E \to \mathbb{R} \cup \{-\infty, \infty\}$ we write the Lebesgue integral as

$$\int_E f(x) \, \mu(dx).$$

For the measure $\mu_x \colon \Sigma \to [0, \infty]$ defined by $\mu_x(A) := p(x, A)$ we use the notation

$$\int_E f(y) \, p(x, dy) := \int_E f(y) \, \mu_x(dy).$$
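As a small illustration of this notation, the integral $\int_E f(y) \, p(x, dy)$ can be estimated by Monte Carlo: average $f$ over draws from the measure $\mu_x = p(x, \cdot)$. The Gaussian kernel below is an arbitrary choice for the sketch, not from the article:

```python
import random

def kernel_integral(f, sample_kernel, x, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the Lebesgue integral of f against
    the measure mu_x(A) = p(x, A), i.e. of \\int_E f(y) p(x, dy)."""
    rng = random.Random(seed)
    return sum(f(sample_kernel(x, rng)) for _ in range(n_samples)) / n_samples

# Illustrative kernel p(x, .) = N(x, 1); the mean of mu_x is x, so
# \int_E y p(x, dy) should be close to x = 2.
est = kernel_integral(lambda y: y, lambda x, rng: x + rng.gauss(0.0, 1.0), x=2.0)
```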
If $\mu$ is a Dirac measure in $x$, we denote for a Markov kernel $p$ with starting distribution $\mu$ the associated Markov chain as $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P}_x)$, and write $\mathbb{E}_x$ for the expectation with respect to $\mathbb{P}_x$. By definition we have $\mathbb{P}_x[X_0 = x] = 1$. For any measurable function $f \colon E \to [0, \infty]$ we have the following relation:[4]

$$\int_E f(y) \, p(x, dy) = \mathbb{E}_x[f(X_1)].$$

For a Markov kernel $p$
with starting distribution $\mu$ one can introduce a family of Markov kernels $(p_n)_{n \in \mathbb{N}}$ by $p_1 := p$ and

$$p_{n+1}(x, A) := \int_E p_n(y, A) \, p(x, dy)$$

for $n \geq 1$ and $A \in \Sigma$. For the associated Markov chain $(X_n)_{n \in \mathbb{N}}$ one obtains

$$\mathbb{P}[X_n \in A] = \int_E p_n(x, A) \, \mu(dx).$$

A probability measure $\mu$
is called a stationary measure of a Markov kernel $p$ if

$$\int_A \mu(dx) = \int_E p(x, A) \, \mu(dx)$$

holds for any $A \in \Sigma$.
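On a finite state space the kernel is a row-stochastic matrix $P$, the family $p_n$ consists of its matrix powers, and the stationarity condition reduces to $\mu P = \mu$. A sketch with an arbitrary illustrative two-state kernel (the numbers are not from the article):

```python
# Illustrative two-state kernel as a row-stochastic matrix, together
# with a candidate stationary measure.
P = [[0.9, 0.1],
     [0.3, 0.7]]
mu = [0.75, 0.25]

def apply_kernel(nu, P):
    """(nu P)(A) = sum_x nu(x) P(x, A): the distribution after one step."""
    n = len(nu)
    return [sum(nu[x] * P[x][a] for x in range(n)) for a in range(n)]

def n_step_kernel(P, n):
    """p_n via the recursion p_{n+1}(x, A) = int p_n(y, A) p(x, dy),
    which on a finite space is the matrix power P^n."""
    Q = P
    for _ in range(n - 1):
        Q = [apply_kernel(row, P) for row in Q]
    return Q

mu_next = apply_kernel(mu, P)                 # equals mu: mu is stationary
mu_5 = apply_kernel(mu, n_step_kernel(P, 5))  # still equals mu
```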
If $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ denotes the Markov chain according to a Markov kernel $p$ with stationary measure $\mu$, and the distribution of $X_0$ is $\mu$, then all $X_n$ have the same probability distribution, namely

$$\mathbb{P}[X_n \in A] = \mu(A)$$

for any $A \in \Sigma$.
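This can be checked empirically by simulation: start many independent chains from $X_0 \sim \mu$ and read off the marginal distribution of $X_n$ for some fixed $n$. The two-state kernel below is an illustrative choice, not from the article:

```python
import random

P = [[0.9, 0.1],
     [0.3, 0.7]]
mu = [0.75, 0.25]          # stationary measure for this P
rng = random.Random(0)

def step(x):
    """One transition of the chain from state x according to P."""
    return 0 if rng.random() < P[x][0] else 1

n_paths, n_steps = 200_000, 5
counts = [0, 0]
for _ in range(n_paths):
    x = 0 if rng.random() < mu[0] else 1   # X_0 ~ mu
    for _ in range(n_steps):               # run the chain to X_5
        x = step(x)
    counts[x] += 1

freq = [c / n_paths for c in counts]       # empirical marginal of X_5, close to mu
```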
A Markov kernel $p$ is called reversible according to a probability measure $\mu$ if

$$\int_A p(x, B) \, \mu(dx) = \int_B p(x, A) \, \mu(dx)$$

holds for any $A, B \in \Sigma$. Replacing $A = E$ shows that if $p$ is reversible according to $\mu$, then $\mu$ must be a stationary measure of $p$.
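On a finite state space reversibility is the detailed-balance condition $\mu(x) \, p(x, y) = \mu(y) \, p(y, x)$, and summing it over $x$ recovers the stationarity of $\mu$. A sketch with an illustrative two-state kernel (numbers chosen for the example):

```python
# Illustrative two-state kernel and probability measure.
P = [[0.9, 0.1],
     [0.3, 0.7]]
mu = [0.75, 0.25]
n = len(mu)

# Reversibility (detailed balance): mu(x) p(x, y) == mu(y) p(y, x)
# for all pairs of states.
reversible = all(
    abs(mu[x] * P[x][y] - mu[y] * P[y][x]) < 1e-12
    for x in range(n) for y in range(n)
)

# Summing detailed balance over x gives sum_x mu(x) p(x, y) = mu(y),
# i.e. mu is a stationary measure of P.
mu_next = [sum(mu[x] * P[x][y] for x in range(n)) for y in range(n)]
```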