1. Introduction
Let $(S, \mathscr{B}(S))$ be a Polish (state) space. Consider a (time-homogeneous) Markov chain on $(S, \mathscr{B}(S))$ as a family of probability measures on $S^\infty$. Namely, on the measurable space $(\bar\Omega, \mathscr{F}) = (S^\infty, \mathscr{B}(S^\infty))$ consider a family of probability measures $\{P_s\}_{s \in S}$ such that for the coordinate mappings $X_n(\omega) = \omega_n$, $\omega = (\omega_0, \omega_1, \ldots) \in \bar\Omega$, the process $X \coloneqq \{X_n\}_{n \in \mathbb{Z}_+}$ is a Markov chain, i.e. for all $s \in S$

$$ P_s\big( X_{n + m_1} \in A_1, \ldots, X_{n + m_l} \in A_l \mid \mathscr{F}_n \big) = P_{X_n}\big( X_{m_1} \in A_1, \ldots, X_{m_l} \in A_l \big) \quad P_s\text{-a.s.} $$
Here $A_j \in \mathscr{B}(S)$, $m_j \in \mathbb{N}$, $l \in \mathbb{N}$, and $\mathscr{F}_n = \sigma\{X_1, \ldots, X_n\}$. The space $S$ is separable, hence there exists a transition probability kernel $Q\,:\,S \times \mathscr{B}(S) \to [0,1]$ such that

$$ P_s\big( X_{n+1} \in A \mid \mathscr{F}_n \big) = Q(X_n, A) \quad P_s\text{-a.s.}, \qquad n \in \mathbb{Z}_+, \ A \in \mathscr{B}(S). $$
Consider a transformation of the chain $X$, $Y_n = f(X_n)$, where $f\,:\,S \to \mathbb{R}$ is a Borel-measurable function. Lemma 4.1 in [BDK+18] gives sufficient conditions for $Y = \{Y_n\}_{n \in \mathbb{Z}_+}$ to be a Markov chain. The original, erroneous Lemma 4.1 had the following formulation.
Lemma 4.1 old. Assume that for any bounded Borel function $h\,:\,S \to \mathbb{R}$ and any $s, q \in S$ with $f(s) = f(q)$,

$$ \mathbb{E}_s\, h(X_1) = \mathbb{E}_q\, h(X_1). $$

Then $Y$ is a Markov chain.
Lemma 4.1 and the remark following it in [BDK+18] should instead read as follows.
Lemma 4.1. Assume that for any bounded Borel function $h\,:\,\mathbb{R} \to \mathbb{R}$ and any $s, q \in S$ with $f(s) = f(q)$,

$$ \mathbb{E}_s\, h(f(X_1)) = \mathbb{E}_q\, h(f(X_1)). \tag{1} $$

Then $Y$ is a Markov chain.
Remark 4.1. Condition (1) is the equality of the distributions of $Y_1$ under the two measures $P_s$ and $P_q$ for any $s, q \in S$ with $f(s) = f(q)$.
Note that the statement of Lemma 4.1 old is actually correct in the sense that it is true, but it is very weak. In particular, in the proof of Theorem 2.6 in [BDK+18], Lemma 4.1 is required because Lemma 4.1 old is not sufficient.
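The relation between the two formulations can be made explicit (a sketch, assuming the old condition asks that $\mathbb{E}_s\, h(X_1) = \mathbb{E}_q\, h(X_1)$ for every bounded Borel $h$ on $S$ whenever $f(s) = f(q)$): for any bounded Borel $g\,:\,\mathbb{R} \to \mathbb{R}$ the composition $g \circ f$ is a bounded Borel function on $S$, so

$$ \mathbb{E}_s\, g(f(X_1)) = \mathbb{E}_s\, (g \circ f)(X_1) = \mathbb{E}_q\, (g \circ f)(X_1) = \mathbb{E}_q\, g(f(X_1)), $$

which is condition (1). The converse implication fails in general, since (1) only constrains the pushforward of the law of $X_1$ under $f$, not the full law of $X_1$; this is why the old assumption is more restrictive.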
Proof of Lemma 4.1. For the natural filtrations of the processes $X$ and $Y$ we have the inclusion

$$ \sigma\{Y_1, \ldots, Y_n\} \subset \sigma\{X_1, \ldots, X_n\}, \qquad n \in \mathbb{N}, \tag{2} $$

since $Y$ is a function of $X$. For $n \in \mathbb{N}$ and bounded Borel functions $h_j\,:\,\mathbb{R} \to \mathbb{R}$, $j = 1, 2, \ldots, n$,

$$ \mathbb{E}_s \prod_{j=1}^{n} h_j(Y_j) = \int_S Q(s, \mathrm{d}x_1)\, h_1(f(x_1)) \int_S Q(x_1, \mathrm{d}x_2)\, h_2(f(x_2)) \cdots \int_S Q(x_{n-1}, \mathrm{d}x_n)\, h_n(f(x_n)). $$
To transform the last integral, we introduce a new kernel: for $y \in f(S)$ choose $x \in S$ with $f(x) = y$, and then for $B \in \mathscr{B}(\mathbb{R})$ define

$$ \overline Q(y, B) \coloneqq \int_S Q(x, \mathrm{d}v)\, \mathbb{1}_B(f(v)) = P_x\big(f(X_1) \in B\big). $$

The expression on the right-hand side does not depend on the choice of $x$ because of (1). To make the kernel $\overline Q$ defined on all of $\mathbb{R} \times \mathscr{B}(\mathbb{R})$, we set

$$ \overline Q(y, B) \coloneqq \mathbb{1}_B(y), \qquad y \notin f(S), \ B \in \mathscr{B}(\mathbb{R}). $$
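The independence of the choice of $x$ is condition (1) specialized to indicator functions: for $x, x' \in S$ with $f(x) = f(x') = y$ and $h = \mathbb{1}_B$,

$$ P_x\big(f(X_1) \in B\big) = \mathbb{E}_x\, \mathbb{1}_B(f(X_1)) = \mathbb{E}_{x'}\, \mathbb{1}_B(f(X_1)) = P_{x'}\big(f(X_1) \in B\big). $$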
Then, setting $z_n = f(x_n)$, we obtain from the change of variables formula for the Lebesgue integral that

$$ \int_S Q(x_{n-1}, \mathrm{d}x_n)\, h_n(f(x_n)) = \int_{\mathbb{R}} \overline Q\big(f(x_{n-1}), \mathrm{d}z_n\big)\, h_n(z_n). $$
Likewise, setting $z_{n-1} = f(x_{n-1})$, we get

$$ \int_S Q(x_{n-2}, \mathrm{d}x_{n-1})\, h_{n-1}(f(x_{n-1})) \int_{\mathbb{R}} \overline Q\big(f(x_{n-1}), \mathrm{d}z_n\big)\, h_n(z_n) = \int_{\mathbb{R}} \overline Q\big(f(x_{n-2}), \mathrm{d}z_{n-1}\big)\, h_{n-1}(z_{n-1}) \int_{\mathbb{R}} \overline Q(z_{n-1}, \mathrm{d}z_n)\, h_n(z_n). $$
Proceeding further, we obtain

$$ \mathbb{E}_s \prod_{j=1}^{n} h_j(Y_j) = \int_{\mathbb{R}} \overline Q(z_0, \mathrm{d}z_1)\, h_1(z_1) \int_{\mathbb{R}} \overline Q(z_1, \mathrm{d}z_2)\, h_2(z_2) \cdots \int_{\mathbb{R}} \overline Q(z_{n-1}, \mathrm{d}z_n)\, h_n(z_n), $$

where $z_0 = f(s)$.
Thus,

$$ \mathbb{E}_s \prod_{j=1}^{n} h_j(Y_j) = \mathbb{E}_s \bigg[ \prod_{j=1}^{n-1} h_j(Y_j) \int_{\mathbb{R}} \overline Q(Y_{n-1}, \mathrm{d}z)\, h_n(z) \bigg]. $$

This equality and (2) imply that $Y$ is a Markov chain. □
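The last implication can be spelled out (a sketch, with the kernel $\overline Q$ as above): by a monotone class argument and the inclusion (2), the product identity extends from $\xi = \prod_{j=1}^{n-1} h_j(Y_j)$ to every bounded $\sigma\{Y_1, \ldots, Y_{n-1}\}$-measurable random variable $\xi$, giving

$$ \mathbb{E}_s\big[\xi\, h_n(Y_n)\big] = \mathbb{E}_s\Big[\xi \int_{\mathbb{R}} \overline Q(Y_{n-1}, \mathrm{d}z)\, h_n(z)\Big]. $$

Hence

$$ \mathbb{E}_s\big[h_n(Y_n) \mid Y_1, \ldots, Y_{n-1}\big] = \int_{\mathbb{R}} \overline Q(Y_{n-1}, \mathrm{d}z)\, h_n(z), $$

and the right-hand side is $\sigma\{Y_{n-1}\}$-measurable, which is the Markov property of $Y$.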