Markov Chain

Most of the Markov chain properties and definitions are the same as in Markov Chain; the differences are listed on this page.

When we say an MC is infinite, we mean that its state space is infinite.

Markov Chain State

For states $i, j$, we define $i \to j$ iff $j$ is accessible from state $i$. Additionally, we define $i \leftrightarrow j$ iff $i \to j$ and $j \to i$, and we say $i$ and $j$ communicate with each other.

  • Such a $\leftrightarrow$ is called an equivalence relation, i.e. a relation that is symmetric, reflexive, and transitive.
  • $\exists k, l \in S$ with $\lim_{n \to \infty} p_{kl}^{(n)} = 0 \implies \forall i, j \in S$ s.t. $k \to i$ and $j \to l$, $\lim_{n \to \infty} p_{ij}^{(n)} = 0$.

We define a communication class as a set of states that all communicate with each other. If there exists only one communication class, then the Markov chain is called irreducible. Otherwise, it is called reducible (a small code sketch follows the list below).

  • An irreducible MC either has $\lim_{n \to \infty} p_{ij}^{(n)} = 0$ for all $i, j \in S$, or $\lim_{n \to \infty} p_{ij}^{(n)} \ne 0$ for all $i, j \in S$.
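
As a quick computational illustration (a minimal sketch, not from the course notes; the matrix `P` and the helper name are made up for the example), the communication classes of a finite chain can be read off the reachability relation $i \to j$:

```python
# Minimal sketch: find communication classes of a finite chain and check
# irreducibility. The transition matrix and function name are illustrative.
import numpy as np

def communication_classes(P):
    """Group the states of a finite transition matrix P into communication classes."""
    n = len(P)
    # reach[i, j] = True iff i -> j (j is accessible from i), allowing 0 steps.
    reach = np.eye(n, dtype=bool) | (np.asarray(P) > 0)
    for k in range(n):                       # transitive closure (Floyd-Warshall style)
        reach |= reach[:, [k]] & reach[[k], :]
    classes, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in range(n) if reach[i, j] and reach[j, i]}   # i <-> j
        classes.append(sorted(cls))
        seen |= cls
    return classes

# Example: 0 and 1 communicate, but 2 is absorbing, so the chain is reducible.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.0, 1.0]])
classes = communication_classes(P)
print(classes)                               # [[0, 1], [2]]
print("irreducible:", len(classes) == 1)     # False
```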

In STA347, we use $N(i)$ to denote the number of steps until the first visit to state $i$. In STA447, we use $\tau_i$ for it. Mathematically, $\tau_i = \min\{n \geq 1 : X_n = i\}$.

  • If $P(\tau_i < \infty \mid X_0 = i) = 1$, then state $i$ is recurrent.
    • If the chain can never leave $i$ once it arrives (i.e. $p_{ii} = 1$), then $i$ is called absorbing.
  • If $P(\tau_i < \infty \mid X_0 = i) < 1$, then state $i$ is transient (see the simulation sketch after this list).
  • $\tau_i = \infty$ means the chain never returns to $i$.
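
To see the recurrent/transient distinction concretely, here is a hedged Monte Carlo sketch (everything in it, including the parameters, is illustrative and not from the notes): it estimates the probability of returning to the start within a finite horizon for a random walk on $\mathbb{Z}$ with up-probability $p$. For $p = 0.5$ the walk is recurrent, so the estimate creeps toward $1$ as the horizon grows; for $p = 0.7$ it is transient, so the estimate plateaus strictly below $1$.

```python
# Illustrative Monte Carlo estimate of P(tau_0 <= horizon) for a random walk on Z.
import random

def estimate_return_prob(p, horizon=2000, trials=5000, seed=0):
    rng = random.Random(seed)
    returns = 0
    for _ in range(trials):
        x = 0
        for n in range(1, horizon + 1):
            x += 1 if rng.random() < p else -1
            if x == 0:                 # first return to the start, i.e. tau_0 = n
                returns += 1
                break
    return returns / trials

print(estimate_return_prob(0.5))       # close to 1 (recurrent, but returns are slow)
print(estimate_return_prob(0.7))       # well below 1 (transient; P(tau_0 < infinity) = 0.6 here)
```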

States in the same communication class share the same recurrent/transient property. Moreover, we define:

  • If a chain that starts in a class will eventually leave the class with probability 1, the class is called a transient class.
  • Otherwise, the class is called a recurrent class (i.e. the chain never leaves this class).

Given $|S| < \infty$, there must exist at least one recurrent state $i$. If the chain is also irreducible, then all states are recurrent.
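
For a concrete illustration: in a chain on $\{0, 1, 2, 3\}$ where $0$ and $3$ are absorbing and states $1, 2$ move to each neighbour with probability $1/2$, the class $\{1, 2\}$ is transient (the chain eventually leaves it and gets absorbed), while $\{0\}$ and $\{3\}$ are recurrent (absorbing) classes.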

We sometimes care about the total number of visits to a state $i$. Define $R_i = \sum_{n = 0}^{\infty} I(X_n = i)$ (infinite MC):

  • If $i$ is recurrent, then $R_i = \infty$.
  • Else $P(R_i < \infty) = 1$.
  • The expected value of $R_i$ (given $X_0 = i$) is $E[R_i] = E\left[\sum_{n = 0}^{\infty} I(X_n = i)\right] = \sum_{n = 0}^{\infty} P(X_n = i \mid X_0 = i) = \sum_{n = 0}^{\infty} p_{ii}^{(n)}$ (a numeric check follows this list).
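
A small numeric check of this formula (illustrative only; it reuses the made-up $3 \times 3$ matrix from the earlier sketch): truncate $\sum_n p_{ii}^{(n)}$ at $N$ terms via matrix powers. For a transient state the partial sums settle down; for a recurrent state they keep growing with $N$.

```python
# Truncated estimate of E[R_i] = sum_{n>=0} p_ii^(n) via powers of P (illustrative).
import numpy as np

def expected_visits(P, i, N=500):
    """Return sum_{n=0}^{N} (P^n)_{ii}."""
    P = np.asarray(P, dtype=float)
    total, Pn = 0.0, np.eye(len(P))
    for _ in range(N + 1):
        total += Pn[i, i]
        Pn = Pn @ P
    return total

P = [[0.5, 0.5, 0.0],          # states 0, 1 are transient (mass leaks to state 2),
     [0.2, 0.3, 0.5],          # state 2 is absorbing, hence recurrent
     [0.0, 0.0, 1.0]]
print(expected_visits(P, 0))   # converges to a finite value as N grows
print(expected_visits(P, 2))   # equals N + 1, i.e. diverges with N
```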

Periodicity

First, let's define $J_i = \{n \ge 1 : p_{ii}^{(n)} > 0\}$, the set of all possible times at which the chain can return to $i$. Then we have $d_i = \gcd(J_i)$ as the period (a computational sketch follows the list below).

  • If $d_i = 1$, then $i$ is aperiodic.
  • Else $i$ is periodic.
  • All states in the same communication class share the same period.
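
A possible way to compute $d_i$ for a finite chain (a sketch with made-up names; the cutoff $N$ is an assumption, since $J_i$ is an infinite set):

```python
# Approximate d_i = gcd{ n >= 1 : (P^n)_{ii} > 0 } by scanning powers up to N.
import numpy as np
from math import gcd

def period(P, i, N=100):
    P = np.asarray(P, dtype=float)
    d, Pn = 0, np.eye(len(P))
    for n in range(1, N + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:      # n belongs to J_i
            d = gcd(d, n)         # gcd(0, n) = n, so the first return time initializes d
    return d

# A deterministic 3-cycle returns to each state only at times 3, 6, 9, ..., so d_i = 3.
cycle = [[0, 1, 0],
         [0, 0, 1],
         [1, 0, 0]]
print(period(cycle, 0))           # 3
```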

Hitting

We also define the hitting time of a set $A$ as the first time the chain visits $A$. Mathematically, $H_A = \min\{n \geq 0 : X_n \in A\}$, where $A \subseteq S$.

  • $H_A = \infty$ if $X_n \not\in A$ for all $n$.

We also define the hitting probability as the probability that the chain ever visits $A$ starting from state $i$. Mathematically, $h_{iA} = P(X_n \in A \text{ for some } n \mid X_0 = i) = P(H_A < \infty \mid X_0 = i) = \begin{cases} 1 & \text{if } i \in A \\ \sum_{j\in S} p_{ij} h_{jA} & \text{if } i \not\in A \end{cases}$.

We also have the expected hitting time $\eta_{iA} = E[H_A \mid X_0 = i] = \begin{cases} 0 & \text{if } i \in A \\ 1 + \sum_{j\in S} p_{ij} \eta_{jA} & \text{if } i \not\in A \end{cases}$.
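
For a finite chain these two recursions are just linear systems, so they can be solved directly; here is a minimal sketch (the chain, the set $A$, and the function name are made up for illustration). For an infinite state space, the hitting probabilities and times are instead the minimal non-negative solutions of the same equations.

```python
# Solve the hitting-probability and expected-hitting-time recursions for a finite chain.
import numpy as np

def hitting_quantities(P, A):
    """Return (h, eta) for target set A: h_iA and eta_iA for every state i."""
    P = np.asarray(P, dtype=float)
    n = len(P)
    B = [i for i in range(n) if i not in A]       # states outside A
    Q = P[np.ix_(B, B)]                           # transitions staying outside A
    I = np.eye(len(B))
    h = np.ones(n)                                # h_iA = 1 for i in A
    h[B] = np.linalg.solve(I - Q, P[np.ix_(B, sorted(A))].sum(axis=1))
    eta = np.zeros(n)                             # eta_iA = 0 for i in A
    eta[B] = np.linalg.solve(I - Q, np.ones(len(B)))
    return h, eta

# Gambler's ruin on {0, 1, 2, 3}: 0 and 3 absorbing, fair coin from 1 and 2.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 1.0]]
h, eta = hitting_quantities(P, A={0, 3})
print(h)      # [1. 1. 1. 1.]  absorption in {0, 3} is certain
print(eta)    # [0. 2. 2. 0.]  expected number of steps until absorption
```

If instead $A = \{3\}$ alone, the same equations give the classic ruin probabilities $h = (0, \tfrac{1}{3}, \tfrac{2}{3}, 1)$, but then one must take the minimal non-negative solution: the naive solve above breaks because state $0$ traps the chain forever, which also makes $\eta_{iA}$ infinite from states $0, 1, 2$.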

Markov Chain Type

Just as we classify individual states, we can also classify a whole MC as aperiodic or periodic (as described above), and as transient or recurrent.

An MC $X_n$ is recurrent iff $\forall x \in S, P(X_n = x \text{ for infinitely many } n) = 1$, which means the chain returns to $x$ infinitely many times. Otherwise, it is a transient chain.

We also have the conclusion that an irreducible MC is transient iff the expected number of returns to a state is finite (i.e. $\sum_{n = 0}^{\infty} p_{ii}^{(n)} < \infty$).

Let $z$ be a fixed state, and let $\alpha(x) = P(X_n = z \text{ for some } n \ge 0 \mid X_0 = x)$. An irreducible MC is transient iff $\alpha$ satisfies all of the following (a worked example follows the list):

  • $0 \le \alpha(x) \le 1$ for all $x \in S$.
  • $\alpha(z) = 1$ and $\inf\{\alpha(x) : x \in S\} = 0$.
  • $\alpha(x) = \sum_{y \in S} p_{xy} \alpha(y)$ for all $x \in S$, $x \ne z$.
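
As a sanity check with a standard example (not from the notes): consider the simple random walk on $\mathbb{Z}$ with $p_{x,x+1} = p > 1/2$ and $p_{x,x-1} = 1 - p$, and take $z = 0$. Let $\alpha(x) = \left(\frac{1-p}{p}\right)^x$ for $x \ge 0$ and $\alpha(x) = 1$ for $x < 0$. Then $0 \le \alpha(x) \le 1$, $\alpha(0) = 1$, $\inf_x \alpha(x) = 0$ (as $x \to \infty$), and $\alpha(x) = p\,\alpha(x+1) + (1-p)\,\alpha(x-1)$ for every $x \ne 0$, so the criterion confirms that this drifting walk is transient.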

Positive and Null Recurrence

For an infinite MC, we have two different types of recurrence:

  • We call a chain null recurrent iff it is recurrent and $\lim_{n\to\infty} p_{xy}^{(n)} = 0$ for all $x, y \in S$.
  • A recurrent chain that is not null recurrent is positive recurrent.

Recall $\tau_i = \min\{n \geq 1 : X_n = i\}$, the time of the first visit to state $i$. Then we have:

  • For a positive recurrent chain, $E[\tau_i] < \infty$ for all $i \in S$.
  • For a null recurrent chain, $E[\tau_i] = \infty$ but $P(\tau_i < \infty) = 1$ for all $i \in S$.
  • For a transient chain, $P(\tau_i = \infty) > 0$ for all $i \in S$.

If a chain is not positive recurrent, then it's either null recurrent or transient.
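
Two standard reference examples (not from the notes): any irreducible MC on a finite state space is positive recurrent, while the simple symmetric random walk on $\mathbb{Z}$ is null recurrent: it returns to its starting point with probability $1$, yet the expected return time is infinite, which is why the $p = 0.5$ Monte Carlo estimate above approaches $1$ only slowly.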