Markov Chain
Most Markov chain properties and definitions are the same as in Markov Chain; this page lists the differences.
When we say a MC is infinite, we mean its state space is infinite.
Markov Chain State
Let states $i, j \in S$. We define $i \to j$ iff $j$ is accessible from state $i$, i.e. $p_{ij}^{(n)} > 0$ for some $n \ge 0$. Additionally, we define $i \leftrightarrow j$ iff $i \to j$ and $j \to i$, and we say $i$ and $j$ can communicate with each other.
- Such $\leftrightarrow$ is called an equivalence relation, which is a relation that is symmetric, reflexive, and transitive.
- $i \leftrightarrow j$ and $j \leftrightarrow k$ imply $i \leftrightarrow k$.
We define a communication class as a set of states that all communicate with each other. If there exists only one communication class, then the Markov chain is called irreducible. Otherwise, it is called reducible.
- In an irreducible MC, either every state is recurrent or every state is transient (see below).
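For a finite chain these notions can be checked mechanically. Below is a minimal sketch (the function names and the example matrix are my own, not from the course): communication classes are exactly the mutually-reachable groups in the transition graph, found via transitive closure.

```python
import numpy as np

# A small reducible chain on {0,1,2,3}: {0,1} communicate, {2,3}
# communicate, and 1 -> 2 is possible but 2 can never return to 1.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.3, 0.4, 0.3, 0.0],
    [0.0, 0.0, 0.6, 0.4],
    [0.0, 0.0, 0.2, 0.8],
])

def communication_classes(P):
    """Group states into communication classes: i <-> j iff each is
    accessible from the other (reachability in the transition graph)."""
    n = len(P)
    # reach[i, j] = True iff j is accessible from i in some n >= 0 steps
    reach = (P > 0) | np.eye(n, dtype=bool)
    for k in range(n):  # Warshall-style transitive closure
        reach |= reach[:, [k]] & reach[[k], :]
    classes = []
    for i in range(n):
        cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
        if cls not in classes:
            classes.append(cls)
    return classes

def is_irreducible(P):
    return len(communication_classes(P)) == 1
```

Here `communication_classes(P)` returns the two classes `{0, 1}` and `{2, 3}`, so the chain is reducible.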
In STA347, we use $T_i$ to represent the number of steps taken to visit state $i$ for the first time; in STA447 we use the same notation. Mathematically, $T_i = \inf\{n \ge 1 : X_n = i\}$.
- If $P(T_i < \infty \mid X_0 = i) = 1$, then $i$ is recurrent.
- If the chain can never leave $i$ (i.e. $p_{ii} = 1$), then $i$ is called absorbing.
- If $P(T_i < \infty \mid X_0 = i) < 1$, then $i$ is transient.
- $T_i = \infty$ means the chain never returns to $i$.
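To make the return probability concrete, here is a small Monte Carlo sketch (the two-state chain and function name are my own illustration): with $p_{00} = p_{01} = 0.5$ and state 1 absorbing, the chain returns to 0 iff its very first step stays at 0, so the exact return probability is $0.5$ and state 0 is transient.

```python
import numpy as np

rng = np.random.default_rng(0)

# State 1 is absorbing, so from state 0 the chain returns to 0
# only if the very first step stays at 0 (probability 0.5).
P = np.array([[0.5, 0.5],
              [0.0, 1.0]])

def returned(rng, P, i, max_steps=50):
    """Simulate one path from i; report whether it revisits i within
    max_steps steps (a finite-horizon proxy for the event T_i < inf)."""
    state = i
    for _ in range(max_steps):
        state = rng.choice(len(P), p=P[state])
        if state == i:
            return True
    return False

trials = 4_000
estimate = sum(returned(rng, P, 0) for _ in range(trials)) / trials
# estimate should be close to the exact return probability 0.5
```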
States in the same communication class share the same recurrent/transient property. Moreover, we define:
- If a chain that starts in a class eventually leaves this class with probability 1, the class is called a transient class.
- Otherwise, the class is called a recurrent class (i.e. once the chain enters this class, it never leaves).
Given a finite state space ($|S| < \infty$), there must exist at least one recurrent state. If the chain is also irreducible, then all states are recurrent.
We sometimes care about the total number of visits to a state $i$. Let us define (also for an infinite MC) $N_i = \sum_{n=1}^{\infty} \mathbf{1}\{X_n = i\}$:
- If $i$ is recurrent, then $P(N_i = \infty \mid X_0 = i) = 1$.
- Else $P(N_i < \infty \mid X_0 = i) = 1$.
- The expected value of $N_i$ is $E[N_i \mid X_0 = i] = \sum_{n=1}^{\infty} p_{ii}^{(n)}$.
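A quick numerical check of the expected-visits formula (both example chains are my own): with state 1 absorbing and $p_{00} = 0.5$ we get $p_{00}^{(n)} = 0.5^n$, so $E[N_0 \mid X_0 = 0] = \sum_{n \ge 1} 0.5^n = 1$; for a recurrent state the partial sums diverge.

```python
import numpy as np

def expected_visits(P, i, n_max=200):
    """Partial sum of p_ii^(n) for n = 1..n_max, approximating
    E[N_i | X_0 = i] = sum_{n>=1} (P^n)_{ii}."""
    total, Pn = 0.0, np.eye(len(P))
    for _ in range(n_max):
        Pn = Pn @ P              # Pn holds the next power of P
        total += Pn[i, i]
    return total

# Transient case: state 1 absorbing, p_00^(n) = 0.5^n sums to 1.
P_transient = np.array([[0.5, 0.5],
                        [0.0, 1.0]])

# Recurrent case: irreducible two-state chain with P^n = P for n >= 1,
# so p_00^(n) = 0.5 for every n and the series diverges linearly.
P_recurrent = np.array([[0.5, 0.5],
                        [0.5, 0.5]])
```

For the transient chain the partial sum is essentially exactly 1; for the recurrent chain it grows like $n/2$.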
Periodicity
First, let's define $J(i) = \{n \ge 1 : p_{ii}^{(n)} > 0\}$, the set of all possible times at which the chain can return to $i$. Then we have $d(i) = \gcd J(i)$ as the period.
- If $d(i) = 1$, then $i$ is aperiodic.
- Else ($d(i) \ge 2$), $i$ is periodic.
- All states in the same communication class share the same period.
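The period of a state in a finite chain can be read off from matrix powers; a sketch (function name and example chains are mine), taking the running gcd over the return times found up to a horizon:

```python
import numpy as np
from math import gcd

def period(P, i, n_max=50):
    """d(i) = gcd of { n >= 1 : p_ii^(n) > 0 }, scanning n up to n_max."""
    d, Pn = 0, np.eye(len(P))
    for n in range(1, n_max + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            d = gcd(d, n)        # gcd(0, n) == n starts the running gcd
    return d

cycle3 = np.array([[0., 1., 0.],   # deterministic 3-cycle: period 3
                   [0., 0., 1.],
                   [1., 0., 0.]])
lazy = np.array([[0.5, 0.5],       # self-loop at state 0: aperiodic
                 [1.0, 0.0]])
```

Here `period(cycle3, 0)` gives 3 and `period(lazy, 0)` gives 1. This is a heuristic for truncated horizons, but once two coprime return times have been seen, the gcd is exact.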
Hitting
We also define the hitting time of a set of states $A \subseteq S$ as the first time the chain visits $A$. Mathematically, $H_A = \inf\{n \ge 0 : X_n \in A\}$, where $\inf \emptyset = \infty$.
- $H_A = \infty$ if the chain never visits $A$.
We also define the hitting probability as the probability that the chain ever visits $A$ starting from a state $i$. Mathematically, $h_i^A = P(H_A < \infty \mid X_0 = i)$.
We also have the expected hitting time $k_i^A = E[H_A \mid X_0 = i]$.
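For a finite chain, both quantities solve linear systems by first-step analysis: $h_i^A = 1$ on $A$ and $h_i^A = \sum_j p_{ij} h_j^A$ off $A$; $k_i^A = 0$ on $A$ and $k_i^A = 1 + \sum_j p_{ij} k_j^A$ off $A$. A sketch on a fair gambler's ruin chain (the example and function names are my own):

```python
import numpy as np

# Fair gambler's ruin on {0,...,4}; states 0 and 4 are absorbing.
N = 4
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[N, N] = 1.0
for i in range(1, N):
    P[i, i - 1] = P[i, i + 1] = 0.5

def hit_before(P, A, B):
    """h_i = P(hit A before B | X_0 = i): h = 1 on A, h = 0 on B,
    and h_i = sum_j p_ij h_j on the remaining states."""
    n = len(P)
    interior = [i for i in range(n) if i not in A and i not in B]
    M = np.eye(len(interior)) - P[np.ix_(interior, interior)]
    b = P[np.ix_(interior, sorted(A))].sum(axis=1)
    h = np.zeros(n)
    h[sorted(A)] = 1.0
    h[interior] = np.linalg.solve(M, b)
    return h

def expected_hitting_time(P, A):
    """k_i = E[H_A | X_0 = i]: k = 0 on A and
    k_i = 1 + sum_j p_ij k_j elsewhere (finite when A is hit a.s.)."""
    n = len(P)
    rest = [i for i in range(n) if i not in A]
    M = np.eye(len(rest)) - P[np.ix_(rest, rest)]
    k = np.zeros(n)
    k[rest] = np.linalg.solve(M, np.ones(len(rest)))
    return k

h = hit_before(P, {N}, {0})           # h_i = i/4 for the fair walk
k = expected_hitting_time(P, {0, N})  # k_i = i * (4 - i)
```

The solver reproduces the classical answers: the probability of reaching 4 before 0 from $i$ is $i/4$, and the expected absorption time from $i$ is $i(4-i)$.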
Markov Chain Type
As with individual states, we can also classify an entire MC as aperiodic or periodic (as described above), and as transient or recurrent.
A MC is recurrent iff $P(N_i = \infty \mid X_0 = i) = 1$ for every state $i$, which means the chain returns to each state infinitely many times. Else, it is a transient chain.
We also have the conclusion that an irreducible MC is transient iff the expected number of returns to a state is finite (i.e. $E[N_i \mid X_0 = i] = \sum_{n=1}^{\infty} p_{ii}^{(n)} < \infty$).
Let $k \in S$ be a fixed state, and consider numbers $y_i$ for each $i \neq k$. An irreducible MC is transient iff some such $\{y_i\}$ satisfies the following:
- $y_i = \sum_{j \neq k} p_{ij} y_j$ for all $i \neq k$.
- $y_i \neq 0$ for some $i \neq k$.
- $|y_i| \le 1$ for all $i \neq k$.
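As a worked example of this criterion (the random walk here is my own illustration, not from the notes): take a simple random walk on $\{0, 1, 2, \dots\}$ with $p_{i,i+1} = p$, $p_{i,i-1} = q = 1 - p$ for $i \ge 1$, with $p > 1/2$, and fix $k = 0$. The candidate $y_i = 1 - (q/p)^i$ satisfies the equations:

```latex
\begin{align*}
\text{for } i \ge 2:\quad
p\,y_{i+1} + q\,y_{i-1}
  &= (p + q) - p\left(\tfrac{q}{p}\right)^{i+1} - q\left(\tfrac{q}{p}\right)^{i-1}
   = 1 - \left(\tfrac{q}{p}\right)^{i}(q + p) = y_i, \\
\text{for } i = 1:\quad
p\,y_2 = p\left(1 - \tfrac{q^2}{p^2}\right)
  &= \frac{(p - q)(p + q)}{p} = 1 - \tfrac{q}{p} = y_1.
\end{align*}
```

Since $0 < q/p < 1$, every $y_i$ is nonzero with $|y_i| \le 1$, so the walk is transient when $p > 1/2$ (in fact $y_i = P(T_0 = \infty \mid X_0 = i)$ by the gambler's ruin formula).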
Positive and Null Recurrence
For an infinite MC, we have two different types of recurrence:
- We call a chain null recurrent iff it is recurrent and $E[T_i \mid X_0 = i] = \infty$ for all $i$.
- Else (recurrent with $E[T_i \mid X_0 = i] < \infty$), it is positive recurrent.
Recall $T_i = \inf\{n \ge 1 : X_n = i\}$, the time of the first visit (return) to state $i$. Then we have:
- For a positive recurrent chain, $P(T_i < \infty \mid X_0 = i) = 1$ and $E[T_i \mid X_0 = i] < \infty$ for all $i$.
- For a null recurrent chain, $P(T_i < \infty \mid X_0 = i) = 1$ but $E[T_i \mid X_0 = i] = \infty$ for all $i$.
- For a transient chain, $P(T_i < \infty \mid X_0 = i) < 1$ for all $i$.
If a chain is not positive recurrent, then it's either null recurrent or transient.
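One way to see positive recurrence quantitatively (a standard fact, though not stated above): an irreducible positive recurrent chain has a stationary distribution $\pi$, and the mean return time is $E[T_i \mid X_0 = i] = 1/\pi_i$. A numerical sketch on a two-state chain (example mine; every irreducible finite chain is positive recurrent):

```python
import numpy as np

# Irreducible two-state chain.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: solve pi P = pi together with sum(pi) = 1
# as an (overdetermined but consistent) least-squares system.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]

# Mean return time to 0 by first-step analysis: from state 1 the wait
# to reach 0 is geometric with success probability p_10, mean 1/p_10.
m0 = 1.0 + P[0, 1] * (1.0 / P[1, 0])

# m0 should equal 1 / pi[0]  (here pi[0] = 0.8, so both are 1.25)
```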