Recall the martingales we learned in STA 347. In this section, we review the definition of a martingale and some of its properties.
Introduction and Definition
Martingales are a model for stochastic processes that "stay the same on average", i.e., a model of a fair game.
That is, a sequence $X_n$ with finite expectation is a martingale if $E[X_{n+1} \mid X_0, X_1, \dots, X_n] = X_n$ for all $n \ge 0$; equivalently, for a Markov chain with state space $S$ and transition probabilities $p_{ij}$, $\sum_{j \in S} j\, p_{ij} = i$ for all $i \in S$.
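As a quick illustration of the Markov-chain form of the condition, here is a minimal sketch of my own (Python, assuming numpy is available): the gambler's-ruin chain on $\{0, 1, \dots, N\}$, with absorbing endpoints and $\pm 1$ steps with probability $1/2$ in the interior, satisfies $\sum_{j \in S} j\, p_{ij} = i$ and is therefore a martingale. The chain and $N = 5$ are illustrative choices, not from the notes.

```python
import numpy as np

N = 5
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0           # absorbing at 0
P[N, N] = 1.0           # absorbing at N
for i in range(1, N):   # interior states step +/-1 with probability 1/2
    P[i, i - 1] = 0.5
    P[i, i + 1] = 0.5

states = np.arange(N + 1)
# Martingale condition for a Markov chain: sum_j j * p_ij = i for every i.
print(P @ states)                        # -> [0. 1. 2. 3. 4. 5.]
print(np.allclose(P @ states, states))   # -> True
```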
Properties
Let $X_n$ be a martingale.
Double-Expectation Formula: $E[X_n] = E\big[E[X_n \mid X_0, \dots, X_{n-1}]\big] = E[X_{n-1}]$. Applying this repeatedly gives $E[X_n] = E[X_0]$ for all $n$. But $E[X_n] = E[X_0]$ for all $n$ is not a sufficient condition for $X_n$ to be a martingale, and even for a martingale the identity can fail when $n$ is replaced by a random time $T$. For example:
Given a simple symmetric random walk with $X_0 = 0$, we have $E[X_{n+1} \mid X_0, \dots, X_n] = X_n$, so it is a martingale. But let $T = \inf\{n : X_n = 1\}$; the walk is recurrent, so $P(T < \infty) = 1$ and $X_T = 1$ with probability 1. Then $E[X_T] = \sum_{s \in S} s\, P(X_T = s) = 1 \neq 0 = E[X_0]$.
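Here is a simulation sketch of this example (my own, in plain Python; the cap `max_steps` is an artificial truncation so every run terminates):

```python
import random

def stopped_walk(max_steps=10_000):
    """Walk from X0 = 0 until first hitting 1, or until the step cap."""
    x = 0
    for _ in range(max_steps):
        x += random.choice((-1, 1))
        if x == 1:               # T = inf{n : X_n = 1} reached
            return x, True
    return x, False              # capped: T > max_steps on this run

random.seed(0)
runs = [stopped_walk() for _ in range(2_000)]
hit = [x for x, done in runs if done]

# Among paths that reach 1 before the cap, X_T is always 1, so the
# sample mean of X_T is 1, not E[X_0] = 0.
print(sum(hit) / len(hit))                     # -> 1.0
# In contrast, the mean of X_{T ∧ max_steps} stays roughly 0: the few
# capped paths sit at large negative values and balance the +1 hits.
print(sum(x for x, _ in runs) / len(runs))     # ~ 0
```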
Stopping Time
Since we want to discuss whether $E[X_T] = E[X_0]$ still holds when $T$ may depend on the process itself, we define: a non-negative integer-valued random variable $T$ is a stopping time for $X_n$ if the event $\{T = n\}$ is determined by $X_0, X_1, \dots, X_n$ (i.e., $I_{T=n}$ is a function of $X_0, X_1, \dots, X_n$).
- $P(T = \infty) > 0 \implies X_T$ is not always defined, so we always assume $P(T < \infty) = 1$.
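To make the "$\{T = n\}$ is determined by $X_0, \dots, X_n$" requirement concrete, here is a small sketch of my own (the helper names are hypothetical): a first hitting time can be computed as the path is revealed, while the time of the path's maximum cannot.

```python
def first_hit(path, level):
    """T = inf{n : x_n = level}: whether {T = n} holds is decided by
    x_0, ..., x_n alone, so T is a stopping time."""
    for n, x in enumerate(path):
        if x == level:
            return n
    return None   # stands in for T = infinity on this finite path

def time_of_max(path):
    """argmax_n x_n: deciding {T = n} requires seeing the *whole* path
    (the future), so this random time is NOT a stopping time."""
    return max(range(len(path)), key=lambda n: path[n])

path = [0, 1, 0, -1, 0, 1, 2, 1]
print(first_hit(path, 1))    # -> 1, known as soon as x_1 is observed
print(time_of_max(path))     # -> 6, only knowable after the full path
```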
Optional Stopping Theorem
In the random walk example above, $T$ is a stopping time for $X_n$, yet $E[X_T] \neq E[X_0]$. What conditions on $T$ guarantee $E[X_T] = E[X_0]$? The Optional Stopping Theorem answers this: if $X_n$ is a martingale and $T$ is a stopping time with $E[|X_T|] < \infty$ and $\lim_{n \to \infty} E[X_n I_{T>n}] = 0$, then $E[X_T] = E[X_0]$. In particular, these conditions hold whenever $T$ is bounded, i.e. $\exists M < \infty$ with $P(T \le M) = 1$.
Also, if $X_n$ is a martingale and $\exists M < \infty$ such that $P(|X_n| I_{n \le T} \le M) = 1$ for all $n$ and $P(T < \infty) = 1$, then $E[X_T] = E[X_0]$.
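Here is a simulation sketch of this corollary (my own example, Python with numpy; the levels $a, b$ are illustrative choices): start a simple symmetric random walk at $0$ and stop at the first exit time $T$ from $(-a, b)$. Then $|X_n| I_{n \le T} \le \max(a, b)$ and $P(T < \infty) = 1$, so $E[X_T] = E[X_0] = 0$, which forces $P(X_T = b) = a/(a+b)$.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, trials = 3, 7, 20_000

def exit_value():
    """Run the walk from 0 until it leaves (-a, b); return X_T."""
    x = 0
    while -a < x < b:
        x += rng.choice((-1, 1))
    return x

samples = np.array([exit_value() for _ in range(trials)])
print(samples.mean())            # ~ 0 = E[X_0], as optional stopping predicts
print((samples == b).mean())     # ~ a / (a + b) = 0.3
```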
Martingale Convergence Theorem
Any martingale $X_n$ such that $X_n \ge c$ for all $n$ (or $X_n \le c$ for all $n$) for some $c \in \mathbb{R}$ converges w.p. 1 to some random variable $X$.
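To illustrate, here is a small sketch of my own (Python with numpy) using a nonnegative martingale: $X_{n+1} = X_n U_{n+1}$ with $U_{n+1} \in \{1/2, 3/2\}$ equally likely, so $E[U] = 1$ and $X_n \ge 0$. The theorem guarantees almost-sure convergence; for this particular martingale the limit is $0$, even though $E[X_n] = 1$ for every $n$.

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_paths = 2_000, 5

# X_{n+1} = X_n * U, with U = 1/2 or 3/2 equally likely (E[U] = 1),
# so X_n is a nonnegative martingale started at X_0 = 1.
X = np.ones(n_paths)
for _ in range(n_steps):
    X *= rng.choice((0.5, 1.5), size=n_paths)

# Every path has converged to (numerically) 0, while E[X_n] = 1 for all n:
# almost-sure convergence holds, convergence of the means does not.
print(X)
```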
Branching Process
A branching process is a stochastic model for population growth. Consider a population of individuals, and let $X_n$ denote the number of individuals at time $n$.
We make two assumptions about the reproduction process:
- Each individual produces offspring according to the same probability distribution $\{p_k\}_{k \ge 0}$, where $p_k$ is the probability that an individual produces exactly $k$ offspring (so $\sum_{k=0}^{\infty} p_k = 1$).
- The individuals reproduce independently.
Furthermore, if $X_n = k$, let $Y_1, \dots, Y_k$ denote the offspring counts of the $k$ individuals: independent random variables with common distribution $P(Y_i = j) = p_j$ and mean $\mu$. Then:
- $p_{kj} = P(X_{n+1} = j \mid X_n = k) = P(Y_1 + \cdots + Y_k = j)$.
- $E[X_{n+1} \mid X_n = k] = E[Y_1 + \cdots + Y_k] = k\mu$.
- Further, $E[X_n] = \sum_{k=0}^{\infty} P(X_{n-1} = k)\, E[X_n \mid X_{n-1} = k] = \sum_{k=0}^{\infty} k\mu\, P(X_{n-1} = k) = \mu E[X_{n-1}]$, and iterating gives $E[X_n] = \mu^n E[X_0]$.
From the above, we can see that the behavior of $X_n$ depends on $\mu$. In particular, when $\mu = 1$, $E[X_{n+1} \mid X_0, \dots, X_n] = X_n$, so the branching process is itself a martingale.
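As a sanity check on $E[X_n] = \mu^n E[X_0]$ in the critical case $\mu = 1$, here is a simulation sketch (Python with numpy; the offspring distribution $p = (0.25, 0.5, 0.25)$ is my own illustrative choice, giving $\mu = 1$). The mean population stays at $X_0$ even as many individual lines go extinct.

```python
import numpy as np

rng = np.random.default_rng(2)
p = np.array([0.25, 0.5, 0.25])           # offspring distribution, mean mu = 1
mu = np.dot(np.arange(len(p)), p)

def step(k):
    """One generation: X_{n+1} = Y_1 + ... + Y_k with Y_i i.i.d. ~ p."""
    return rng.choice(len(p), size=k, p=p).sum() if k > 0 else 0

n, paths, X0 = 10, 10_000, 1
X = np.full(paths, X0)
for _ in range(n):
    X = np.array([step(k) for k in X])

# E[X_n] = mu**n * E[X_0]; here mu = 1, so the sample mean stays near 1
# even though a large fraction of the paths have already died out.
print(mu, X.mean(), (X == 0).mean())
```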