
Basic Properties of Probabilities

We use $\Omega$ or $S$ to denote the probability (sample) space, and $\mathbb{X}$ to denote some other generic space.

$\sigma$-algebra

Let $\Omega$ be some abstract space and let $\mathcal{F} \subseteq \wp(\Omega)$. $\mathcal{F}$ is a $\sigma$-algebra if it satisfies the following properties:

  1. $\emptyset \in \mathcal{F}$ and $\Omega \in \mathcal{F}$
  2. $A \in \mathcal{F} \implies A^c \in \mathcal{F}$ (closed under complements)
  3. $\mathcal{F}$ is closed under countable unions (and hence, by De Morgan's laws, under countable intersections)
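On a finite space these axioms can be checked mechanically, since countable unions reduce to finite ones. A minimal sketch in Python (the function name `is_sigma_algebra` and the example collection are illustrative choices, not from the notes):

```python
def is_sigma_algebra(F, omega):
    """Check the sigma-algebra axioms for a finite space.

    F     -- set of frozensets (the candidate sigma-algebra)
    omega -- the underlying finite set Omega
    On a finite space, countable unions reduce to finite unions.
    """
    if frozenset() not in F or frozenset(omega) not in F:
        return False                       # needs both the empty set and Omega
    if any(frozenset(omega - A) not in F for A in F):
        return False                       # closed under complements
    if any(A | B not in F for A in F for B in F):
        return False                       # closed under (finite) unions
    return True

omega = {1, 2, 3, 4}
F = {frozenset(), frozenset({1, 2}), frozenset({3, 4}), frozenset(omega)}
print(is_sigma_algebra(F, omega))                      # → True
print(is_sigma_algebra(F | {frozenset({1})}, omega))   # adding {1} breaks it → False
```

Adding `{1}` alone fails the check because its complement `{2, 3, 4}` is missing, which is exactly the closure-under-complements axiom.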

A useful property of $\sigma$-algebras:

  • The intersection of $\sigma$-algebras is still a $\sigma$-algebra, but a union of $\sigma$-algebras need not be one

Measure and Probability

Let $\mu: \mathcal{F} \to [0, \infty]$ be a function. $\mu$ is a measure if:

  • $\mu(\emptyset) = 0$
  • (countable additivity) for every sequence $(A_n)_{n \in \mathbb{N}} \subseteq \mathcal{F}$ with $A_i \cap A_j = \emptyset$ for $i \ne j$, $\mu\left(\bigcup_{n \in \mathbb{N}} A_n\right) = \sum_{n \in \mathbb{N}} \mu(A_n)$

A measure $\mu: \mathcal{F} \to [0, \infty]$ is a probability $\iff \mu(\Omega) = 1$

That is, a probability is a measure, and we call it a probability measure (written $P$ or $\mathbb{P}$). Let $A \subseteq S$ be an arbitrary event; then a probability measure has the following properties:

  • $0 \le P(A) \le 1$, with $P(A) \in \mathbb{R}$
  • $P(\emptyset) = 0$, $P(S) = 1$
  • $P$ is countably additive: if $A_1, A_2, \ldots$ are pairwise disjoint and $A = A_1 \cup A_2 \cup \ldots$, then $P(A) = P(A_1) + P(A_2) + \cdots$
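These properties are easy to verify on a concrete discrete example. The sketch below models a fair six-sided die as a probability space; the uniform point masses $p(\omega) = 1/6$ are the only assumption:

```python
from fractions import Fraction

# A fair six-sided die: S = {1, ..., 6}, each outcome has mass 1/6.
p = {w: Fraction(1, 6) for w in range(1, 7)}
S = set(p)

def P(event):
    """Probability of an event, i.e. a subset of S."""
    return sum(p[w] for w in event)

A1, A2 = {1, 2}, {3}                   # two disjoint events
print(P(S))                            # P(S) = 1
print(P(A1 | A2) == P(A1) + P(A2))     # additivity on disjoint events → True
```

Using `Fraction` keeps the arithmetic exact, so the equalities hold without floating-point tolerance.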

Measure Space and Probability Space

Let $\mathcal{F} \subseteq \wp(\mathbb{X})$ be a $\sigma$-algebra and let $\mu$ be a measure on $\mathcal{F}$.

  • $(\mathbb{X}, \mathcal{F})$ is a measurable space
  • $(\mathbb{X}, \mathcal{F}, \mu)$ is a measure space
  • If $\mu(\mathbb{X}) = 1$, we denote $\mathbb{X}$ as $\Omega$ and $\mu$ as $\mathbb{P}$, and the measure space becomes a probability space $(\Omega, \mathcal{F}, \mathbb{P})$

If $A \in \mathcal{F}$ with $\mu(A) = 0$, then any $N \subseteq A$ is called a null set. A statement holds $\mu$-almost surely (or $\mu$-almost everywhere) if the set on which it fails is a null set.

Properties of Measure Space

Let $(\mathbb{X}, \mathcal{F}, \mu)$ be a measure space. Then for any $A, B, A_1, A_2, \ldots \in \mathcal{F}$:

  • (monotonicity) $A \subseteq B \implies \mu(A) \le \mu(B)$
  • (countable subadditivity) $A \subseteq \bigcup_{n \in \mathbb{N}} A_n \implies \mu(A) \le \sum_{n \in \mathbb{N}} \mu(A_n)$
  • (continuity from below) if the sequence $(A_n)_{n \in \mathbb{N}}$ increases to $A$, then $\lim_{n \to \infty} \mathbb{P}(A_n) = \mathbb{P}(A)$
  • $\mathbb{P}(A) + \mathbb{P}(A^c) = 1$
  • (continuity from above) if $(A_n)_{n \in \mathbb{N}}$ decreases to $A$, then $\lim_{n \to \infty} \mathbb{P}(A_n) = \mathbb{P}(A)$; for a general measure $\mu$ this requires $\mu(A_1) < \infty$, which always holds for a probability measure
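Continuity from below can be seen numerically. Assuming the measure $\mathbb{P}(\{k\}) = 2^{-k}$ on $\Omega = \{1, 2, \ldots\}$ (an illustrative choice, not from the notes), the events $A_n = \{1, \ldots, n\}$ increase to $\Omega$ and $\mathbb{P}(A_n) = 1 - 2^{-n} \to 1 = \mathbb{P}(\Omega)$:

```python
# P({k}) = 2**-k on Omega = {1, 2, ...}; A_n = {1, ..., n} increases to Omega.
def P_An(n):
    """P(A_n), computed by summing the point masses; equals 1 - 2**-n."""
    return sum(2.0 ** -k for k in range(1, n + 1))

for n in (1, 5, 20):
    print(n, P_An(n))   # approaches P(Omega) = 1 as n grows
```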

Properties of Probability

Continuity of Probability: Let $\mathcal{F} \subseteq \wp(\Omega)$ be a $\sigma$-algebra. Suppose $(A_n)_{n \in \mathbb{N}} \subseteq \mathcal{F}$ and $\lim_{n \to \infty} A_n = A$. Then $A \in \mathcal{F}$ and $\lim_{n \to \infty} \mathbb{P}(A_n) = \mathbb{P}(A)$

In probability, there are three kinds of probabilities we need to calculate: the marginal probability $P(A)$, the joint probability $P(A \cap B)$, and the conditional probability $P(A \mid B)$.

  • Joint probability $=$ marginal probability $\times$ conditional probability, where $P(A \cap B) = P(B \mid A) P(A)$
  • Conditioning only changes a probability in the dependent case: $A \perp B \iff P(B) = P(B \mid A)$ (assuming $P(A) > 0$), and equivalently $A \perp B \iff P(A \cap B) = P(A) P(B)$
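For instance, with two fair dice (a hypothetical example; the events $A$ and $B$ below are illustrative choices), the product rule for independent events can be checked exactly:

```python
from fractions import Fraction
from itertools import product

# Two fair dice: S = {1,...,6}^2 with the uniform probability measure.
S = list(product(range(1, 7), repeat=2))

def P(event):
    """Probability of an event given as a predicate on outcomes."""
    return Fraction(sum(1 for w in S if event(w)), len(S))

A = lambda w: w[0] == 6                    # first die shows a 6
B = lambda w: w[1] % 2 == 0                # second die is even
AB = lambda w: A(w) and B(w)               # the joint event A ∩ B

print(P(AB) == P(A) * P(B))                # A ⊥ B: True
print(P(AB) / P(B) == P(A))                # hence P(A|B) = P(A): True
```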

Let $A, B, C \subseteq S$ where $B \cup C = S$ and $B \cap C = \emptyset$. Then $A = A \cap S = A \cap (B \cup C)$. Taking probabilities, $P(A) = P(A \cap B) + P(A \cap C) = P(A \mid B) P(B) + P(A \mid C) P(C)$

  • From this we conclude the law of total probability: if $(B_i)_{i \ge 1}$ are pairwise disjoint and $\bigcup_{i \ge 1} B_i = S$, then $P(A) = \sum_{i \ge 1} P(A \mid B_i) P(B_i)$
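A quick numerical check of the law of total probability, with hypothetical weights for a three-set partition $\{B_1, B_2, B_3\}$ of $S$:

```python
from fractions import Fraction

# Hypothetical partition weights P(B_i) and conditionals P(A|B_i).
P_B = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]          # sums to 1
P_A_given_B = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 1)]

# P(A) = sum_i P(A|B_i) P(B_i)
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
print(P_A)   # → 11/24
```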

(Bayes' Theorem): $P(A \cap B) = P(A \mid B) P(B) = P(B \mid A) P(A)$, so that $P(B \mid A) = \frac{P(A \mid B) P(B)}{P(A)}$ and vice versa. More generally, using $P(A) = \sum_{i \ge 1} P(A \mid B_i) P(B_i)$ for a partition $(B_i)_{i \ge 1}$ of $S$, we get $P(B_j \mid A) = \frac{P(A \mid B_j) P(B_j)}{\sum_{i \ge 1} P(A \mid B_i) P(B_i)}$
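As a worked example (the numbers are hypothetical), take the partition $\{B, B^c\}$ with $P(B) = 1/100$, $P(A \mid B) = 95/100$, and $P(A \mid B^c) = 5/100$; Bayes' theorem with a total-probability denominator gives:

```python
from fractions import Fraction

P_B = Fraction(1, 100)            # prior P(B)
P_A_given_B = Fraction(95, 100)   # P(A|B)
P_A_given_Bc = Fraction(5, 100)   # P(A|B^c)

# Denominator by the law of total probability over {B, B^c}.
P_A = P_A_given_B * P_B + P_A_given_Bc * (1 - P_B)

# Bayes' theorem: P(B|A) = P(A|B) P(B) / P(A)
P_B_given_A = P_A_given_B * P_B / P_A
print(P_B_given_A)   # → 19/118
```

Note that $P(B \mid A) \approx 0.16$ despite the large $P(A \mid B)$, because the prior $P(B)$ is small; this is exactly the effect the denominator captures.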

Independence implies uncorrelatedness, but the converse is false.