📄️ Basic Properties of Probabilities
We use $\Omega$ or $S$ to denote the probability (sample) space, and blackboard-bold symbols to denote other spaces;
📄️ Markov Chain
Most Markov chain properties and definitions are the same as on the Markov Chain page; the differences are listed on this page.
📄️ Simple Random Walk
Let the state space be $\mathcal{S} = \Z$. A process $\{X_t\}$ moves forward with probability $p$ and backward with probability $1-p$ (i.e. $p_{i, i+1} = p$ and $p_{i, i-1} = 1-p$). Such a process is called a simple random walk.
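A minimal Python sketch of such a walk (the function name and parameters are illustrative, not from the notes):

```python
import random

def simple_random_walk(p=0.5, steps=100, start=0):
    """Simulate a simple random walk on the integers.

    At each step the walk moves +1 with probability p and -1 with
    probability 1 - p, starting from `start`.
    """
    path = [start]
    for _ in range(steps):
        step = 1 if random.random() < p else -1
        path.append(path[-1] + step)
    return path

# Example: an asymmetric walk (p = 0.6) drifts upward on average.
print(simple_random_walk(p=0.6, steps=20))
```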
📄️ Markov Chain Behavior
As we learn more about Markov chains (especially infinite ones), we would like to understand the long-run behavior of the chain. One behavior of interest is the stationary distribution.
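For reference, the standard defining equation (a general fact, not specific to these notes) for a stationary distribution $\pi$ of a chain with transition probabilities $p_{ij}$ is

$$\pi_j = \sum_i \pi_i \, p_{ij} \quad \text{for all } j, \qquad \text{i.e., } \pi = \pi P.$$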
📄️ Random Variable
A random variable is a function from the sample space $S$ to the set of all real numbers $\R$; we denote such a function by $X(s): S \to \R$.
📄️ Expected Values
Expectation is a location measure that gives the center of a random variable's distribution.
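As a reminder, the standard formulas (general facts, not specific to these notes) are

$$E(X) = \sum_x x \, P(X = x) \ \text{(discrete)}, \qquad E(X) = \int_{-\infty}^{\infty} x \, f_X(x)\, dx \ \text{(continuous)}.$$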
📄️ Markov Chain Monte Carlo
Markov chain Monte Carlo (MCMC) is a methodology that use Markov sequences to efficiently model otherwise intractable distributions. That is, Given a probability distribution $\pi$, the goal of MCMC is to simulate a random variable $X$ whose distribution is $\pi$. Such distribution may continuous or discrete even though we assume it's discrete.
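A minimal sketch of the idea, using a random-walk Metropolis sampler for a discrete target on $\{0, \ldots, K-1\}$ (the names and the $\pm 1$ proposal are assumptions made for this example, not the construction used in the notes):

```python
import random

def metropolis_discrete(pi, steps=10_000, start=0):
    """Random-walk Metropolis sampler for a discrete target distribution.

    pi: list of (possibly unnormalized) positive target weights pi[0..K-1].
    Proposal: move to a uniformly chosen neighbor (current +/- 1);
    out-of-range proposals are always rejected, so the chain stays put.
    """
    K = len(pi)
    x = start
    samples = []
    for _ in range(steps):
        proposal = x + random.choice([-1, 1])
        if 0 <= proposal < K:
            # Accept with probability min(1, pi(proposal) / pi(x)).
            if random.random() < min(1.0, pi[proposal] / pi[x]):
                x = proposal
        samples.append(x)
    return samples

# Example: sample from an unnormalized target on {0, 1, 2, 3}.
target = [1, 2, 4, 1]
draws = metropolis_discrete(target, steps=50_000)
print([draws.count(k) / len(draws) for k in range(len(target))])
```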
📄️ Martingales
Recall the martingales we learned in STA 347. In this section, we review the definition of a martingale and some of its properties.
📄️ Random Variables Convergence
In analysis courses, we study the convergence of sequences, the convergence of series, and so on. Probability, as a branch of mathematics, is also rooted in analysis, so we can likewise talk about the convergence of random variables.
📄️ Change of Variable
Discrete
📄️ Continuous Processes
In the previous sections, we discussed discrete-time processes. We now say a continuous-time stochastic process $X_t$ has the **Markov property** if $P(X_t = y \mid X_r, 0 \le r \le s) = P(X_t = y \mid X_s)$ for all $t \ge s$, and is **time-homogeneous** if $P(X_{t+s} = y \mid X_s = x) = P(X_t = y \mid X_0 = x)$.
📄️ Distributions
Discrete Distributions
📄️ Inequality
(BOOLE'S INEQUALITY): $P\left(\bigcup_{k=1}^{\infty} A_k\right) \le \sum_{k=1}^{\infty} P(A_k)$
📄️ Stochastic Process
If a collection of random variables represents a process that proceeds randomly in time, it is called a stochastic process.
📄️ Markov Chain
Markov chain theorems are concerned with what happens in the long run.
📄️ Monte Carlo Approximation
Let $X_1, X_2, \ldots$ be a sequence of i.i.d. random variables with mean $\mu$, and let $M_n = \frac{1}{n} \sum_{i=1}^n X_i$; the LLN tells us that $M_n \approx \mu$ as $n \to \infty$. If we do not know $\mu$, we can use $M_n$ as an estimator or approximation of $\mu$. This is called Monte Carlo approximation.
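A minimal Python sketch of the idea (the function name and the choice of distribution are illustrative assumptions):

```python
import random

def monte_carlo_mean(sample, n=100_000):
    """Approximate E[X] by the sample average M_n of n i.i.d. draws."""
    return sum(sample() for _ in range(n)) / n

# Example: X ~ Exponential(rate = 2), so the true mean is 1/2 = 0.5.
estimate = monte_carlo_mean(lambda: random.expovariate(2.0))
print(estimate)  # close to 0.5 for large n
```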
📄️ Review
Real analysis
📄️ STA347 Probability
Instructor: Mohammad Kaviul Khan
📄️ STA447 Stochastic Processes
Instructor: Omidali (Omid) Aghababaei Jazi