
Recurrences

Review of asymptotic notation:

$f = O(g)$ (Big-O): $\exists k \in \R_{>0},(\exists n_0 \in \N(\forall n\in \N. (n > n_0\implies f(n)\le kg(n))))$

  • we can use big-O to bound a function from above by a common reference function: for large enough inputs, $f(n)$ is at most some constant $k$ times $g(n)$
  • intuitively, $f \le g$ up to a constant factor

$f = \Omega(g)$ (Big-Omega): $\exists k \in \R_{>0},(\exists n_0 \in \N(\forall n\in \N. (n > n_0\implies f(n)\ge kg(n))))$

  • we can use big-Omega to bound a function from below by a common reference function: for large enough inputs, $f(n)$ is at least some constant $k$ times $g(n)$
  • intuitively, $f \ge g$ up to a constant factor

$f = \Theta(g)$ (Big-Theta): $\exists k_1,k_2 \in \R_{>0},(\exists n_0 \in \N(\forall n\in \N. (n > n_0\implies k_1g(n)\le f(n)\le k_2g(n))))$

  • for large enough inputs, $f$ is sandwiched between $k_1 g$ and $k_2 g$ for some constants $k_1, k_2$
  • $f = O(g) \land f = \Omega(g) \iff f = \Theta(g)$
  • intuitively, $f \approx g$

$f = o(g)$ (Little-o): $\forall k \in \R_{>0},(\exists n_0 \in \N(\forall n\in \N. (n > n_0\implies f(n) < kg(n))))$

$f = \omega(g)$ (Little-omega): $\forall k \in \R_{>0},(\exists n_0 \in \N(\forall n\in \N. (n > n_0\implies f(n) > kg(n))))$
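As a quick illustration of the Big-O definition (the functions and constants here are illustrative, not from the notes): $3n^2 + 5n = O(n^2)$, since we can take $k = 4$ and $n_0 = 5$; for every $n > 5$ we have $5n \le n^2$, hence $3n^2 + 5n \le 4n^2 = k \cdot n^2$.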

In running-time analysis, $\log$ always stands for $\log_2$, and we always use $\varphi$ to denote the golden ratio.

$1 < \log(n) < n^{0.001} < n < n\log(n) < n^{1.001} < n^{1000} < 1.001^{n} < 2^n$ as $n \to \infty$
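A small sanity check of this chain (illustrative only, not from the notes): comparing the natural logarithms of each term at a single very large input $n = 2^{100000}$ avoids ever materializing the huge values; the chosen $n$ and the code are assumptions of mine.

```python
from decimal import Decimal, getcontext

getcontext().prec = 60

m = 100_000                      # n = 2**m; an arbitrary "large enough" choice
n = Decimal(2) ** m
ln2 = Decimal(2).ln()
ln_n = m * ln2                   # ln(n)

# ln of every term in the chain; comparing logs keeps all numbers manageable
terms = [
    ("1",          Decimal(0)),
    ("log2(n)",    (ln_n / ln2).ln()),
    ("n^0.001",    Decimal("0.001") * ln_n),
    ("n",          ln_n),
    ("n log2(n)",  ln_n + (ln_n / ln2).ln()),
    ("n^1.001",    Decimal("1.001") * ln_n),
    ("n^1000",     Decimal(1000) * ln_n),
    ("1.001^n",    n * Decimal("1.001").ln()),
    ("2^n",        n * ln2),
]

# The logarithms should be strictly increasing down the list.
for (name_a, a), (name_b, b) in zip(terms, terms[1:]):
    assert a < b, (name_a, name_b)
print("ordering holds at n = 2**", m)
```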

For a recurrence, we can guess a solution of the form $r^n$ to judge the time complexity. For example, Fibonacci:

  • Assume the running time satisfies $P(k+1) = P(k) + P(k-1)$ and guess that $P(k)$ grows like $r^k$, so the three terms cost $r^{k+1}$, $r^k$, and $r^{k-1}$
  • then we need $r^{k+1} \ge r^k + r^{k-1}\implies r^{k+1} - r^k - r^{k-1}\ge 0 \implies r^{2} - r - 1 \ge 0$; the positive root of $r^2 - r - 1 = 0$ is the golden ratio $\varphi$, so the smallest valid $r$ is $\varphi$ and the cost grows as $\Theta(\varphi^n)$ (checked numerically in the sketch below)
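A minimal numerical sketch of this step (the helper `calls` and the sample inputs are my own, for illustration): the positive root of $r^2 - r - 1 = 0$ is $\varphi$, and the call count of a naive recursive Fibonacci indeed grows by roughly a factor of $\varphi$ per step.

```python
import math

# Positive root of the characteristic equation r**2 - r - 1 = 0 is phi.
phi = (1 + math.sqrt(5)) / 2
assert abs(phi**2 - phi - 1) < 1e-9

def calls(n):
    """Total number of calls made by a naive recursive Fibonacci on input n."""
    return 1 if n < 2 else 1 + calls(n - 1) + calls(n - 2)

# The ratio of consecutive call counts approaches phi ~ 1.618.
for n in (18, 22, 26):
    print(n, calls(n + 1) / calls(n), phi)
```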

To prove $T = \Theta(f)$:

First prove $T = O(f)$

  1. Remove all the floors and ceilings from the recurrence $T$
  2. Make a guess for $f$ such that $T(n) = O(f(n))$
  3. Write out the recurrence: $T(n) = \ldots$
  4. Whenever $T(k)$ appears on the RHS of the recurrence, substitute it with $cf(k)$
  5. Try to prove $T(n) \le cf(n)$
  6. Pick $c$ to make your analysis work

Then use the same method to prove $T = \Omega(f)$
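As a worked instance of these steps (using the standard recurrence $T(n) = 2T(n/2) + n$ as an illustration, not an example from these notes): guess $f(n) = n\log(n)$, write out the recurrence, and substitute $cf(k)$ for each $T(k)$ on the RHS:

$T(n) \le 2 \cdot c\frac{n}{2}\log(n/2) + n = cn\log(n) - cn + n \le cn\log(n)$

which holds for any $c \ge 1$, so $T(n) = O(n\log(n))$. The matching $\Omega(n\log(n))$ bound follows from the same substitution with $\ge$ and a small enough $c$.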


A recurrence is in standard form if it is written as $T(n) = aT(n/b) + f(n)$ for some constants $a \ge 1$, $b > 1$, and some function $f: \N \to \R$

  • $a$: the branching factor of the tree (how many children each node has)
  • $b$: the reduction factor (how much smaller each subproblem is than its parent)
  • $f(n)$: the time complexity of the non-recursive work
  • The height of the tree is $\log_b(n)$
  • The number of vertices at level $h$ is $a^h$
  • The total non-recursive work done at level $h$ is $a^h f(n/b^h)$
    • Root work: $f(n)$
    • Leaf work: $a^{\log_b(n)} f(1) = \Theta(n^{\log_b(a)})$
  • Summing up the levels, the total amount of work done is $\sum_{h=0}^{\log_b(n)} a^h f(n/b^h)$ (see the sketch below)
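A minimal sketch checking this level-by-level accounting against directly unrolling the recurrence, under the illustrative choices $a = 2$, $b = 2$, $f(n) = n$ (the parameters and helper names are assumptions of mine, not from the notes):

```python
# Mergesort-like instance of the standard form T(n) = a*T(n/b) + f(n).
a, b = 2, 2
f = lambda n: n

def T(n):
    """Directly unroll T(n) = a*T(n/b) + f(n), with T(1) = f(1)."""
    return f(1) if n <= 1 else a * T(n // b) + f(n)

def height(n):
    """floor(log_b(n)), computed exactly with integer division."""
    h = 0
    while n > 1:
        n //= b
        h += 1
    return h

def level_sum(n):
    """Sum of a**h * f(n / b**h) over levels h = 0 .. log_b(n)."""
    return sum(a**h * f(n // b**h) for h in range(height(n) + 1))

# For powers of b the two computations agree exactly.
for n in (2**4, 2**8, 2**12):
    print(n, T(n), level_sum(n))
```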

We usually use the Master Theorem to solve recurrences in this standard form. Let $T(n) = aT(n/b) + f(n)$; there are three cases, determined by comparing $f$ to the leaf work:

  1. Leaf heavy: $f(n) = O(n^{\log_b(a) - \epsilon})$ for some constant $\epsilon > 0$

  2. Balanced: $f(n) = \Theta(n^{\log_b(a)})$

  3. Root heavy: $f(n) = \Omega(n^{\log_b(a) + \epsilon})$ for some constant $\epsilon > 0$, and $af(n/b) \le cf(n)$ for some constant $c < 1$ for all sufficiently large $n$ (Regularity Condition)

Combining these cases, we have: $T(n) = \begin{cases} \Theta(n^{\log_b(a)}) & \text{Leaf heavy case} \\ \Theta(n^{\log_b(a)}\log(n)) & \text{Balanced case} \\ \Theta(f(n)) & \text{Root heavy case} \end{cases}$

Based on this, we:

  1. Write the recurrence in standard form to find the parameters $a, b, f$
  2. Compare $n^{\log_b(a)}$ to $f$ to determine which case applies
  3. Read off the asymptotics from the relevant case
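For example (standard textbook recurrences, used here purely as illustrations):

  • $T(n) = 7T(n/2) + n^2$: $n^{\log_2(7)} \approx n^{2.81}$ dominates $f(n) = n^2$, so this is leaf heavy and $T(n) = \Theta(n^{\log_2(7)})$
  • $T(n) = T(n/2) + 1$: $n^{\log_2(1)} = 1 = \Theta(f(n))$, so this is balanced and $T(n) = \Theta(\log(n))$
  • $T(n) = 3T(n/4) + n$: $f(n) = n = \Omega(n^{\log_4(3) + \epsilon})$ and $3f(n/4) = \frac{3}{4}n \le cf(n)$ with $c = 3/4 < 1$, so this is root heavy and $T(n) = \Theta(n)$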

Simplified Master Theorem: let $f = n^c$. Then $T(n) = \begin{cases}\Theta(n^{\log_b(a)}) & a > b^c \\ \Theta(n^c \log(n)) & a = b^c \\ \Theta(n^c) & a < b^c \end{cases}$
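A minimal sketch of this simplified case split (the function name `simplified_master` and the sample calls are illustrative assumptions, not from the notes):

```python
def simplified_master(a: float, b: float, c: float) -> str:
    """Asymptotics of T(n) = a*T(n/b) + n**c via the simplified Master Theorem."""
    if a > b**c:                      # leaf heavy
        return f"Theta(n^log_{b}({a}))"
    if a == b**c:                     # balanced
        return f"Theta(n^{c} log n)"
    return f"Theta(n^{c})"            # root heavy

print(simplified_master(2, 2, 1))   # mergesort-like: Theta(n^1 log n)
print(simplified_master(8, 2, 2))   # a > b^c: Theta(n^log_2(8)) = Theta(n^3)
print(simplified_master(1, 2, 1))   # a < b^c: Theta(n^1)
```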