Common Functions, Recurrences




CS146

Chris Pollett

Feb 17, 2014

Outline

Introduction

Comparing Functions

Our asymptotic notations enjoy the following useful properties, which we will use when analyzing algorithms:

Transitivity
Let `H(f(n))` represent one of the notations `Theta(f(n))`, `O(f(n))`, `o(f(n))`, `Omega(f(n))`, `omega(f(n))`. Then: `f(n) = H(g(n))` and `g(n) = H(h(n))` implies `f(n) = H(h(n))`.
Reflexivity
Let `H(f(n))` represent one of the notations `Theta(f(n))`, `O(f(n))`, `Omega(f(n))`. Then: `f(n) = H(f(n))`
Symmetry
`f(n) = Theta(g(n))` if and only if `g(n) = Theta(f(n))`
Transpose Symmetry
`f(n) = O(g(n))` if and only if `g(n) = Omega(f(n))`
`f(n) = o(g(n))` if and only if `g(n) = omega(f(n))`
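As an illustrative sketch (not a proof), we can sample ratios numerically to see transpose symmetry in action. The functions `f(n) = 3n + 10` and `g(n) = n^2` below are arbitrary examples chosen for this demo:

```python
# Sample f(n)/g(n) at increasing n. A bounded ratio is consistent with
# f(n) = O(g(n)); by transpose symmetry the same data is consistent with
# g(n) = Omega(f(n)). (A ratio tending to 0 further suggests f = o(g),
# equivalently g = omega(f).)
f = lambda n: 3 * n + 10   # linear
g = lambda n: n * n        # quadratic

ratios = [f(n) / g(n) for n in (10, 100, 1000, 10000)]
print(ratios)

# Bounded by the constant c = 1 for all n >= 10, and strictly decreasing.
assert all(r <= 1 for r in ratios)
assert ratios == sorted(ratios, reverse=True)
```

Of course sampling can only suggest an asymptotic relationship; proving one requires exhibiting the constants `c` and `n_0` from the definitions.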

Standard Notations

Monotonicity
`f(n)` is monotonically increasing if `m le n` implies `f(m) le f(n)`. It is monotonically decreasing if `m le n` implies `f(m) ge f(n)`. If `le` and `ge` can be replaced with `<` and `>` above, then we say, respectively, that `f` is strictly increasing or strictly decreasing.
Floors and ceilings
For `x in RR`, we denote the greatest integer less than or equal to `x` by `lfloor x rfloor` and the least integer greater than or equal to `x` by `|~ x ~|`. These functions obey:
`x-1 < lfloor x rfloor le x le |~x ~| < x+1`
`|~n/2~| + lfloor n/2 rfloor = n`.
For any real `x ge 0` and integers `a, b > 0`, `|~(|~x/a~|)/(b)~| = |~x/(ab)~|` and `lfloor(lfloor x/a rfloor)/(b)rfloor = lfloor x/(ab)rfloor`.
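The floor and ceiling identities above can be spot-checked in Python (a sanity check on sample values, not a proof):

```python
import math

# ceil(n/2) + floor(n/2) = n for every integer n
for n in range(0, 20):
    assert math.ceil(n / 2) + math.floor(n / 2) == n

# Nested-division identities with sample values x = 100, a = 3, b = 4:
# ceil(ceil(x/a)/b) = ceil(x/(ab)) and floor(floor(x/a)/b) = floor(x/(ab))
x, a, b = 100, 3, 4
assert math.ceil(math.ceil(x / a) / b) == math.ceil(x / (a * b))
assert math.floor(math.floor(x / a) / b) == math.floor(x / (a * b))
```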
Modular Arithmetic
For `a in ZZ, n in NN, n >0`, the value of `a mod n` is the remainder (or residue) of the quotient `a/n`:
`a mod n = a - n lfloor a/n rfloor`. If `(a mod n) = (b mod n)`, we write `a -= b (mod n)` and say that `a` is equivalent to `b`, modulo `n`.
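The defining formula `a mod n = a - n lfloor a/n rfloor` can be coded directly; as it happens, Python's built-in `%` operator also uses floored division, so the two agree (a quick sketch with sample values):

```python
import math

def mod(a, n):
    """a mod n defined as a - n*floor(a/n), per the slide."""
    return a - n * math.floor(a / n)

assert mod(17, 5) == 17 % 5 == 2
assert mod(-7, 3) == -7 % 3 == 2   # floored division gives a result in [0, n)

# 17 -= 2 (mod 5): both leave the same remainder on division by 5
assert mod(17, 5) == mod(2, 5)
```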

Quiz (Sec 5)

Which of the following statements is true?

  1. It is impossible to come up with a loop invariant for the MERGE operation.
  2. Insertion Sort was our first example of a divide and conquer algorithm.
  3. If `f(n) in Theta(g(n))` then `f(n) in O(g(n))`

Quiz (Sec 6)

Which of the following statements is true?

  1. It is impossible to come up with a loop invariant for the MERGE operation.
  2. The recurrence equation for Merge Sort has two cases: One corresponds to the base case, the other corresponds to the sum of the costs of dividing into subproblems, conquering subproblems, and combining the results.
  3. If `f(n) in O(g(n))` then `f(n) in Theta(g(n))`

Common Functions

Polynomials
Let `d in NN`. A polynomial of degree `d` is a function `p(n)` of the form: `p(n) = sum_(i = 0)^d a_i n^i`
where the coefficients `a_0, a_1, ..., a_d` come from some field (usually `RR`) and `a_d ne 0`. The polynomial is asymptotically positive if `a_d > 0`. We say that a function `f(n)` is polynomially bounded if `f(n) = O(n^k)` for some constant `k`.
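A polynomial `sum_(i=0)^d a_i n^i` can be evaluated with `d` multiplications and `d` additions using Horner's rule, a standard technique (a minimal sketch; the coefficient list is an example):

```python
def poly(coeffs, n):
    """Evaluate p(n) = sum_i a_i * n^i by Horner's rule.
    coeffs = [a_0, a_1, ..., a_d], lowest degree first."""
    result = 0
    for a in reversed(coeffs):     # a_d, a_(d-1), ..., a_0
        result = result * n + a
    return result

# p(n) = 2 + 3n + n^2: degree 2, a_d = 1 > 0, so asymptotically positive
assert poly([2, 3, 1], 10) == 2 + 3 * 10 + 10 ** 2 == 132
```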
Exponentials
For `a > 0` and `m, n in RR`, the following hold:
`a^0 = 1,`
`a^1 = a,`
`a^(-1) = 1/a,`
`(a^m)^n = a^(mn) = (a^n)^m`,
`a^ma^n = a^(m+n)`.
We define `0^0 = 1`. For any real constants `a, b` such that `a > 1`,
`lim_(n->infty)(n^b)/(a^n) = 0`.
So `n^b = o(a^n)`. Recall that Euler's number `e` is `2.71828...` and that the Taylor series for `e^x` is given by `e^x = 1 + x +(x^2)/(2!) + ... = sum_(i=0)^(infty)(x^i)/(i!)`.
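Both facts above are easy to observe numerically. The sketch below picks `b = 3` and `a = 2` as sample constants, and sums the Taylor series at `x = 1` to approximate `e`:

```python
import math

# n^b / a^n -> 0 as n -> infty: sample with b = 3, a = 2.
vals = [n ** 3 / 2 ** n for n in (10, 20, 40, 80)]
assert all(vals[i] > vals[i + 1] for i in range(len(vals) - 1))  # decreasing
assert vals[-1] < 1e-15                                          # heading to 0

# Partial sums of e^x = sum_i x^i / i! at x = 1 converge to e.
x, s, term = 1.0, 0.0, 1.0
for i in range(1, 20):
    s += term
    term *= x / i          # maintain term = x^i / i! incrementally
assert abs(s - math.e) < 1e-12
```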

More Common Functions

Logarithms
The book uses `lg n` for `log_2 n`; we will use `log n` for `log_2 n` and `ln n` for `log_e n`. Finally, we write `log^k n` to mean `(log n)^k` (so `log^2 n = log n times log n`) and `log log n` or `log^{(2)}n` for `log(log n)`. In general, `log^{(1)}n = log n` and `log^{(k+1)}n = log(log^{(k)}n)`. Some properties of logarithms:
`a = b^(log_b a)`
`log_c(ab) = log_c a + log_c b`
`log_b a^n = n log_b a`
`log_b a = (log_c a)/(log_c b)`
`log_b (1/a) = - log_b a`
`log_b a = 1/(log_a b)`
`a^(log_b c) = c^(log_b a)`
The Taylor series for `ln (1+ x)` when `|x| < 1` is
`ln(1 + x ) = x - (x^2)/2 + (x^3)/3 - (x^4)/4 +...`
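The logarithm identities can be spot-checked with Python's `math.log` (which takes an optional base argument); the values of `a`, `b`, `c` below are arbitrary samples:

```python
import math

a, b, c = 8.0, 2.0, 5.0

assert math.isclose(b ** math.log(a, b), a)                  # a = b^(log_b a)
assert math.isclose(math.log(a * c, 2),
                    math.log(a, 2) + math.log(c, 2))         # log of a product
assert math.isclose(math.log(a ** 3, b), 3 * math.log(a, b)) # log_b a^n = n log_b a
assert math.isclose(math.log(a, b),
                    math.log(a, c) / math.log(b, c))         # change of base
assert math.isclose(a ** math.log(c, b),
                    c ** math.log(a, b))                     # a^(log_b c) = c^(log_b a)
```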
Factorials
We define the factorial function as
`n! = {(1, if n=0),(n cdot(n-1)!,if n > 0):}`
Stirling's Approximation says
`n! = sqrt(2 pi n)(n/e)^n(1 +Theta(1/n))`
One can check
`n! = o(n^n)`
`n! = omega(2^n)`
`log (n!) = Theta(n log n)`
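Stirling's Approximation is easy to test: since the relative error is `Theta(1/n)`, the ratio `n!` to the leading term `sqrt(2 pi n)(n/e)^n` should approach 1 from above (a quick numerical sketch):

```python
import math

def stirling(n):
    """Leading term of Stirling's approximation: sqrt(2 pi n) * (n/e)^n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 10, 20):
    ratio = math.factorial(n) / stirling(n)
    # The actual relative error is roughly 1/(12n), consistent with Theta(1/n).
    assert 1 < ratio < 1 + 1 / (6 * n)
```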

Iterating Functions

Divide and Conquer Recurrences

Three Methods to Solve Recurrences

Substitution Method
We guess a bound and then use mathematical induction to prove our guess was correct.
Recursion-tree Method
We convert the recurrence into a tree whose nodes represent the costs incurred at various levels of the recursion. We use techniques for bounding summations to solve the recurrence.
Master Method
We use a theorem called the master theorem which gives bounds for recurrences of the form
`T(n) = a T(n/b) + f(n)`.
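Before applying any of these methods, a recurrence can simply be evaluated numerically to sanity-check a guessed bound. The sketch below takes the Merge Sort recurrence `T(n) = 2T(n/2) + n` with `T(1) = 1` (here `a = b = 2`, `f(n) = n`) and, restricted to powers of 2, compares it with the closed form `n log n + n`, which agrees with the `Theta(n log n)` the master theorem predicts:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2 T(n/2) + n with base case T(1) = 1 (n a power of 2)."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^k one can check by induction that T(n) = n log n + n exactly.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * math.log2(n) + n
```

Memoizing with `lru_cache` is not needed for correctness here; it just keeps repeated subproblem evaluations cheap when experimenting with larger recurrences.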