CS255
Chris Pollett
Jan 31, 2022
To do probabilistic analysis and to understand randomized algorithms, we need to know a little probability -- so let's review.
Lemma. Given a sample space `S` and an event `A subseteq S`, let `X_A = I{A}` be the indicator random variable for `A`. Then `E[X_A] = Pr{A}`.
Proof: `E[X_A] = E[I{A}] = 1 cdot Pr{A} + 0 cdot Pr{bar(A)} = Pr{A}`.
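The lemma is easy to check empirically. Below is a quick simulation sketch; the concrete event `A` = "a fair die shows an even number" is my own example, not from the lecture:

```python
import random

def estimate_indicator_expectation(trials=100_000, seed=0):
    """Estimate E[X_A] for the event A = {die shows an even number}.

    X_A = I{A} is 1 when A occurs and 0 otherwise, so the sample
    average of X_A should approach Pr{A} = 1/2.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        roll = rng.randint(1, 6)
        total += 1 if roll % 2 == 0 else 0  # X_A = I{A}
    return total / trials
```

Running `estimate_indicator_expectation()` returns a value close to `Pr{A} = 0.5`, as the lemma predicts.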
Lemma. If the candidates are presented in random order, then algorithm Hire-Assistant has an expected hiring cost of `O(c_h ln n)`.
Proof. From before, the hiring cost is `O(m cdot c_h)`, where `m` is the number of candidates hired. From the previous slide, `E[m] = O(ln n)`, so the expected hiring cost is `O(c_h ln n)`.
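The `O(ln n)` expected number of hires can be checked by simulation; in fact the expectation is the harmonic number `H_n = sum_(i=1)^n 1/i ~~ ln n`. A sketch (the helper names are mine):

```python
import random

def hires_on_random_order(n, rng):
    """Count how many times the 'best so far' changes when n distinct
    candidate scores are examined in a uniformly random order.
    Each change of 'best' corresponds to one hire."""
    scores = list(range(n))
    rng.shuffle(scores)
    best = -1
    hires = 0
    for s in scores:
        if s > best:
            best = s
            hires += 1
    return hires

def average_hires(n=100, trials=2000, seed=1):
    """Average number of hires over many random orderings; should be
    close to the harmonic number H_n."""
    rng = random.Random(seed)
    return sum(hires_on_random_order(n, rng) for _ in range(trials)) / trials
```

For `n = 100`, `average_hires()` comes out near `H_100 ~~ 5.19`, consistent with the `O(ln n)` bound.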
Randomized-Hire-Assistant(n)
  randomly permute the list of candidates
  best := 0 // dummy candidate
  for i := 1 to n
      interview candidate i
      if candidate i is better than candidate best
          best := i
          hire candidate i
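The pseudocode above can be sketched in Python as follows. Here candidates are modeled as comparable scores (higher is better), an assumption not made explicit in the pseudocode:

```python
import random

def randomized_hire_assistant(candidates, rng=None):
    """Sketch of Randomized-Hire-Assistant: randomly permute the
    candidates, then interview in order, hiring whenever the current
    candidate beats the best seen so far.

    Returns the hired candidates in hiring order.
    """
    rng = rng or random.Random()
    pool = list(candidates)
    rng.shuffle(pool)            # randomly permute the list of candidates
    best = None                  # dummy candidate: worse than everyone
    hired = []
    for c in pool:               # interview each candidate in turn
        if best is None or c > best:
            best = c
            hired.append(c)      # hire candidate c
    return hired
```

Whatever the random order, the hires form a strictly improving sequence, and the overall best candidate is always the last one hired.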
Permute-By-Sorting(A)
  n := length[A]
  for i := 1 to n
      P[i] := Random(1, n^3)
  sort A, using P as sort keys
  return A
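A Python sketch of Permute-By-Sorting; the range `[1, n^3]` makes the probability that any two priorities collide small (roughly at most `1/n`):

```python
import random

def permute_by_sorting(a, rng=None):
    """Sketch of Permute-By-Sorting: give each element a random
    priority in [1, n^3], then sort the elements by priority."""
    rng = rng or random.Random()
    n = len(a)
    p = [rng.randint(1, n ** 3) for _ in range(n)]  # P[i] := Random(1, n^3)
    # sort A, using P as sort keys (stable sort breaks rare priority ties
    # by original position)
    return [x for _, x in sorted(zip(p, a), key=lambda t: t[0])]
```

For example, `permute_by_sorting(list(range(10)))` returns the same ten values in a random order.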
Lemma. Procedure Permute-By-Sorting produces a uniform random permutation of the input, assuming that the priorities are distinct.
Proof. Let `sigma:[1 .. n] ->[1..n]` be a permutation, `sigma(i)` being where `i` goes under this permutation. Let `X_i` be the indicator that `A[i]` receives the `sigma(i)`th smallest priority. That is, it indicates that `i` will be mapped correctly after sorting by priorities. So if `X_i` holds then after sorting the element with original value `i` stored in `A[i]` gets mapped to `A[sigma(i)]`. By the definition of conditional probability,
`Pr{Y|X} = (Pr{X cap Y})/(Pr{X})`, so `Pr{X cap Y} = Pr{X} cdot Pr{Y|X}`.
Using this, we have
`Pr{X_1 cap ... cap X_n} = Pr{X_1 cap ... cap X_(n-1)} cdot Pr{X_n | X_1 cap ... cap X_(n-1)}`.
Continuing to expand, we get:
`Pr{X_1 cap ... cap X_n} = Pr{X_1} cdot Pr{X_2|X_1} cdots Pr{X_n | X_1 cap ... cap X_(n-1)}`.
We can now fill in some of these values:
`Pr{X_1} = 1/n`, the probability that the first priority chosen out of `n` is the `sigma(1)`th smallest.
`Pr{X_i|X_1 cap... cap X_(i-1)} = 1/(n - i + 1)`.
This is because, given that the first `i - 1` elements received the correct priorities, each of the remaining `n - i + 1` elements is equally likely to receive the `sigma(i)`th smallest priority.
So
`Pr{X_1 cap ... cap X_n} = 1/n cdot 1/(n-1) cdot ... cdot 1/2 cdot 1/1 = 1/(n!)`.
As `sigma` was arbitrary, any permutation is equally likely.
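The conclusion can be checked empirically for a small `n`: running Permute-By-Sorting many times on a 3-element array, each of the `3! = 6` permutations should come out about equally often. A sketch (function name and trial counts are my own):

```python
import random
from collections import Counter

def permutation_frequencies(n=3, trials=60_000, seed=2):
    """Run Permute-By-Sorting repeatedly on [0..n-1] and tally which
    permutation results.  With a uniform generator, each of the n!
    permutations should appear about trials/n! times.  (For tiny n,
    priority collisions in [1, n^3] occur occasionally; ties are broken
    by element value, which perturbs the counts only slightly.)"""
    rng = rng_counts = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        a = list(range(n))
        p = [rng.randint(1, n ** 3) for _ in range(n)]
        counts[tuple(x for _, x in sorted(zip(p, a)))] += 1
    return counts
```

With 60,000 trials and `n = 3`, each of the six permutations is observed close to the expected 10,000 times.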