This algorithm is essentially a disguised binary search. Given array A of size n, look at element A[m], where m = n/2. If A[m] > m, then A[j] > j for every index j in A[m..n-1], because moving right raises the value by at least one (the values are distinct integers in increasing order) but raises the index by exactly one, so the answer can't be in that part of the array. Symmetrically, if A[m] < m, then A[j] < j for every index j in A[0..m], and the answer can't be there. If A[m] = m, then the algorithm's done.
Each step in the algorithm throws away half the array, which leads to logarithmic worst-case behavior.
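A minimal sketch of the search, assuming a holds distinct integers sorted in increasing order with 0-based indexing (the name fixed_point and the -1 not-found convention are illustrative, not part of the exercise):

    def fixed_point(a):
        """Return an index i with a[i] == i, or -1 if none exists."""
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            m = (lo + hi) // 2
            if a[m] == m:          # the algorithm's done
                return m
            elif a[m] > m:         # no answer at or right of m
                hi = m - 1
            else:                  # a[m] < m: no answer at or left of m
                lo = m + 1
        return -1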
A problem of size n results in two sub-problems of size n/2 plus a constant amount of work for division and recombination.
T(n) = 2T(n/2) + O(1)

Applying the same reasoning one more time gives

T(n/2) = 2T(n/4) + O(1)

Substituting back gives

T(n) = 2(2T(n/4) + O(1)) + O(1) = 4T(n/4) + 3·O(1)

This substitution can be repeated at most ⌈log n⌉ times to give

T(n) = nT(n/n) + (n − 1)·O(1)

The trivial problem T(1) can be solved in constant time, so the first term is n·O(1), and T(n) is of linear order.
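As one hypothetical instance of this recurrence (not the exercise's actual problem), a divide-and-conquer maximum fits the pattern: two half-size sub-problems, with constant work for division and recombination.

    def subproblem_max(a, lo=0, hi=None):
        """Return the maximum of a[lo:hi] by solving both halves recursively."""
        if hi is None:
            hi = len(a)
        if hi - lo == 1:               # trivial problem: constant time
            return a[lo]
        mid = (lo + hi) // 2           # division: constant work
        left = subproblem_max(a, lo, mid)
        right = subproblem_max(a, mid, hi)
        return max(left, right)        # recombination: constant work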
Partitioning is the part of quicksort to think about. Given an array of length n and a pivot value A[i], 0 ≤ i < n, a partition re-arranges the array's elements so that every element to the left of the pivot's final position is at most the pivot value and every element to the right of it is at least the pivot value.
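One possible realization is a Lomuto-style partition; this sketch assumes 0-based indexing and works on the sub-array a[lo:hi] so it can be reused below (the signature is an assumption, not a prescribed interface). A Hoare-style partition with two converging indices would also do; Lomuto's scheme is shown because it is shorter.

    def partition(a, lo, hi, i):
        """Partition a[lo:hi] about the pivot value a[i]; return the pivot's final index."""
        pivot = a[i]
        a[i], a[hi - 1] = a[hi - 1], a[i]          # move the pivot out of the way
        store = lo
        for j in range(lo, hi - 1):
            if a[j] < pivot:                       # smaller elements go to the left side
                a[j], a[store] = a[store], a[j]
                store += 1
        a[store], a[hi - 1] = a[hi - 1], a[store]  # drop the pivot into place
        return store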
If the array has size one, the answer's clear: its only element is the median. Otherwise, partition the array about the element at the median index m = n/2. If, after partitioning, that element still sits at index m, then stop: it is the median. Otherwise the median element has shifted to one side of the array; partition again on the half of the array containing the median. This can repeat at most ⌈log n⌉ times in the worst case.
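Under the same assumptions, a sketch of the search just described, reusing the hypothetical partition above; each pass narrows the live range [lo, hi) to the side that still contains the median's rank m.

    def median(a):
        """Return the element of rank len(a)//2 via repeated partitioning."""
        m = len(a) // 2                # the median's index in sorted order
        lo, hi = 0, len(a)
        while True:
            p = partition(a, lo, hi, (lo + hi) // 2)
            if p == m:                 # the pivot settled at the median index: stop
                return a[p]
            elif p > m:                # the median lies to the pivot's left
                hi = p
            else:                      # the median lies to the pivot's right
                lo = p + 1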
By an argument similar to that given in the previous question, each worst-case partitioning pass takes O(n) work; in this case it is the division step itself that takes linear rather than constant time. However, because each pass works on an array half the size of the previous one, the cost of the linear division effort can be rolled into the over-all cost for a total linear effort.
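Concretely, if each pass works on a range half the size of the previous one, the total partitioning effort is at most n + n/2 + n/4 + ⋯ + 1 < 2n steps, which is linear.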