Quicksort
Introduction to Algorithms
CSE 680
Prof. Roger Crawfis

Review
- Insertion Sort
  - T(n) = Θ(n^2)
  - In-place
- Merge Sort
  - T(n) = Θ(n lg n)
  - Not in-place
- Selection Sort (from homework)
  - T(n) = Θ(n^2)
  - In-place
- Heap Sort
  - T(n) = Θ(n lg n)
  - In-place

Seems pretty good. Can we do better?
Comparison Sorting
Partitioning
- Select the last element A[r] in the subarray A[p..r] as the pivot: the element around which to partition.

Partition(A, p, r)
  x, i := A[r], p – 1;
  for j := p to r – 1 do
    if A[j] ≤ x then
      i := i + 1;
      A[i] ↔ A[j]
  A[i + 1] ↔ A[r];
  return i + 1

- As the procedure executes, the array is partitioned into four (possibly empty) regions:
  1. A[p..i]: all entries in this region are ≤ pivot.
  2. A[i+1..j – 1]: all entries in this region are > pivot.
  3. A[r] = pivot.
  4. A[j..r – 1]: not yet known how they compare to the pivot.
- Conditions 1–3 hold before each iteration of the for loop, and constitute a loop invariant. (4 is not part of the loop invariant.)

Example (Continued)
next iteration: 2 5 3 8 9 4 1 7 10 6
next iteration: 2 5 3 8 9 4 1 7 10 6
next iteration: 2 5 3 4 9 8 1 7 10 6
next iteration: 2 5 3 4 1 8 9 7 10 6
next iteration: 2 5 3 4 1 8 9 7 10 6
next iteration: 2 5 3 4 1 8 9 7 10 6
after final swap: 2 5 3 4 1 6 9 7 10 8
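The Partition pseudocode above can be written out in runnable form. The following is a sketch assuming Java; the class and method names are mine, and the demo uses the example array traced above.

```java
import java.util.Arrays;

public class PartitionDemo {
    // Partition a[p..r] around the pivot a[r], following the pseudocode:
    // i bounds the "<= pivot" region; j scans the unexamined region.
    // Returns the final index of the pivot.
    static int partition(int[] a, int p, int r) {
        int x = a[r];          // pivot = last element
        int i = p - 1;         // a[p..i] holds entries <= pivot
        for (int j = p; j <= r - 1; j++) {
            if (a[j] <= x) {   // a[j] belongs in the low region
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;   // A[i] <-> A[j]
            }
        }
        int t = a[i + 1]; a[i + 1] = a[r]; a[r] = t;   // final swap: pivot into place
        return i + 1;
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 3, 8, 9, 4, 1, 7, 10, 6};     // the example array above
        int q = partition(a, 0, a.length - 1);
        System.out.println(q + " " + Arrays.toString(a));
        // prints: 5 [2, 5, 3, 4, 1, 6, 9, 7, 10, 8]
    }
}
```

The result matches the "after final swap" line of the trace, with the pivot 6 landing at index 5.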
Correctness of Partition
- Use the loop invariant.
- Initialization:
  - Before the first iteration, A[p..i] and A[i+1..j – 1] are empty, so Conds. 1 and 2 are satisfied (trivially).
  - r is the index of the pivot, so Cond. 3 is satisfied.
- Maintenance:
  - Case 1: A[j] ≤ x. Then i is incremented and A[i] ↔ A[j], so the ≤ x region grows by one element and Conds. 1 and 2 still hold.
[Figure: the subarray A[p..r] with indices p, i, j, r marked; the region A[p..i] labeled ≤ x and A[i+1..j – 1] labeled > x, shown before and after the iteration.]
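The invariant can also be checked mechanically. Below is a sketch (assuming Java; the `holds` helper is mine, not from the slides) that verifies Conds. 1 and 2 before every iteration of Partition and fails loudly if they are ever violated.

```java
import java.util.Arrays;

public class InvariantCheck {
    // Check Conds. 1 and 2 for the state (i, j) of Partition:
    // a[p..i] <= pivot and a[i+1..j-1] > pivot. (Cond. 3 holds because
    // the pivot a[r] is untouched until the final swap.)
    static boolean holds(int[] a, int p, int r, int i, int j) {
        int x = a[r];
        for (int k = p; k <= i; k++)    if (a[k] > x)  return false; // Cond. 1
        for (int k = i + 1; k < j; k++) if (a[k] <= x) return false; // Cond. 2
        return true;
    }

    // Partition from the pseudocode, asserting the invariant each iteration.
    static int partition(int[] a, int p, int r) {
        int x = a[r], i = p - 1;
        for (int j = p; j <= r - 1; j++) {
            if (!holds(a, p, r, i, j)) throw new AssertionError("invariant broken");
            if (a[j] <= x) { i++; int t = a[i]; a[i] = a[j]; a[j] = t; }
        }
        int t = a[i + 1]; a[i + 1] = a[r]; a[r] = t;
        return i + 1;
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 3, 8, 9, 4, 1, 7, 10, 6};
        System.out.println(partition(a, 0, a.length - 1) + " " + Arrays.toString(a));
    }
}
```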
Complexity of Partition
- Partition runs in Θ(n) time on an n-element subarray: the loop examines each element once.

Quicksort Overview

static void Quicksort(int[] array, int left, int right) {
    if (left < right) {
        int p = Partition(array, left, right);
        Quicksort(array, left, p - 1);
        Quicksort(array, p + 1, right);
    }
}

Partitioning in Quicksort
- Partition divides the array around a pivot; Quicksort then recurses on the two sides.

Alternative Partitioning

static int Partition(int[] a, int left, int right) {
    int p = a[left], l = left + 1, r = right;   // pivot is the first element
    while (true) {
        while (l < right && a[l] < p) l++;      // scan right past elements < pivot
        while (r > left && a[r] >= p) r--;      // scan left past elements >= pivot
        if (l >= r) break;
        int temp = a[l]; a[l] = a[r]; a[r] = temp;
    }
    a[left] = a[r];                             // move last element < pivot to the front
    a[r] = p;                                   // place the pivot in its final position
    return r;
}
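The two routines can be combined into a self-contained sketch (assuming Java; the class name and the random-input check against `java.util.Arrays.sort` are mine):

```java
import java.util.Arrays;
import java.util.Random;

public class QuicksortDemo {
    // First-element-pivot partition, in the style of the alternative version.
    static int partition(int[] a, int left, int right) {
        int p = a[left], l = left + 1, r = right;
        while (true) {
            while (l < right && a[l] < p) l++;   // scan right past elements < pivot
            while (r > left && a[r] >= p) r--;   // scan left past elements >= pivot
            if (l >= r) break;
            int t = a[l]; a[l] = a[r]; a[r] = t;
        }
        a[left] = a[r];                          // last element < pivot goes to the front
        a[r] = p;                                // pivot lands in its final slot
        return r;
    }

    static void quicksort(int[] a, int left, int right) {
        if (left < right) {
            int p = partition(a, left, right);
            quicksort(a, left, p - 1);           // sort elements before the pivot
            quicksort(a, p + 1, right);          // sort elements after the pivot
        }
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        int[] a = rnd.ints(1000, 0, 100).toArray();  // random input, with duplicates
        int[] expected = a.clone();
        Arrays.sort(expected);
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.equals(a, expected));   // prints: true
    }
}
```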
Analysis of quicksort: best case
- We cut the array size in half each time.
- So the depth of the recursion is log2 n.
- At each level of the recursion, all the partitions at that level do work that is linear in n.
- O(log2 n) * O(n) = O(n log2 n)
- Hence in the best case, quicksort has time complexity O(n log2 n).
- What about the worst case?

Partitioning at various levels
- In the worst case, partitioning always divides the size n array into these three parts:
  - A length one part, containing the pivot itself
  - A length zero part, and
  - A length n – 1 part, containing everything else
- We don't recur on the zero-length part.
- Recurring on the length n – 1 part requires (in the worst case) recurring to depth n – 1.
Worst case partitioning
- In the worst case, recursion may be n levels deep (for an array of size n).
- But the partitioning work done at each level is still n.
- O(n) * O(n) = O(n^2)

Worst case for quicksort
- So the worst case for Quicksort is O(n^2).
- When does this happen?
  - There are many arrangements that could make this happen.
  - Here are two common cases:
    - When the array is already sorted
    - When the array is inversely sorted (sorted in the opposite order)
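The n-level recursion can be observed directly by instrumenting quicksort with a depth counter and feeding it an already-sorted array. This is a sketch: the `maxDepth` field and the first-element-pivot partition are my own scaffolding.

```java
public class WorstCaseDemo {
    static int maxDepth = 0;

    // First-element-pivot partition; on sorted input the pivot is the minimum,
    // so it always returns `left` and leaves a length n-1 part to recurse on.
    static int partition(int[] a, int left, int right) {
        int p = a[left], l = left + 1, r = right;
        while (true) {
            while (l < right && a[l] < p) l++;
            while (r > left && a[r] >= p) r--;
            if (l >= r) break;
            int t = a[l]; a[l] = a[r]; a[r] = t;
        }
        a[left] = a[r];
        a[r] = p;
        return r;
    }

    static void quicksort(int[] a, int left, int right, int depth) {
        maxDepth = Math.max(maxDepth, depth);    // record how deep we are
        if (left < right) {
            int p = partition(a, left, right);
            quicksort(a, left, p - 1, depth + 1);
            quicksort(a, p + 1, right, depth + 1);
        }
    }

    public static void main(String[] args) {
        int n = 100;
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) sorted[i] = i;   // already sorted: worst case
        quicksort(sorted, 0, n - 1, 0);
        System.out.println("max recursion depth on sorted input: " + maxDepth);
        // prints: max recursion depth on sorted input: 99
    }
}
```

The depth reaches n – 1 = 99, matching the worst-case analysis above; a random permutation of the same size typically recurses far less deeply.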
Typical case for quicksort
- If the array is sorted to begin with, Quicksort is terrible: O(n^2).
- It is possible to construct other bad cases.
- However, Quicksort is usually O(n log2 n).
- The constants are so good that Quicksort is generally the faster algorithm.
- Most real-world sorting is done by Quicksort.

Picking a better pivot
- Before, we picked the first element of the subarray to use as a pivot.
  - If the array is already sorted, this results in O(n^2) behavior.
  - It's no better if we pick the last element.
- We could do an optimal quicksort (guaranteed O(n log n)) if we always picked a pivot value that exactly cuts the array in half.
  - Such a value is called a median: half of the values in the array are larger, half are smaller.
  - The easiest way to find the median is to sort the array and pick the value in the middle (!)
Median of three
- Obviously, it doesn't make sense to sort the array in order to find the median to use as a pivot.
- Instead, compare just three elements of our (sub)array: the first, the last, and the middle.
  - Take the median (middle value) of these three as the pivot.
  - It's possible (but not easy) to construct cases which will make this technique O(n^2).

Quicksort for Small Arrays
- For very small arrays (N <= 20), quicksort does not perform as well as insertion sort.
- A good cutoff range is N = 10.
- Switching to insertion sort for small arrays can save about 15% in the running time.
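The two ideas above, median-of-three pivot selection and an insertion-sort cutoff, can be combined in one sketch. Assuming Java; all names are mine, the partition is the last-element style from earlier, and the cutoff uses the slides' suggested N = 10.

```java
import java.util.Arrays;

public class MedianOfThree {
    static final int CUTOFF = 10;   // switch to insertion sort at or below this size

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    // Order first, middle, last, then move the median to a[right] so the
    // usual last-element partition can run unchanged.
    static int medianOfThree(int[] a, int left, int right) {
        int mid = (left + right) / 2;
        if (a[mid] < a[left])   swap(a, mid, left);
        if (a[right] < a[left]) swap(a, right, left);
        if (a[right] < a[mid])  swap(a, right, mid);
        swap(a, mid, right);    // median of the three now sits at a[right]
        return a[right];
    }

    static int partition(int[] a, int left, int right) {
        int x = medianOfThree(a, left, right);   // pivot = median of three
        int i = left - 1;
        for (int j = left; j < right; j++)
            if (a[j] <= x) swap(a, ++i, j);
        swap(a, i + 1, right);
        return i + 1;
    }

    static void insertionSort(int[] a, int left, int right) {
        for (int j = left + 1; j <= right; j++) {
            int key = a[j], i = j - 1;
            while (i >= left && a[i] > key) { a[i + 1] = a[i]; i--; }
            a[i + 1] = key;
        }
    }

    static void quicksort(int[] a, int left, int right) {
        if (right - left + 1 <= CUTOFF) {        // small subarray: insertion sort
            insertionSort(a, left, right);
            return;
        }
        int p = partition(a, left, right);
        quicksort(a, left, p - 1);
        quicksort(a, p + 1, right);
    }

    public static void main(String[] args) {
        int[] a = {2, 5, 3, 8, 9, 4, 1, 7, 10, 6, 12, 11, 0, 15, 14, 13};
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // prints the values 0..15 in order
    }
}
```

Note that the cutoff also keeps `medianOfThree` safe: partition is only called on subarrays of at least 11 elements, so the three sampled positions are always distinct.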
Mergesort vs Quicksort