
Divide and Conquer

• divide the problem into a number of subproblems
• conquer the subproblems (solve them)
• combine the subproblem solutions to get the solution to the original problem

Note: often the “conquer” step is done recursively
1 12/07/21 03:35 PM http://www.nurul.com
Divide and Conquer
• A common approach to solving a problem is to partition the problem into smaller parts, find solutions for the parts, and then combine the solutions for the parts into a solution for the whole.
• This approach, especially when used recursively, often yields efficient solutions to problems in which the sub-problems are smaller versions of the original problem. We illustrate the technique with some examples followed by an analysis of the resulting recurrence equations.


Recursive algorithms

• to solve a given problem, they call themselves recursively one or more times to deal with closely related subproblems
• usually the subproblems are smaller in size than the `parent' problem
• divide-and-conquer algorithms are often recursive


Algorithm for General Divide and Conquer Sorting

Begin Algorithm
Start Sort(L)
	If L has length greater than 1 then
	Begin
		Partition the list into two lists, high and low
		Start Sort(high)
		Start Sort(low)
		Combine high and low
	End
End Algorithm


Analyzing Divide-and-Conquer Algorithms

• When an algorithm contains a recursive call to itself, its running time can often be described by a recurrence equation, which describes the overall running time on a problem of size n in terms of the running time on smaller inputs.
• For divide-and-conquer algorithms, we get recurrences that look like:

T(n) = Θ(1)                      if n < c
T(n) = aT(n/b) + D(n) + C(n)     otherwise


Example: Towers of Hanoi Again!!

• Problem: Move n disks from peg A to B using peg C as a temporary peg
• Divide and Conquer approach:
• Two subproblems of size n-1:
	(1) Move the n-1 smallest disks from A to C
	(*) Move the nth smallest disk from A to B (easy)
	(2) Move the n-1 smallest disks from C to B
• Moving the n-1 smallest disks is done by recursive application of the method
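The three-step decomposition above can be sketched in Python (a minimal sketch; the function name, peg labels, and the `moves` list are illustrative, not part of the original slides):

```python
def hanoi(n, source, target, spare, moves):
    """Move n disks from source to target using spare as a temporary peg."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)   # (1) n-1 smallest disks: A -> C
    moves.append((source, target))               # (*) nth smallest disk:  A -> B
    hanoi(n - 1, spare, target, source, moves)   # (2) n-1 smallest disks: C -> B

moves = []
hanoi(3, "A", "B", "C", moves)
print(len(moves))  # 2^3 - 1 = 7 moves
```

As the two recursive calls suggest, the move count satisfies M(n) = 2M(n-1) + 1, i.e. 2^n - 1 moves in total.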


Example: Merge Sort

• divide the n-element sequence to be sorted into two n/2-element sequences
• conquer: sort the subproblems recursively using merge sort
• combine: merge the resulting two sorted n/2-element sequences


Example: Merge Sort Analysis

T(n) = Θ(1)             if n < c
T(n) = 2T(n/2) + Θ(n)   otherwise

a = 2 (two subproblems)
n/b = n/2 (each subproblem has size approx. n/2)
D(n) = Θ(1) (just compute midpoint of array)
C(n) = Θ(n) (merging can be done by scanning sorted subarrays)


Finding The Maximum and Minimum of N Elements

• Consider the problem of finding both the maximum and the minimum elements of a set S containing n elements.
• For simplicity we shall assume that n is a power of 2.
• One obvious way to find the maximum and minimum elements would be to find each separately.
Algorithm

• find the maximum element of S in n - 1 comparisons between elements of S
• then find the minimum of the remaining n - 1 elements with n - 2 comparisons, giving a total of 2n - 3 comparisons to find the maximum and minimum, assuming n ≥ 2


Algorithm (cont.)

• The divide-and-conquer approach would divide the set S into two subsets S1 and S2, each with n/2 elements.
• The algorithm would then find the maximum and minimum elements of each of the two halves, by recursive applications of the algorithm.


Pseudocode

Procedure MAXMIN(S):
/* This procedure returns the max and min elements of S. */
begin
	case S = {a} : return (a, a)
	case S = {a, b} : return (MAX(a, b), MIN(a, b))
	else : divide S into two subsets S1 and S2, each with half the elements;
		(max1, min1) ← MAXMIN(S1);
		(max2, min2) ← MAXMIN(S2);
		return (MAX(max1, max2), MIN(min1, min2))
end

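The MAXMIN procedure translates almost directly into Python (a sketch; representing S as a list and the helper name `maxmin` are assumptions for illustration):

```python
def maxmin(s):
    """Return (max, min) of a non-empty list s by divide and conquer."""
    if len(s) == 1:                       # case S = {a}
        return s[0], s[0]
    if len(s) == 2:                       # case S = {a, b}
        a, b = s
        return (a, b) if a > b else (b, a)
    mid = len(s) // 2                     # divide S into S1 and S2
    max1, min1 = maxmin(s[:mid])
    max2, min2 = maxmin(s[mid:])
    return max(max1, max2), min(min1, min2)

print(maxmin([31, 5, 7, 2, 11, 17, 23, 4]))  # (31, 2)
```

For n a power of 2 this uses 3n/2 - 2 comparisons, fewer than the 2n - 3 of the obvious method.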


Merge Sort

• Apply divide-and-conquer to the sorting problem
• Problem: Given n elements, sort elements into non-decreasing order
• Divide-and-Conquer:
– If n=1, terminate (every one-element list is already sorted)
– If n>1, partition elements into two or more sub-collections; sort each; combine into a single sorted list
• How do we partition?


Partitioning - Choice 1

• First n-1 elements into set A, last element into set B
• Sort A using this partitioning scheme recursively
• B already sorted
• Combine A and B using method Insert() (= insertion into sorted array)
• Leads to recursive version of InsertionSort()
• Number of comparisons: O(n^2)
– Best case = n-1
– Worst case = Σ(i=2..n) (i-1) = n(n-1)/2


Partitioning - Choice 2

• Put element with largest key in B, remaining elements in A
• Sort A recursively
• To combine sorted A and B, append B to sorted A
– Use Max() to find largest element → recursive SelectionSort()
– Use bubbling process to find and move largest element to right-most position → recursive BubbleSort()
• All O(n^2)


Partitioning - Choice 3

• Let’s try to achieve balanced partitioning
• A gets n/k elements, B gets rest
• Sort A and B recursively
• Combine sorted A and B using a process called merge, which combines two sorted lists into one
• How? Just like merging sorted runs when doing an external sort
Merge Sort

• Idea:
– Take the array you would like to sort and divide it in half to create 2 unsorted subarrays.
– Next, sort each of the 2 subarrays.
– Finally, merge the 2 sorted subarrays into 1 sorted array.
• Efficiency: O(n log2 n)




Merge Sort

• Although the merge step produces a sorted array, we have overlooked a very important step.
• How did we sort the 2 halves before performing the merge step?

We used merge sort!


Merge Sort
• By continually calling the merge sort algorithm, we eventually get a subarray of size 1.
• Since an array with only 1 element is clearly sorted, we can back out and merge 2 arrays of size 1.




Merge Sort
• The basic merging algorithm consists of:
– 2 input arrays (arrayA and arrayB)
– An output array (arrayC)
– 3 position holders (indexA, indexB, indexC), which are initially set to the beginning of their respective arrays.


Merge Sort
• The smaller of arrayA[indexA] and arrayB[indexB] is copied into arrayC[indexC] and the appropriate position holders are advanced.
• When either input list is exhausted, the remainder of the other list is copied into arrayC.


Merge Sort

Worked example: merging arrayA = [1, 13, 24, 26] and arrayB = [2, 15, 27, 38] into arrayC. At each step we compare arrayA[indexA] with arrayB[indexB]; whichever value is smaller is placed into arrayC[indexC].

1 < 2, so insert arrayA[indexA] into arrayC[indexC]: arrayC = [1]
2 < 13, so insert arrayB[indexB] into arrayC[indexC]: arrayC = [1, 2]
13 < 15, so insert arrayA[indexA] into arrayC[indexC]: arrayC = [1, 2, 13]
15 < 24, so insert arrayB[indexB] into arrayC[indexC]: arrayC = [1, 2, 13, 15]
24 < 27, so insert arrayA[indexA] into arrayC[indexC]: arrayC = [1, 2, 13, 15, 24]
26 < 27, so insert arrayA[indexA] into arrayC[indexC]: arrayC = [1, 2, 13, 15, 24, 26]
Since we have exhausted one of the arrays, arrayA, we simply copy the remaining items from the other array, arrayB, into arrayC: arrayC = [1, 2, 13, 15, 24, 26, 27, 38]


Merge Sort Pseudocode

mergesort(list, first, last) {
	if( first < last )
		mid = (first + last)/2
		// Sort the 1st half of the list
		mergesort(list, first, mid)
		// Sort the 2nd half of the list
		mergesort(list, mid+1, last)
		// Merge the 2 sorted halves
		merge(list, first, mid, last)
	end if
}
Merge Sort Pseudocode (cont)

merge(list, first, mid, last) {
	// Initialize the first and last indices of our subarrays
	firstA = first
	lastA = mid
	firstB = mid+1
	lastB = last

	index = firstA // Index into our temp array


Merge Sort Pseudocode (cont)

	// Start the merging
	loop( firstA <= lastA AND firstB <= lastB )
		if( list[firstA] < list[firstB] )
			tempArray[index] = list[firstA]
			firstA = firstA + 1
		else
			tempArray[index] = list[firstB]
			firstB = firstB + 1
		end if
		index = index + 1
	end loop
Merge Sort Pseudocode (cont)

	// At this point, one of our subarrays is empty
	// Now go through and copy any remaining items
	// from the non-empty array into our temp array
	loop (firstA <= lastA)
		tempArray[index] = list[firstA]
		firstA = firstA + 1
		index = index + 1
	end loop
	loop (firstB <= lastB)
		tempArray[index] = list[firstB]
		firstB = firstB + 1
		index = index + 1
	end loop
Merge Sort Pseudocode (cont)

	// Finally, we copy our temp array back into
	// our original array
	index = first
	loop (index <= last)
		list[index] = tempArray[index]
		index = index + 1
	end loop
}

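The mergesort/merge pseudocode above can be rendered as runnable Python (a sketch that mirrors the pseudocode's index conventions; names such as `merge_sort` and the explicit `temp` parameter are illustrative):

```python
def merge_sort(lst, first, last, temp):
    """Sort lst[first..last] in place, following the pseudocode above."""
    if first < last:
        mid = (first + last) // 2
        merge_sort(lst, first, mid, temp)      # sort the 1st half
        merge_sort(lst, mid + 1, last, temp)   # sort the 2nd half
        merge(lst, first, mid, last, temp)     # merge the 2 sorted halves

def merge(lst, first, mid, last, temp):
    first_a, last_a = first, mid
    first_b, last_b = mid + 1, last
    index = first_a                            # index into our temp array
    while first_a <= last_a and first_b <= last_b:
        if lst[first_a] < lst[first_b]:
            temp[index] = lst[first_a]; first_a += 1
        else:
            temp[index] = lst[first_b]; first_b += 1
        index += 1
    while first_a <= last_a:                   # copy leftovers from A
        temp[index] = lst[first_a]; first_a += 1; index += 1
    while first_b <= last_b:                   # copy leftovers from B
        temp[index] = lst[first_b]; first_b += 1; index += 1
    for i in range(first, last + 1):           # copy temp back into lst
        lst[i] = temp[i]

data = [13, 1, 24, 26, 2, 27, 38, 15]
merge_sort(data, 0, len(data) - 1, [0] * len(data))
print(data)  # [1, 2, 13, 15, 24, 26, 27, 38]
```

The single temp array allocated once up front avoids reallocating scratch space on every recursive call.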


Evaluation

• Recurrence equation:
• Assume n is a power of 2

T(n) = c1               if n = 1
T(n) = 2T(n/2) + c2n    if n > 1, n = 2^k


Solution

By substitution:
T(n) = 2T(n/2) + c2n
T(n/2) = 2T(n/4) + c2n/2
T(n) = 4T(n/4) + 2c2n
T(n) = 8T(n/8) + 3c2n
...
T(n) = 2^i T(n/2^i) + i·c2n


Solution (cont.)

Assuming n = 2^k, expansion halts when we get T(1) on the right side; this happens when i = k:
T(n) = 2^k T(1) + k·c2n
Since 2^k = n, we know k = log n; since T(1) = c1, we get
T(n) = c1n + c2n log n; thus an upper bound for TmergeSort(n) is O(n log n)
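The closed form can be sanity-checked numerically against the recurrence (the constants c1 = c2 = 1 are chosen purely for illustration):

```python
from math import log2

# Recurrence: T(1) = c1, T(n) = 2T(n/2) + c2*n, with c1 = c2 = 1.
# Closed form claimed above: T(n) = c1*n + c2*n*log2(n).
def t(n):
    return 1 if n == 1 else 2 * t(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert t(n) == n + n * log2(n)
print("closed form matches the recurrence for n = 2, 4, ..., 1024")
```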
Quick Sort

• In practice, often the most efficient general-purpose internal sorting routine
• In essence: sort array A by picking some key value v in the array as a pivot element
• three segments: left, pivot, right
• all elements in left are smaller than pivot, all elements in right are larger than or equal to pivot
• sort elements in left and right; no merge required to combine
• hope that pivot is near the median, so that left and right are the same size


Quick Sort
• In the bubble sort, consecutive items are compared and possibly exchanged on each pass through the list.
• This means that many exchanges may be needed to move an element to its correct position.
• Quick sort is more efficient than bubble sort because a typical exchange involves elements that are far apart, so fewer exchanges are required to correctly position an element.
Quick Sort
• Each iteration of the quick sort selects an element, known as the pivot, and divides the list into 3 groups:
– Elements whose keys are less than (or equal to) the pivot’s key.
– The pivot element
– Elements whose keys are greater than (or equal to) the pivot’s key.


Quick Sort
• The sorting then continues by quick sorting the left partition followed by quick sorting the right partition.
• The basic algorithm is as follows:


Quick Sort
• Partitioning Step: Take an element in the unsorted array and determine its final location in the sorted array. This occurs when all values to the left of the element in the array are less than (or equal to) the element, and all values to the right of the element are greater than (or equal to) the element. We now have 1 element in its proper location and two unsorted subarrays.
• Recursive Step: Perform step 1 on each unsorted subarray.
Quick Sort
• Each time step 1 is performed on a subarray, another element is placed in its final location of the sorted array, and two unsorted subarrays are created.
• When a subarray consists of one element, that subarray is sorted.
• Therefore that element is in its final location.


Quick Sort
• There are several partitioning strategies used in practice (i.e., several “versions” of quick sort), but the one we are about to describe is known to work well.
• For simplicity we will choose the last element to be the pivot element.
• We could also choose a different pivot element and swap it with the last element in the array.


Quick Sort
• Below is the array we would like to sort:

1 4 8 9 0 11 5 10 7 6


Quick Sort
• The index left starts at the first element and right starts at the next-to-last element; the pivot is the last element, 6.

1 4 8 9 0 11 5 10 7 6   (left on 1, right on 7)

• We want to move all the elements smaller than the pivot to the left part of the array and all the elements larger than the pivot to the right part.


Quick Sort
• We move left to the right, skipping over elements that are smaller than the pivot.

1 4 8 9 0 11 5 10 7 6   (left stops on 8, right still on 7)


Quick Sort
• We then move right to the left, skipping over elements that are greater than the pivot.

1 4 8 9 0 11 5 10 7 6   (left on 8, right stops on 5)

• When left and right have stopped, left is on an element greater than (or equal to) the pivot and right is on an element smaller than (or equal to) the pivot.
Quick Sort
• If left is to the left of right (or if left = right), those elements are swapped.

1 4 8 9 0 11 5 10 7 6   (left on 8, right on 5)

1 4 5 9 0 11 8 10 7 6   (after swapping 8 and 5)
Quick Sort
• The effect is to push a large element to the right and a small element to the left.
• We then repeat the process until left and right cross.


Quick Sort
1 4 5 9 0 11 8 10 7 6   (left advances and stops on 9)

1 4 5 9 0 11 8 10 7 6   (right moves left and stops on 0)

1 4 5 0 9 11 8 10 7 6   (after swapping 9 and 0)
Quick Sort

1 4 5 0 9 11 8 10 7 6   (left advances and stops on 9)

1 4 5 0 9 11 8 10 7 6   (right moves left and stops on 0; right is now to the left of left)


Quick Sort
1 4 5 0 9 11 8 10 7 6   (right on 0, left on 9)

• At this point, left and right have crossed so no swap is performed.
• The final part of the partitioning is to swap the pivot element with left.

1 4 5 0 6 11 8 10 7 9   (pivot 6 swapped with the 9 under left)
Quick Sort
• Note that all elements to the left of the pivot are less than (or equal to) the pivot and all elements to the right of the pivot are greater than (or equal to) the pivot.
• Hence, the pivot element has been placed in its final sorted position.

1 4 5 0 6 11 8 10 7 9
Quick Sort
• We now repeat the process using the sub-arrays to the left and right of the pivot.

1 4 5 0 6 11 8 10 7 9

1 4 5 0   and   11 8 10 7 9
Quick Sort Pseudocode

quicksort(list, leftMostIndex, rightMostIndex) {
	if( leftMostIndex < rightMostIndex )
		pivot = list[rightMostIndex]
		left = leftMostIndex
		right = rightMostIndex - 1
		loop (left <= right)
			// Find key on left that belongs on right
			// (make sure not to run off the array)
			loop (list[left] < pivot)
				left = left + 1
			end loop
			// Find key on right that belongs on left
			loop (right >= left AND list[right] > pivot)
				right = right - 1
			end loop
Quick Sort Pseudocode (cont)

			// Swap out-of-order elements. Why swap if left = right?
			// Must account for special case of list[left] = list[right] = pivot
			if (left <= right)
				swap(list, left, right)
				left = left + 1
				right = right - 1
			end if
		end loop
		// Move the pivot element to its correct location
		swap(list, left, rightMostIndex)
		// Continue splitting list and sorting
		quickSort(list, leftMostIndex, right)
		quickSort(list, left+1, rightMostIndex)
	end if
}
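The pseudocode above corresponds to the following runnable Python sketch (last-element pivot, as in the walkthrough; variable names are illustrative):

```python
def quicksort(lst, lo, hi):
    """Sort lst[lo..hi] in place; the pivot is the last element, as above."""
    if lo < hi:
        pivot = lst[hi]
        left, right = lo, hi - 1
        while left <= right:
            # find key on left that belongs on right
            # (lst[hi] == pivot stops the scan, so it cannot run off the array)
            while lst[left] < pivot:
                left += 1
            # find key on right that belongs on left
            while right >= left and lst[right] > pivot:
                right -= 1
            if left <= right:                     # swap out-of-order elements
                lst[left], lst[right] = lst[right], lst[left]
                left += 1
                right -= 1
        lst[left], lst[hi] = lst[hi], lst[left]   # pivot to its final location
        quicksort(lst, lo, right)                 # sort left partition
        quicksort(lst, left + 1, hi)              # sort right partition

data = [1, 4, 8, 9, 0, 11, 5, 10, 7, 6]
quicksort(data, 0, len(data) - 1)
print(data)  # [0, 1, 4, 5, 6, 7, 8, 9, 10, 11]
```

The input list is the one from the walkthrough, so the first partition reproduces the steps shown on the preceding slides.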
Quick Sort
• A couple of notes about quick sort:
• There are better ways to choose the pivot value (such as the median-of-three method).
• Also, when the subarrays get small, it becomes more efficient to use insertion sort as opposed to continued use of quick sort.


Choosing The Pivot

• Pivot can be any value in the domain, but does not need to be in S
– can be average of selected elements in S
– can be random selection, but rand() is expensive and adds to the running time of the algorithm
• Pivot usually taken as the median of at least 3 elements or as the middle element of S
Choosing The Pivot (cont.)

• median of 3 is less likely to be nearly the smallest or largest element in the list, so the median is more likely to partition the list into 2 nearly equal halves
– median = (n/2)th largest element
– or pick 3 elements randomly and choose the median; this reduces running time by about 5%
• center element is in a location that divides the keys into halves
• Choice determines performance time
Analysis of Quicksort

• based on choice of pivot
• many versions of the analysis have been done, but all lead to the same results
• running time = time to do two recursive calls + linear time in the partition


Worst Case Analysis

• occurs when the pivot is the smallest or largest element (one sublist has n - 1 entries and the other is empty)
• also if the list is pre-sorted or in reverse order: all elements go into S1 or S2
• since C(0) = 0, we have C(n) = n - 1 + C(n - 1)
• we can solve this recurrence relation by starting at the bottom:
Worst Case Analysis (cont.)

C(1) = 0
C(2) = 1 + C(1) = 1
C(3) = 2 + C(2) = 2 + 1
C(4) = 3 + C(3) = 3 + 2 + 1
...
C(n) = n - 1 + C(n - 1)
     = (n - 1) + (n - 2) + ... + 2 + 1
     = (n - 1)n/2    (sum of integers from 1 to n-1)
     = 1/2 n^2 - 1/2 n
     = O(n^2)
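The closed form n(n-1)/2 can be checked against the recurrence with a few lines of Python (a sketch for illustration):

```python
# Worst-case comparison count: C(1) = 0, C(n) = n - 1 + C(n - 1).
def c(n):
    return 0 if n <= 1 else n - 1 + c(n - 1)

# The recurrence should agree with the closed form n(n-1)/2 everywhere.
for n in range(1, 50):
    assert c(n) == n * (n - 1) // 2
print("C(n) = n(n-1)/2 for n = 1..49")
```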
Worst Case Analysis (cont.)

• Selection sort makes approx. 1/2 n^2 - 1/2 n key comparisons, or O(n^2), in the worst case. So, in its worst case, quicksort is as bad as the worst case of selection sort.
• The number of swaps made by quicksort is about 3 times as many as the worst case of insertion sort.
Average Case Analysis

• Average behavior of quicksort, when applied to lists in random order
• assume all possible orderings of S equally likely, so each pivot position has probability 1/n
• pivot is equally likely to be any element
• number of key comparisons made in the first partition is n-1, as in the worst case
Average Case Analysis (cont.)

• Let C(n) be the average number of comparisons done by quicksort on a list of length n, and C(n,p) be the average number of comparisons on a list of length n where the pivot for the first partition is p.
• Recursion makes C(p-1) comparisons for the sublist of entries less than the pivot and C(n-p) comparisons for the sublist of entries greater than the pivot. Thus, for n > 2:
C(n,p) = n - 1 + C(p - 1) + C(n - p)
Average Case Analysis (cont.)

• To find C(n), take the average of these expressions, since p is random, by adding them from p=1 to p=n and dividing by n. This yields the recurrence relation for the number of comparisons done in the average case:
C(n) = n - 1 + (2/n)(C(0) + C(1) + ... + C(n-1))
Average Case Analysis (cont.)

• To solve: note that if sorting a list of length n-1, we would have the same expression with n replaced by n-1, as long as n > 2:
C(n-1) = n - 2 + (2/(n-1))(C(0) + C(1) + ... + C(n-2))
• Multiply the first expression by n, the second by n-1, and subtract:
nC(n) - (n-1)C(n-1) = n(n-1) - (n-1)(n-2) + 2C(n-1)
Average Case Analysis (cont.)

Rearrange:
(C(n) - 2)/(n+1) = (C(n-1) - 2)/n + 2/(n+1)
Now let S(n) be defined as (C(n) - 2)/(n+1)
Then S(n) = S(n-1) + 2/(n+1)
Average Case Analysis (cont.)

• Solve by working down from n, each time replacing the expression by a smaller case until we reach n=1:
S(n) = 2/(n+1) + S(n-1)
S(n-1) = 2/n + S(n-2)
S(n-2) = 2/(n-1) + S(n-3)
...
S(n) = 2/(n+1) + 2/n + 2/(n-1) + ... + 2/2
     = 2(1/(n+1) + 1/n + 1/(n-1) + ... + 1/2)
     ≈ 2 H(n+1), where H(k) = 1 + 1/2 + ... + 1/k is the k-th harmonic number
Average Case Analysis (cont.)

• Now we can write:
(C(n) - 2)/(n+1) ≈ 2 H(n+1)
(C(n) - 2)/(n+1) = 2 ln n + O(1)
C(n) = (n+1)·2 ln n + (n+1)·O(1)
     = 2n ln n + O(n)
• In its average case, quicksort performs C(n) = 2n ln n + O(n) comparisons of keys in sorting a list of n entries
• ln n = (ln 2)(lg n) and ln 2 ≈ 0.693, so C(n) = 1.386 n lg n + O(n) = O(n log n)
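The asymptotic estimate can be compared with the exact recurrence numerically (a rough check; at moderate n the O(n) term is still large, so only a loose ratio band is asserted):

```python
from math import log

# Exact average-case recurrence from the slides:
#   C(n) = n - 1 + (2/n) * (C(0) + C(1) + ... + C(n-1)),  C(0) = C(1) = 0
N = 2000
C = [0.0] * (N + 1)
prefix = 0.0                       # running sum C(0) + ... + C(n-1)
for n in range(2, N + 1):
    prefix += C[n - 1]
    C[n] = n - 1 + (2.0 / n) * prefix

# Compare with the leading term 2 n ln n; expect a ratio somewhat below 1
# because of the negative O(n) correction.
ratio = C[N] / (2 * N * log(N))
assert 0.6 < ratio < 1.1
print(f"C({N}) / (2 n ln n) = {ratio:.3f}")
```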
Best Case Analysis

• pivot is the middle element; the two subfiles are each half the size of the original
T(n) = 2T(n/2) + cn
T(n)/n = T(n/2)/(n/2) + c   (divide equation by n)
Solve using telescoping sums:
T(n/2)/(n/2) = T(n/4)/(n/4) + c
T(n/4)/(n/4) = T(n/8)/(n/8) + c
T(n/8)/(n/8) = T(n/16)/(n/16) + c
...
Best Case Analysis (cont.)

T(2)/2 = T(1)/1 + c
// Add up all the equations and cancel the matching terms, leaving the following equation.
Since there are log n equations, we have T(n)/n = T(1)/1 + c·log n
This yields: T(n) = c·n log n + n = O(n log n)
• This analysis gives an overestimate of the time complexity, but is acceptable to get O( ) and not Θ( ).
