
Chapter Four

Dynamic Programming and Traversal Techniques



Content
✓ Multistage graphs, all pairs shortest path
✓ Optimal binary search trees
✓ 0/1 Knapsack
✓ Reliability design
✓ Travelling salesman problem
✓ Game trees
✓ Disconnected components
✓ Depth-first search
Introduction: Dynamic Programming
The dynamic programming approach is similar to divide and conquer in breaking the problem down into smaller and smaller sub-problems. But unlike divide and conquer, these sub-problems are not solved independently. Instead, the results of the smaller sub-problems are remembered and reused for similar or overlapping sub-problems.
Dynamic programming is used for problems that can be divided into similar sub-problems whose results can be reused. Mostly, these algorithms are used for optimization problems. Before solving the sub-problem at hand, a dynamic programming algorithm examines the results of the previously solved sub-problems, and the solutions of the sub-problems are combined to achieve the best overall solution. So we can say that:
• The problem should be divisible into smaller overlapping sub-problems.
• An optimum solution can be achieved by using optimum solutions of the smaller sub-problems.
• Dynamic programming algorithms use memoization.
The term "dynamic programming" was coined by the mathematician Richard Bellman in the early 1950s.
Cont…
• Dynamic programming is a technique that breaks a problem into sub-problems and saves their results for future use, so that we do not need to compute a result again. The fact that the overall solution is optimized by optimizing its sub-problems is known as the optimal substructure property. The main use of dynamic programming is to solve optimization problems, i.e., problems where we are trying to find the minimum or the maximum solution. Dynamic programming is guaranteed to find an optimal solution of a problem if one exists.
• The definition of dynamic programming says that it is a technique for solving a complex problem by first breaking it into a collection of simpler subproblems, solving each subproblem just once, and then storing their solutions to avoid repetitive computations.
How does the dynamic programming approach work?
The following are the steps that dynamic programming follows (a sketch illustrating them appears below):
 It breaks down the complex problem into simpler subproblems.
 It finds the optimal solution to these sub-problems.
 It stores the results of the subproblems. This process of storing the results of subproblems is known as memoization.
 It reuses the stored results so that the same sub-problem is not calculated more than once.
 Finally, it calculates the result of the complex problem.


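As a concrete illustration of these five steps, here is a minimal top-down Python sketch: a memoized Fibonacci function that breaks the problem into subproblems, solves each one once, stores the result, and reuses it.

from functools import lru_cache

@lru_cache(maxsize=None)            # step 3: store each subproblem's result
def fib(n):
    if n <= 1:                      # smallest subproblems
        return n
    return fib(n - 1) + fib(n - 2)  # steps 1, 2 and 4: break down, solve, reuse

print(fib(50))  # 12586269025 -- linear work instead of exponential

Without the cache, fib(50) would recompute the same subproblems an exponential number of times; with it, each of the 51 subproblems is solved exactly once.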
Cont..
The above five steps are the basic steps of dynamic programming. Dynamic programming is applicable to problems that have the following properties: overlapping subproblems and optimal substructure. Here, optimal substructure means that the solution of an optimization problem can be obtained by simply combining the optimal solutions of all its subproblems.
In the case of dynamic programming, the space complexity is increased because we store the intermediate results, but the time complexity is decreased.
• Problems that can be solved by dynamic programming satisfy the principle of optimality.
• The principle of optimality states that whatever the initial state and initial decision are, the remaining decision sequence must constitute an optimal decision sequence with regard to the state resulting from the first decision. The principle implies that an optimal decision sequence is composed of optimal decision subsequences. For example, every subpath of a shortest path is itself a shortest path between its endpoints. Since the principle of optimality may not hold for some formulations of some problems, it is necessary to verify that it does hold for the problem being solved. Dynamic programming cannot be applied when this principle does not hold.
Approaches of dynamic programming
• There are two approaches to dynamic programming: the top-down approach and the bottom-up approach.
Top-down approach
The top-down approach follows the memoization technique, while the bottom-up approach follows the tabulation method. Here memoization is equal to the sum of recursion and caching: recursion means the function calling itself, while caching means storing the intermediate results. A sketch contrasting the two approaches follows below.
Advantages
• It is very easy to understand and implement.
• It solves the subproblems only when required.
• It is easy to debug.
Disadvantages
• It uses recursion, which occupies more memory in the call stack. Sometimes, when the recursion is too deep, a stack overflow will occur.
• It occupies more memory, which degrades overall performance.
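For contrast with the top-down version sketched earlier, here is a minimal bottom-up (tabulation) Fibonacci: it fills the table from the smallest subproblem upward, avoiding recursion and the call stack entirely.

def fib_bottom_up(n):
    if n <= 1:
        return n
    prev, curr = 0, 1                  # the table, compressed to its last two entries
    for _ in range(2, n + 1):          # solve subproblems from small to large
        prev, curr = curr, prev + curr
    return curr

print(fib_bottom_up(50))  # 12586269025, same result, with no recursion depth limit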
Comparison
In contrast to greedy algorithms, where local optimization is addressed, dynamic programming algorithms are motivated by an overall optimization of the problem.
In contrast to divide and conquer algorithms, where solutions are combined to achieve an overall solution, dynamic programming algorithms use the output of a smaller sub-problem and then try to optimize a bigger sub-problem. Dynamic programming algorithms use memoization to remember the output of already solved sub-problems.
Example:
The following computer problems can be solved using the dynamic programming approach:
 Fibonacci number series
 Knapsack problem
 All pairs shortest path by Floyd-Warshall
 Tower of Hanoi
 Shortest path by Dijkstra
 Project scheduling
Dynamic programming can be used in both a top-down and a bottom-up manner. And of course, most of the time, referring to the previous solution output is cheaper than recomputing it in terms of CPU cycles.
Multistage graphs
A multistage graph is a directed, weighted graph in which the vertices are partitioned into k ≥ 2 disjoint stages, and every edge goes from a vertex in one stage to a vertex in the next stage. The problem is to find a minimum-cost path from the source (the single vertex in stage 1) to the sink (the single vertex in stage k). Dynamic programming solves this problem in two ways, listed below.
Approaches to Multistage Graph
• Forward Approach
• Backward Approach
In the forward approach, the minimum cost cost(i) of a path from each vertex i to the sink is computed, working backwards from the sink. In the backward approach, the minimum cost bcost(i) of a path from the source to each vertex i is computed, working forwards from the source.
Algorithm for forward approach
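A minimal Python sketch of the standard forward-approach recurrence, cost(i) = min over edges (i, j) of { c(i, j) + cost(j) }, with cost(sink) = 0. It assumes vertices are numbered 1..n in stage order, with vertex 1 the source, vertex n the sink, and every vertex on some source-to-sink path; the function name and edge-list format are illustrative choices.

INF = float('inf')

def multistage_forward(n, edges):
    # edges: dict mapping vertex i to a list of (j, cij) pairs, all forward edges
    cost = [INF] * (n + 1)   # cost[i] = cheapest path cost from i to the sink
    nxt = [0] * (n + 1)      # nxt[i] = next vertex on that cheapest path
    cost[n] = 0              # the sink itself
    for i in range(n - 1, 0, -1):          # vertices in reverse stage order
        for j, cij in edges.get(i, []):
            if cij + cost[j] < cost[i]:
                cost[i] = cij + cost[j]
                nxt[i] = j
    path, v = [1], 1                        # recover the path from the source
    while v != n:
        v = nxt[v]
        path.append(v)
    return cost[1], path

The backward approach is symmetric: it computes bcost(j) = min over edges (i, j) of { bcost(i) + c(i, j) }, scanning vertices in increasing stage order from the source.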

Algorithm for backward approach

Example:

Using forward approach

Using backward approach
All pairs shortest path (Floyd-Warshall)
• The Floyd-Warshall algorithm solves the All Pairs Shortest Path problem: finding the shortest distance/path between every pair of vertices in a given edge-weighted directed graph.
• Step 1: Create a matrix A0 of dimension n*n, where n is the number of vertices. The rows and columns are indexed by i and j respectively, where i and j are vertices of the graph. Each cell A0[i][j] is filled with the distance (edge weight) from the ith vertex to the jth vertex. If there is no edge from the ith vertex to the jth vertex, the cell is left as infinity.
• Step 2: Create matrix A1 from matrix A0. Let k be the intermediate vertex allowed on the shortest path from source to destination; in this step, k is the first vertex. The elements in row k and column k are left as they are. Each remaining cell A1[i][j] is filled with A0[i][k] + A0[k][j] if A0[i][j] > A0[i][k] + A0[k][j], and with A0[i][j] otherwise.
• Step 3: Repeat Step 2 for each remaining vertex k = 2, ..., n, producing A2, ..., An; the final matrix An holds the shortest distances between all pairs.
Floyd-Warshall's Algorithm
n = number of vertices
A0 = initial n*n distance matrix
for k = 1 to n
    for i = 1 to n
        for j = 1 to n
            A^k[i, j] = min( A^(k-1)[i, j], A^(k-1)[i, k] + A^(k-1)[k, j] )
return A^n

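A direct Python translation of this recurrence, using the standard space optimization of updating a single matrix in place (pass k only reads row k and column k, which that pass does not change):

INF = float('inf')

def floyd_warshall(dist):
    # dist[i][j]: edge weight from i to j; INF if no edge; 0 on the diagonal
    n = len(dist)
    for k in range(n):                     # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist                            # now the all-pairs shortest distances

graph = [                                  # a small illustrative example
    [0,   3,   INF, 7],
    [8,   0,   2,   INF],
    [5,   INF, 0,   1],
    [2,   INF, INF, 0],
]
print(floyd_warshall(graph))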


Optimal binary search trees
A binary tree is a tree data structure in which each node has at most two children, referred to as the left child and the right child.
The properties that separate a binary search tree from a regular binary tree are (a short sketch in code follows below):
• The left subtree of a node contains only nodes with keys less than the node's key.
• The right subtree of a node contains only nodes with keys greater than the node's key.
• The left and right subtrees must each also be a binary search tree.
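A minimal sketch of these properties in code (the Node class and search function are illustrative, not from the slides):

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None    # subtree holding only keys less than self.key
        self.right = None   # subtree holding only keys greater than self.key

def search(root, key):
    # Each comparison discards one whole subtree, so the cost of a search
    # is the depth of the key -- which is why tree shape matters.
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root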
Cont…
• What is an optimal binary search tree? An optimal binary search tree is a tree of optimal (minimum) search cost.
• The frequency and key value determine the overall cost of searching a node. The cost of searching is a very important factor in various applications, and the overall cost of searching a node should be low. The time required to search a node in an arbitrary BST is higher than in a balanced binary search tree, since a balanced tree contains fewer levels; the search time is bounded by the height of the tree, which is log n at minimum. One way to reduce the cost of a binary search tree further, by taking search frequencies into account, is to build an optimal binary search tree.
Cont…
Given a sequence K = k1 < k2 < ··· < kn of n sorted keys, with a search probability pi for each key ki, the aim is to build a binary search tree with minimum expected cost.
One way is the brute force method: explore all possible trees and find the expected cost of each. But this method is not practical, since the number of possible trees is the Catalan number:

C(n) = (2n choose n) * 1/(n+1) for n > 0, C(0) = 1.

E.g., if the number of nodes/keys is 3: C(3) = (6 choose 3) * 1/4 = 20/4 = 5.

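A quick check of this formula, using Python's math.comb for the binomial coefficient:

from math import comb

def catalan(n):
    # number of structurally distinct binary search trees on n keys
    return comb(2 * n, n) // (n + 1)

print([catalan(n) for n in range(6)])   # [1, 1, 2, 5, 14, 42]

Even for modest n, the number of candidate trees grows far too quickly for brute force, which motivates the dynamic programming approach below.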
Cont…
• So far, we have considered height-balanced binary search trees. To find the optimal binary search tree, we must also take into account the frequency of searching each key. For example, let's assume that the frequencies associated with the keys 10, 20, 30 are 3, 2, 5 respectively: the tree that minimizes the expected search cost then depends on these frequencies, not just on height.
Dynamic Approach
Consider the table of keys and frequencies below: keys 10, 20, 30, 40 with frequencies 4, 2, 6, 3 respectively. First, construct the cost table c[i, j], which is then filled in step by step.


Cont…
• Next, calculate the values where j-i equals zero, one, two, three and four, i.e., when 0, 1, 2, 3 and 4 keys are considered.
First, zero keys considered (j-i = 0):
• When i=0, j=0, then j-i = 0
• When i=1, j=1, then j-i = 0
• When i=2, j=2, then j-i = 0
• When i=3, j=3, then j-i = 0
• When i=4, j=4, then j-i = 0
• Therefore, c[0,0] = 0, c[1,1] = 0, c[2,2] = 0, c[3,3] = 0, c[4,4] = 0
Second, when j-i=1, one key is considered:
• When j=1, i=0 then j-i = 1
• When j=2, i=1 then j-i = 1
• When j=3, i=2 then j-i = 1
• When j=4, i=3 then j-i = 1
• Now to calculate the cost, we consider only the jth value.
• The cost of c[0,1] is 4 (the key is 10, and the cost corresponding to key 10 is 4).
• The cost of c[1,2] is 2 (the key is 20, and the cost corresponding to key 20 is 2).
• The cost of c[2,3] is 6 (the key is 30, and the cost corresponding to key 30 is 6).
• The cost of c[3,4] is 3 (the key is 40, and the cost corresponding to key 40 is 3).
Then fill in the table with these values.
Cont…
Third, when j-i=2, two keys are considered:
• When j=2, i=0 then j-i = 2
• When j=3, i=1 then j-i = 2
• When j=4, i=2 then j-i = 2
When i=0 and j=2, we consider keys 10 and 20:
• In the first binary tree (10 as the root), the cost would be: 1*4 + 2*2 = 8
• In the second binary tree (20 as the root), the cost would be: 1*2 + 2*4 = 10
• The minimum cost is 8; therefore, c[0,2] = 8
Cont…
When i=1 and j=3, we consider keys 20 and 30:
• In the first binary tree (20 as the root), the cost would be: 1*2 + 2*6 = 14
• In the second binary tree (30 as the root), the cost would be: 1*6 + 2*2 = 10
• The minimum cost is 10; therefore, c[1,3] = 10
When i=2 and j=4, we consider the keys at positions 3 and 4, i.e., 30 and 40:
• In the first binary tree (30 as the root), the cost would be: 1*6 + 2*3 = 12
• In the second binary tree (40 as the root), the cost would be: 1*3 + 2*6 = 15
• The minimum cost is 12; therefore, c[2,4] = 12
After completing the third step, fill in the table with these values.
Fourth, when j-i=3, three keys are considered:
• When j=3, i=0 then j-i = 3
• When j=4, i=1 then j-i = 3
i) When i=0, j=3, we consider three keys, i.e., 10, 20, and 30.
• If 10 is considered as the root node:
 If 20 is the right child of node 10, and 30 is the right child of node 20, the cost would be: 1*4 + 2*2 + 3*6 = 26
 If 30 is the right child of node 10, and 20 is the left child of node 30, the cost would be: 1*4 + 2*6 + 3*2 = 22
• If 20 is considered as the root node, with 30 the right child of node 20 and 10 the left child of node 20, the cost would be: 1*2 + 2*4 + 2*6 = 22
Cont…
• If 30 is considered as the root node:
 If 20 is the left child of node 30, and 10 is the left child of node 20, the cost would be: 1*6 + 2*2 + 3*4 = 22
 If 10 is the left child of node 30, and 20 is the right child of node 10, the cost would be: 1*6 + 2*4 + 3*2 = 20
Therefore, the minimum cost is 20, obtained when the third key, 30, is the root. So, c[0,3] is equal to 20.
Cont…
ii) When i=1 and j=4, we consider the keys 20, 30, 40.
• c[1,4] = min{ c[1,1] + c[2,4], c[1,2] + c[3,4], c[1,3] + c[4,4] } + 11
        = min{ 0+12, 2+3, 10+0 } + 11
        = min{ 12, 5, 10 } + 11
  The minimum value is 5; therefore, c[1,4] = 5 + 11 = 16
• The general formula for calculating the minimum cost is:
  C[i,j] = min over i < k <= j of { c[i,k-1] + c[k,j] } + w(i,j),
  where w(i,j) is the sum of the frequencies of keys i+1 through j (here w(1,4) = 2 + 6 + 3 = 11).
Fifth, when j-i=4, we consider all four keys, i.e., 10, 20, 30 and 40.
When j=4 and i=0, then j-i = 4
• w[0,4] = 4 + 2 + 6 + 3 = 15
• If we consider 10 as the root node, then
  C[0,4] = {c[0,0] + c[1,4]} + w[0,4]
         = {0 + 16} + 15 = 31
• If we consider 20 as the root node, then
  C[0,4] = {c[0,1] + c[2,4]} + w[0,4]
         = {4 + 12} + 15
         = 16 + 15 = 31
Cont…
• If we consider 30 as the root node, then
  C[0,4] = {c[0,2] + c[3,4]} + w[0,4]
         = {8 + 3} + 15
         = 26
• If we consider 40 as the root node, then
  C[0,4] = {c[0,3] + c[4,4]} + w[0,4]
         = {20 + 0} + 15
         = 35
-> The minimum cost is 26; therefore, c[0,4] is equal to 26, and the optimal tree has 30 as its root. A sketch that carries out this whole computation follows below.
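The computation above can be carried out by a short program. Below is a minimal Python sketch of the recurrence c[i,j] = min{ c[i,k-1] + c[k,j] } + w(i,j) for successful searches only; the function name is an illustrative choice.

def obst_cost(freq):
    # freq[k] = search frequency of the (k+1)-th smallest key
    n = len(freq)
    prefix = [0] * (n + 1)                 # prefix sums make w(i, j) an O(1) lookup
    for i in range(n):
        prefix[i + 1] = prefix[i] + freq[i]
    c = [[0] * (n + 1) for _ in range(n + 1)]   # c[i][j]: best cost over keys i+1..j
    for d in range(1, n + 1):              # d = j - i, the number of keys considered
        for i in range(n - d + 1):
            j = i + d
            w = prefix[j] - prefix[i]
            # try every key k in i+1..j as the root of the subtree
            c[i][j] = min(c[i][k - 1] + c[k][j] for k in range(i + 1, j + 1)) + w
    return c[0][n]

print(obst_cost([4, 2, 6, 3]))   # 26, matching c[0,4] computed above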
Finally, the completed cost table (using successful searches only) yields the optimal binary search tree, with 30 as its root.
The 0/1 Knapsack Problem
Given: A set S of n items, with each item i having
• wi - a positive weight value
• bi - a positive benefit/profit value
Goal: Choose items with maximum total benefit but with weight at most W.
• If we are not allowed to take fractional amounts, then this is the 0/1 knapsack problem.
• In this case, we let T denote the set of items we take.
• Objective: maximize Σi∈T bi
• Constraint: Σi∈T wi ≤ W
A sketch of the standard dynamic programming solution follows below.
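A minimal Python sketch of the standard dynamic programming solution, using the one-dimensional table formulation and assuming integer weights:

def knapsack_01(weights, benefits, W):
    # B[j] = best total benefit achievable with capacity j using items seen so far
    B = [0] * (W + 1)
    for w, b in zip(weights, benefits):
        # scan capacities downward so each item is taken at most once (the 0/1 rule)
        for j in range(W, w - 1, -1):
            B[j] = max(B[j], B[j - w] + b)
    return B[W]

# Hypothetical example: four items, capacity 10
print(knapsack_01([2, 3, 4, 5], [3, 4, 5, 6], 10))   # 13 (items of weight 2, 3 and 5)

The table has W+1 entries and each of the n items is scanned once, so the running time is O(nW).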
Reliability design
In reliability design, we design a system composed of several devices (stages) connected in series, where each device i has a reliability ri, the probability that it functions properly. The reliability of the whole series system is the product of the stage reliabilities, which can be quite low even when the individual devices are fairly reliable. To improve it, multiple copies of a device can be placed in parallel at each stage: if stage i contains mi copies, the stage reliability becomes 1 - (1 - ri)^mi. The goal is to choose the multiplicities mi that maximize overall reliability subject to a cost constraint.
Cont…
• Thus if ri = 0.80 and mi = 2, the stage reliability becomes 0.96:
  Reliability = 1 - (1 - 0.80)^2
              = 1 - (0.2)^2
              = 1 - 0.04
              = 0.96
The reliability of this stage increased with duplication, as the short sketch below also shows.

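A small sketch of this computation, extended to a whole series system (the stage values below are hypothetical):

from math import prod

def stage_reliability(r, m):
    # m parallel copies of a device with reliability r: the stage fails
    # only if all m copies fail, hence 1 - (1 - r)^m
    return 1 - (1 - r) ** m

print(stage_reliability(0.80, 2))   # 0.96 (the stage reliability computed above)

stages = [(0.80, 2), (0.95, 1), (0.90, 2)]   # hypothetical (ri, mi) pairs
print(prod(stage_reliability(r, m) for r, m in stages))   # overall system reliability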