Dynamic Programming
BASICS OF DYNAMIC PROGRAMMING
Useful for solving multistage optimization problems.
An optimization problem deals with the maximization or minimization
of the objective function as per problem requirements.
In multistage optimization problems, decisions are made at successive
stages to obtain a global solution for the given problem.
Dynamic programming divides a problem into subproblems and
establishes a recursive relationship between the original problem and
its subproblems.
BASICS OF DYNAMIC PROGRAMMING
A subproblem representing a small part of the original problem is
solved to obtain the optimal solution.
Then the scope of this subproblem is enlarged to find the optimal
solution for a new subproblem.
A greedy method, by contrast, commits to the locally cheapest edge leaving the source; after that, the path length increases, resulting in a total path length of 1002.
However, dynamic programming would not make such a mistake, as it
treats the problem in stages.
For this problem, there are three stages with vertices {s}, {1, 2}, and {t}.
Dynamic programming first finds the shortest path from the stage-2 vertices {1, 2} to {t}, and then works back to compute the final path. Hence, dynamic programming results in the path from s to 1 and from 1 to t.
Dynamic programming therefore yields a globally optimal result, whereas the greedy approach may produce solutions that are only locally optimal.
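The comparison can be illustrated with a small sketch. The individual edge weights below are assumed for illustration; only the greedy total of 1002 and the optimal route s → 1 → t come from the example.

```python
# Hypothetical staged graph matching the example: stages {s}, {1, 2}, {t}.
# Edge weights are assumed; only the totals (1002 vs. the s -> 1 -> t route)
# follow the text.
edges = {
    ("s", "1"): 2, ("s", "2"): 1,      # stage-1 choices (weights assumed)
    ("1", "t"): 3, ("2", "t"): 1001,   # stage-2 choices (weights assumed)
}

def greedy_path():
    """Pick the locally cheapest edge at the first stage, then continue."""
    mid = min(("1", "2"), key=lambda v: edges[("s", v)])  # picks vertex 2
    return edges[("s", mid)] + edges[(mid, "t")]

def dp_path():
    """Stage-wise DP: first the cost-to-t from {1, 2}, then the best start."""
    cost_to_t = {v: edges[(v, "t")] for v in ("1", "2")}
    return min(edges[("s", v)] + cost_to_t[v] for v in ("1", "2"))
```

Here the greedy choice is trapped by the cheap first edge (total 1002), while the staged computation finds the shorter route through vertex 1.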
Dynamic programming approach vs greedy approach
The principle of optimality states that an optimal policy (a sequence of decisions) has the property that, whatever the initial state and decision are, the remaining decisions must constitute an optimal policy with regard to the state resulting from the first decision.
The best way to solve this problem is to use the dynamic programming
approach, in which the results of the intermediate problems are stored.
Consequently, results of the previously computed subproblems can be used
instead of recomputing the subproblems repeatedly. Thus, a subproblem is
computed only once, and the exponential algorithm is reduced to a
polynomial algorithm.
A table is created to store the intermediate results, and its values are reused instead of being recomputed.
Bottom-up approach
An iterative loop can be used to compute Fibonacci numbers efficiently; the resulting bottom-up dynamic programming algorithm computes the nth Fibonacci number in linear time.
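As a minimal sketch, the bottom-up computation can be written as:

```python
def fib(n: int) -> int:
    """Bottom-up computation of the nth Fibonacci number in O(n) time.

    The naive recursion recomputes subproblems exponentially often; here
    each value is computed once and reused, keeping only two running values.
    """
    if n < 2:
        return n
    prev, curr = 0, 1                    # F(0), F(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr   # slide the window forward
    return curr
```

Because each iteration reuses the two previously computed subproblems, the exponential recursion collapses to a single linear pass.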
Step 3: Compute a function g(i, S) that gives the length of the shortest path starting from vertex i, traversing through all the vertices in set S exactly once, and terminating at the start vertex 1, as follows:
3a: Compute g(i, S) = min over j in S of { d[i, j] + g(j, S − {j}) } recursively, with the base case g(i, ∅) = d[i, 1], and store the value.
Step 4: Compute the minimum cost of the travelling salesperson tour as g(1, V − {1}) = min over k ≠ 1 of { d[1, k] + g(k, V − {1, k}) }, using the values computed in Step 3.
Step 5: Return the value of Step 4.
Step 6: End.
Travelling salesman problem
Solve the TSP for the graph shown in Fig. 13.17 using dynamic programming.
cost(2, ∅) = d[2, 1] = 2; cost(3, ∅) = d[3, 1] = 4; cost(4, ∅) = d[4, 1] = 6
These are the distances from vertices 2, 3, and 4 back to the start vertex 1. When |S| = 1, the cost(i, S) function can be solved using the recursive function as follows:
cost(2, {3}) = d[2, 3] + cost(3, ∅) = 5 + 4 = 9; cost(2, {4}) = d[2, 4] + cost(4, ∅) = 7 + 6 = 13
cost(3, {2}) = d[3, 2] + cost(2, ∅) = 3 + 2 = 5; cost(3, {4}) = d[3, 4] + cost(4, ∅) = 8 + 6 = 14
cost(4, {2}) = d[4, 2] + cost(2, ∅) = 5 + 2 = 7; cost(4, {3}) = d[4, 3] + cost(3, ∅) = 9 + 4 = 13
When |S| = 2, each cost(i, S) is computed similarly:
cost(2,{3,4}) = min{d[2, 3] + cost(3, {4}), d[2, 4] + cost(4, {3})} = min{5 + 14 , 7 + 13} = min{19, 20} = 19
cost(3,{2,4}) = min{d[3, 2] + cost(2, {4}), d[3, 4] + cost(4, {2})} = min{3 + 13, 8 + 7} = min{16, 15} = 15
cost(4,{2,3}) = min{d[4, 2] + cost(2, {3}), d[4, 3] + cost(3, {2})} = min{5 + 9, 9 + 5} = min{14, 14} = 14
Finally, the total cost of the tour, which involves all three intermediate vertices (|S| = 3), cost(1, {2, 3, 4}), is computed as follows:
cost(1, {2, 3, 4}) = min{d[1, 2] + cost(2, {3, 4}), d[1, 3] + cost(3, {2, 4}), d[1, 4] + cost(4, {2, 3})} = min{5 + 19, 3 + 15, 10 + 14} = min{24, 18, 24} = 18
Hence, the minimum cost of the tour is 18, attained at k = 3; therefore, P(1, {2, 3, 4}) = 3. Tracing back, the minimum of cost(3, {2, 4}) occurs at vertex 4, so P(3, {2, 4}) = 4, and then P(4, {2}) = 2. Thus, the optimal tour is 1 → 3 → 4 → 2 → 1, with cost 3 + 8 + 5 + 2 = 18.
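The dynamic programming recurrence above can be sketched in Python. The distance matrix is transcribed from the worked example (vertices renumbered 0 to 3; diagonal entries assumed 0, as they are never used):

```python
from itertools import combinations

# Distance matrix d[i][j] from the worked example, 0-indexed:
# vertex 1 -> index 0, ..., vertex 4 -> index 3.
d = [
    [0, 5, 3, 10],
    [2, 0, 5, 7],
    [4, 3, 0, 8],
    [6, 5, 9, 0],
]

def tsp(d):
    """Held-Karp DP: cost[(i, S)] = shortest path from i through all of S back to 0."""
    n = len(d)
    # Base case: from i straight back to the start vertex.
    cost = {(i, frozenset()): d[i][0] for i in range(1, n)}
    for size in range(1, n - 1):               # grow |S| one vertex at a time
        for S in combinations(range(1, n), size):
            S = frozenset(S)
            for i in range(1, n):
                if i in S:
                    continue
                cost[(i, S)] = min(d[i][j] + cost[(j, S - {j})] for j in S)
    full = frozenset(range(1, n))
    return min(d[0][k] + cost[(k, full - {k})] for k in full)
```

For this instance the minimum tour cost is 18, matching the hand computation.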
Chain Matrix Multiplication
Step 1: Read a chain of n matrices A1, A2, ..., An, where matrix Ai has dimensions p(i−1) × p(i).
Step 2: Let i and j represent the start and end matrices of a subchain, and let M[i, j] represent the minimum number of scalar multiplications required to compute the product Ai Ai+1 ... Aj.
Step 3: Compute M recursively:
3a: If (i = j), then M[i, j] = 0
else
if (i < j), then
split the product as (Ai ... Ak)(Ak+1 ... Aj), where i ≤ k < j, and
compute M[i, j] = min over i ≤ k < j of { M[i, k] + M[k + 1, j] + p(i−1) p(k) p(j) }
Step 4: Return M[1, n] as the minimum cost of chain matrix multiplication.
Chain Matrix Multiplication
The extraction of the optimal order is explained as follows:
Step 1: Read the trace matrix R, in which R[i, j] stores the split point k yielding the minimum cost.
Step 2: Perform the recursive call, here written PrintOrder(i, j), as follows:
If (i = j), then
return the matrix Ai
else
k = R[i, j]
return (PrintOrder(i, k) PrintOrder(k + 1, j))
Step 3: End.
Chain Matrix Multiplication
Consider the following four matrices whose orders are given and
perform chain matrix multiplication using the dynamic programming
approach:
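The tabulation and the trace-matrix extraction can be sketched together as follows. The four matrix orders below are assumed for illustration, since the exercise's actual orders are not reproduced here:

```python
# Hypothetical dimension vector: A1 is 5x4, A2 is 4x6, A3 is 6x2, A4 is 2x7.
p = [5, 4, 6, 2, 7]  # assumed orders; Ai has dimensions p[i-1] x p[i]

def matrix_chain(p):
    """M[i][j] = minimum scalar multiplications for Ai...Aj (1-indexed);
    R[i][j] = the split point k achieving it."""
    n = len(p) - 1
    M = [[0] * (n + 1) for _ in range(n + 1)]
    R = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):            # subchain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            M[i][j], R[i][j] = min(
                (M[i][k] + M[k + 1][j] + p[i - 1] * p[k] * p[j], k)
                for k in range(i, j)
            )
    return M, R

def print_order(R, i, j):
    """Recover the optimal parenthesization from the trace matrix R."""
    if i == j:
        return f"A{i}"
    k = R[i][j]
    return f"({print_order(R, i, k)}{print_order(R, k + 1, j)})"

M, R = matrix_chain(p)
```

Here `M[1][4]` holds the minimum multiplication count and `print_order(R, 1, 4)` prints the optimal order for the assumed dimensions.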
Knapsack Problem
Step 1: Let n be the number of objects.
Step 2: Let W be the capacity of the knapsack.
Step 3: Construct the matrix V[i, j] that stores the best achievable profits. Index i tracks the items considered (i.e., 1 to n), and index j tracks the capacity used (i.e., 0 to W).
Step 4: Initialize V[0, j] = 0 and V[i, 0] = 0, as zero items or zero capacity contributes no profit to the knapsack.
Step 5: Recursively compute the following steps:
5a: Leave object i if its weight wi > j or if V[i − 1, j] ≥ V[i − 1, j − wi] + pi. This leaves the knapsack with profit V[i, j] = V[i − 1, j].
5b: Add item i if wi ≤ j and V[i − 1, j − wi] + pi > V[i − 1, j]. In this case, the addition of the item results in the profit V[i, j] = max{V[i − 1, j], V[i − 1, j − wi] + pi}. Here pi is the profit of the current item.
Step 6: Return the maximum profit of adding feasible items to the knapsack, V[n, W].
Step 7: End.
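Steps 1 to 7 can be sketched as follows. The item weights and profits below are assumed for illustration, since the table of the instance is not reproduced here; only the capacity W = 3 follows the text:

```python
# Hypothetical instance: weights and profits are assumed for illustration.
weights = [1, 2, 3]   # assumed item weights
profits = [2, 3, 4]   # assumed item profits
W = 3                 # knapsack capacity, as in the example

def knapsack(weights, profits, W):
    """V[i][j] = best profit using the first i items within capacity j."""
    n = len(weights)
    V = [[0] * (W + 1) for _ in range(n + 1)]   # row 0 and column 0 stay 0
    for i in range(1, n + 1):
        for j in range(W + 1):
            V[i][j] = V[i - 1][j]               # 5a: leave item i
            if weights[i - 1] <= j:             # 5b: item fits, try adding it
                V[i][j] = max(V[i][j],
                              V[i - 1][j - weights[i - 1]] + profits[i - 1])
    return V[n][W]
```

For these assumed values, the best choice packs the first two items (total weight 3, profit 5).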
Knapsack Problem
Apply the dynamic programming algorithm to the instance of the
knapsack problem shown in Table 13.28. Assume that the knapsack
capacity is W = 3.
Optimal Binary Search Tree
Consider four keys with access probabilities p1 = 2/7, p2 = 1/7, p3 = 3/7, and p4 = 1/7. Let C[i, j] be the minimum expected search cost of a binary search tree over keys i to j, computed as C[i, j] = min over i ≤ k ≤ j of {C[i, k − 1] + C[k + 1, j]} + (pi + ... + pj), with C[i, i − 1] = 0.
Computation of the first row (single keys):
C[1, 1] = p1 = 2/7; C[2, 2] = p2 = 1/7; C[3, 3] = p3 = 3/7; C[4, 4] = p4 = 1/7
C[1, 2] = min{0 + 1/7 + (2/7 + 1/7), 2/7 + 0 + (2/7 + 1/7)} = min{4/7, 5/7} = 4/7
The minimum is when k = 1.
C[2, 3] = min{0 + 3/7 + (1/7 + 3/7), 1/7 + 0 + (1/7 + 3/7)} = min{7/7, 5/7} = 5/7
The minimum is when k = 3.
C[3, 4] = min{0 + 1/7 + (3/7 + 1/7), 3/7 + 0 + (3/7 + 1/7)} = min{5/7, 7/7} = 5/7
The minimum is when k = 3.
C[1, 3] can be computed as follows (i = 1; j = 3; k = 1, 2, 3):
C[1, 3] = min{0 + 5/7 + (2/7 + 1/7 + 3/7), 2/7 + 3/7 + (2/7 + 1/7 + 3/7), 4/7 + 0 + (2/7 + 1/7 + 3/7)}
= min{11/7, 11/7, 10/7} = 10/7
The minimum is when k = 3.
Similarly, C[2, 4] can be computed as follows (i = 2; j = 4; k = 2, 3, 4):
C[2, 4] = min{0 + 5/7 + (1/7 + 3/7 + 1/7), 1/7 + 1/7 + (1/7 + 3/7 + 1/7), 5/7 + 0 + (1/7 + 3/7 + 1/7)}
= min{10/7, 7/7, 10/7} = 7/7
The minimum is when k = 3.
Finally, the last entry C[1, 4] should be computed with i = 1; j = 4; k = 1, 2, 3, 4:
C[1, 4] = min{0 + 7/7 + (7/7), 2/7 + 5/7 + (7/7), 4/7 + 1/7 + (7/7), 10/7 + 0 + (7/7)}
= min{14/7, 14/7, 12/7, 17/7} = 12/7
The minimum is when k = 3.
It can be observed that the minimum expected cost is 12/7. As a check, the optimal tree has key 3 at the root, keys 1 and 4 at depth 2, and key 2 at depth 3, giving
1 × p3 + 2 × (p1 + p4) + 3 × p2
= 3/7 + 6/7 + 3/7 = 12/7
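The hand computation can be checked with a short sketch of the C[i, j] recurrence, using exact fractions and the probabilities p1 = 2/7, p2 = 1/7, p3 = 3/7, p4 = 1/7 recovered from the example:

```python
from fractions import Fraction

# Access probabilities from the example (index 0 unused; keys are 1-indexed).
p = [None, Fraction(2, 7), Fraction(1, 7), Fraction(3, 7), Fraction(1, 7)]

def obst_cost(p):
    """C[i][j] = minimum expected search cost of a BST over keys i..j."""
    n = len(p) - 1
    # Extra row/column so that C[i][i-1] = 0 and C[j+1][j] = 0 hold implicitly.
    C = [[Fraction(0)] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 1):
        C[i][i] = p[i]                              # single-key trees
    for length in range(2, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            weight = sum(p[i:j + 1], Fraction(0))   # every key gains one level
            C[i][j] = min(C[i][k - 1] + C[k + 1][j]
                          for k in range(i, j + 1)) + weight
    return C[1][n]
```

Running the sketch reproduces the minimum expected cost of 12/7 obtained by hand.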
FLOW-SHOP SCHEDULING PROBLEM
Single-machine Sequencing Problem
Finishing time or completion time: The finishing time fi(s) of job i is the time by which all the tasks of job i are completed in schedule s. The finishing time F(s) of the schedule s is defined as the maximum of the jobs' finishing times.
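These definitions can be made concrete with a minimal sketch for a single machine, assuming jobs are processed back to back in the given order (the sequencing itself is not addressed here):

```python
def finish_times(processing_times):
    """f_i(s): completion time of each job when jobs run back to back
    on one machine, in the given order (an assumed, simplest schedule)."""
    f, elapsed = [], 0
    for t in processing_times:
        elapsed += t          # the machine is never idle between jobs
        f.append(elapsed)
    return f

def schedule_finish_time(processing_times):
    """F(s): the finish time of the schedule = max over job finishing times."""
    return max(finish_times(processing_times))
```

For processing times [3, 1, 2] the job finishing times are [3, 4, 6], so F(s) = 6; on a single machine F(s) always equals the last job's completion time.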