Algorithms Design & Analysis - Unit-3
Madan Mohan Malaviya University of Technology, Gorakhpur
UNIT – III
Backtracking, Branch and Bound with examples such as Travelling Salesman
Problem, Graph Coloring, n-Queen Problem, Hamiltonian Cycles and Sum of
subsets, Amortized Analysis. Advanced Data Structure: Red Black Trees,
Augmenting Data Structure, B-Tree, Binomial Heap, Fibonacci Heap, and Data
Structure for Disjoint Sets, Priority Queues, Mergeable Heaps, Concatenable
Queues.
BACKTRACKING
• Backtracking is a technique used to solve problems with a large search
space by systematically trying and eliminating possibilities.
• The principal idea of backtracking is to construct a solution one component
at a time and then evaluate each such partially constructed solution.
• It finds all (or some) solutions to certain computational problems, notably
constraint satisfaction problems.
• It incrementally builds candidates for the solution and abandons a candidate
("backtracks") as soon as it determines that the candidate cannot lead to a
valid solution.
• It can be applied only to problems which admit the notion of a "partial
candidate solution" and a relatively quick test of whether a partial candidate
can be completed to a valid solution. (A generic sketch follows this list.)
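As a minimal illustration of the idea above, the following Python sketch builds a solution one component at a time and backtracks when a partial candidate is no longer promising. The helper names (is_promising, is_complete, record_solution, choices) are illustrative placeholders, not from the slides.

# Generic backtracking skeleton (illustrative sketch).
def backtrack(partial, choices, is_promising, is_complete, record_solution):
    if is_complete(partial):
        record_solution(list(partial))
        return
    for c in choices(partial):
        partial.append(c)              # extend the partial candidate
        if is_promising(partial):      # quick test: can this still lead to a solution?
            backtrack(partial, choices, is_promising, is_complete, record_solution)
        partial.pop()                  # undo the choice ("backtrack")

# Example use: enumerate all binary strings of length 3.
backtrack([], lambda p: [0, 1], lambda p: True, lambda p: len(p) == 3,
          lambda sol: print(sol))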
BRANCH AND BOUND
• Branch and Bound is a state-space search method in which all the
children of a node are generated before any of them is expanded.
• It starts by considering the root node and applying a lower-bounding
and an upper-bounding procedure to it.
• If the bounds match, then an optimal solution has been found and the
algorithm is finished.
• If they do not match, the algorithm is run on the child nodes.
(A best-first skeleton is sketched below.)
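As a hedged illustration of these steps, here is a generic best-first branch-and-bound skeleton in Python. The functions lower_bound, upper_bound, and children are problem-specific placeholders assumed for the sketch, not from the slides.

import heapq

# Best-first branch and bound skeleton (illustrative sketch, minimization).
def branch_and_bound(root, lower_bound, upper_bound, children):
    best_cost, best_node = upper_bound(root), None
    frontier = [(lower_bound(root), 0, root)]      # min-heap ordered by lower bound
    counter = 1                                    # tie-breaker for the heap
    while frontier:
        lb, _, node = heapq.heappop(frontier)
        if lb >= best_cost:                        # bound: cannot beat the best solution
            continue
        for child in children(node):
            clb, cub = lower_bound(child), upper_bound(child)
            if cub < best_cost:                    # better complete solution found
                best_cost, best_node = cub, child
            if clb < best_cost:                    # branch: still promising, keep exploring
                heapq.heappush(frontier, (clb, counter, child))
                counter += 1
    return best_cost, best_node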
GRAPH COLORING
• Graph coloring is the assignment of a color to each vertex of a graph G
such that no two adjacent vertices get the same color. The smallest
number of colors required to color a graph G is called the chromatic
number of that graph.
• Method to Color a Graph
The steps required to color a graph G with n vertices are as follows
(a greedy-coloring sketch is given after the steps):
• Step 1 − Arrange the vertices of the graph in some order.
• Step 2 − Choose the first vertex and color it with the first color.
• Step 3 − Choose the next vertex and color it with the lowest-numbered
color that has not been used on any vertex adjacent to it. If all colors used
so far appear on adjacent vertices, assign a new color to it. Repeat
this step until all the vertices are colored.
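A minimal Python sketch of the greedy method just described; the adjacency-list representation is an assumption made for the example.

# Greedy graph coloring following the steps above (illustrative sketch).
# `graph` is an adjacency list: {vertex: set of adjacent vertices}.
def greedy_coloring(graph, order=None):
    order = order or list(graph)            # Step 1: some vertex order
    color = {}
    for v in order:                         # Steps 2-3: color vertices one by one
        used = {color[u] for u in graph[v] if u in color}
        c = 0
        while c in used:                    # lowest-numbered color unused by neighbors
            c += 1
        color[v] = c
    return color

# Example: a 4-cycle needs only 2 colors.
print(greedy_coloring({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}))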
Map Coloring
• Let G be a graph and m be a given positive integer. We want to
discover whether the nodes of G can be colored in such a way that no
two adjacent nodes have the same color while only m colors are used.
This technique is broadly used in map coloring; four-coloring a map is
the main objective.
• Consider the following map, which can easily be decomposed into the
planar graph shown beside it. (A backtracking m-coloring sketch is
given below.)
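For the decision version (can G be colored with at most m colors?), a backtracking sketch in Python, assuming the same adjacency-list representation as above:

# Backtracking m-coloring (illustrative sketch).
def m_coloring(graph, m):
    vertices = list(graph)
    color = {}

    def safe(v, c):
        return all(color.get(u) != c for u in graph[v])

    def place(i):
        if i == len(vertices):                 # all vertices colored
            return True
        v = vertices[i]
        for c in range(m):
            if safe(v, c):
                color[v] = c                   # try color c on vertex v
                if place(i + 1):
                    return True
                del color[v]                   # backtrack
        return False

    return place(0)

# Example: a triangle needs 3 colors, so m = 2 fails and m = 3 succeeds.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(m_coloring(triangle, 2), m_coloring(triangle, 3))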
N-QUEEN PROBLEM
4-QUEEN PROBLEM
•Case 1: To explain the n-queen problem we consider n = 4, using a
4-by-4 chessboard on which 4 queens have to be placed in such a way
that no two queens can attack each other.
(Figure: an empty 4-by-4 board with rows and columns numbered 1 to 4.)
(Figure: state-space tree generated while placing the 4 queens row by row;
× marks attacked squares, and the numbered nodes 1-8 show the order in which
the boards are generated, with dead ends forcing the algorithm to backtrack.)
4-QUEEN PROBLEM
•Using the above mechanism we obtain the two solutions shown in the two
consecutive figures: queens at squares (1,2), (2,4), (3,1), (4,3), and the
mirror-image placement (1,3), (2,1), (3,4), (4,2), with one queen (Queen-1
to Queen-4) in each row. (An n-queens backtracking sketch follows.)
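The same construction for general n, as a compact Python sketch; the row-by-row column list is an implementation choice made for the example, not from the slides.

# n-queens by backtracking (illustrative sketch).
# cols[r] is the column of the queen in row r; rows are filled one at a time.
def solve_n_queens(n):
    solutions, cols = [], []

    def safe(r, c):
        return all(c != pc and abs(c - pc) != r - pr      # same column / diagonal
                   for pr, pc in enumerate(cols))

    def place(r):
        if r == n:
            solutions.append(tuple(cols))
            return
        for c in range(n):
            if safe(r, c):
                cols.append(c)          # place a queen in row r, column c
                place(r + 1)
                cols.pop()              # backtrack
    place(0)
    return solutions

print(solve_n_queens(4))    # two solutions, 0-indexed: (1, 3, 0, 2) and (2, 0, 3, 1)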
HAMILTONIAN CYCLES
Figure: (a) Graph. (b) State-space tree for finding a Hamiltonian circuit. The
numbers above the nodes of the tree indicate the order in which the nodes are
generated. (A backtracking sketch follows.)
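Finding such a circuit by backtracking can be sketched as follows in Python; the adjacency-list input is an assumption for the example, and this is an illustrative sketch rather than the figure's exact procedure.

# Hamiltonian-cycle search by backtracking (illustrative sketch).
# The path is extended one vertex at a time and abandoned as soon as it
# cannot be completed into a cycle.
def hamiltonian_cycle(graph):
    vertices = list(graph)
    n, path = len(vertices), [vertices[0]]

    def extend():
        if len(path) == n:
            return path[0] in graph[path[-1]]       # must return to the start
        for v in graph[path[-1]]:
            if v not in path:
                path.append(v)
                if extend():
                    return True
                path.pop()                          # backtrack
        return False

    return path if extend() else None

# Example: a 4-cycle 0-1-2-3-0.
print(hamiltonian_cycle({0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}))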
Subset-sum Problem
(Figure: state-space tree of the backtracking algorithm for the subset-sum
instance S = {3, 5, 6, 7} with target 15. Each level decides whether the next
element is taken ("with") or not ("w/o"); branches whose sums can no longer
reach 15 are pruned, and the path taking 3, 5 and 7 reaches the solution
3 + 5 + 7 = 15. A code sketch follows.)
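A Python sketch matching the tree above; the pruning test (current sum plus what remains cannot reach the target, or the sum already exceeds it) is the standard bounding condition.

# Subset-sum by backtracking (illustrative sketch).
def subset_sum(items, target):
    solutions = []

    def explore(i, chosen, total, remaining):
        if total == target:
            solutions.append(list(chosen))
            return
        if i == len(items) or total + remaining < target or total > target:
            return                                   # prune: target unreachable
        x = items[i]
        chosen.append(x)                             # "with" the i-th element
        explore(i + 1, chosen, total + x, remaining - x)
        chosen.pop()                                 # "without" the i-th element
        explore(i + 1, chosen, total, remaining - x)

    explore(0, [], 0, sum(items))
    return solutions

print(subset_sum([3, 5, 6, 7], 15))    # [[3, 5, 7]]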
AMORTIZED ANALYSIS
Amortized Analysis
Example: Stack operations:
Push(S, x): pushes object x onto stack S
Pop(S): pops the top of stack S and returns the
popped object
Multipop(S, k): removes the k top objects of stack S
The action of Multipop on a stack S is as follows:
MULTIPOP(S, k)
  while not STACK-EMPTY(S) and k ≠ 0
      do POP(S)
         k ← k − 1
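A minimal runnable sketch of this stack in Python, with a step counter added (the counter and class name are illustrative, not part of the original example) so the aggregate bound discussed next can be observed directly.

# Stack with MULTIPOP, counting elementary push/pop steps.
class CountingStack:
    def __init__(self):
        self.items, self.steps = [], 0

    def push(self, x):
        self.items.append(x)
        self.steps += 1                      # one elementary operation

    def pop(self):
        self.steps += 1                      # one elementary operation
        return self.items.pop()

    def multipop(self, k):
        while self.items and k != 0:
            self.pop()
            k -= 1

s = CountingStack()
for i in range(10):
    s.push(i)
s.multipop(7)                                # a single expensive operation...
s.multipop(5)                                # ...but total work stays O(n)
print(s.steps)                               # 20 elementary steps for 12 operations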
Aggregate Analysis
• Using the aggregate method of amortized analysis, we can obtain
a tighter upper bound that considers the entire sequence of n
operations.
• In fact, although a single Multipop operation can be expensive,
any sequence of n Push, Pop, and Multipop operations on an
initially empty stack costs at most O(n).
• This is because each object can be popped at most once for each time it is
pushed. Therefore, the number of times Pop can be called on
a nonempty stack, including calls within Multipop, is at most the
number of Push operations, which is at most n. For any value of n, any
sequence of n Push, Pop, and Multipop operations takes a total of
O(n) time.
• The amortized cost of an operation is the average: O(n)/n = O(1).
DISJOINT SETS
(Figure: two disjoint sets A = {1, 2} and B = {3, 4}, each stored as a small
tree. Performing UNION(A, B) links the root of one tree under the root of the
other, so that afterwards all four elements belong to a single tree. A
union-find sketch follows.)
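A standard disjoint-set (union-find) sketch with union by rank and path compression; this is the usual dictionary-based representation, not necessarily the exact one drawn in the slides.

# Disjoint-set (union-find) sketch.
class DisjointSets:
    def __init__(self, elements):
        self.parent = {x: x for x in elements}
        self.rank = {x: 0 for x in elements}

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])   # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:                # union by rank
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

ds = DisjointSets([1, 2, 3, 4])
ds.union(1, 2)                       # A = {1, 2}
ds.union(3, 4)                       # B = {3, 4}
ds.union(1, 3)                       # A union B
print(ds.find(4) == ds.find(2))      # True: one set now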
QUEUE
Simple Queue
Priority Queue
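The slides present priority queues through figures only; as a minimal hedged illustration (the task names are invented for the example), a binary-heap-based priority queue in Python:

import heapq

# Minimal priority-queue sketch: the item with the smallest priority
# value is always served first.
pq = []
heapq.heappush(pq, (2, "write report"))
heapq.heappush(pq, (1, "fix outage"))
heapq.heappush(pq, (3, "read email"))
while pq:
    priority, task = heapq.heappop(pq)
    print(priority, task)        # served in order 1, 2, 3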
Concatenable Queue
• From 2-3 trees we know how the insert(), delete(), and min()
operations can be executed on a 2-3 tree with n nodes in at
most O(log n) time.
• For concatenable queues, we will show how the concatenate()
and split() instructions can also be executed in O(log n)
time.
• The structure supports insert, delete, find, concatenate, and split
operations, each taking O(log n) time.
14-06-2021 Side 50
Madan Mohan Malaviya Univ. of Technology, Gorakhpur
Concatenable Queue
Concatenate(S1, S2)
• takes two input sequences S1 and S2 such that every
element of S1 is less than every element of S2, and
produces the new concatenated sequence S1S2.
• If S1 is represented as a 2-3 tree T1 and S2 as a 2-3 tree
T2, then we want to combine T1 and T2 into a single 2-3
tree T whose leaves are the leaves of T1 in their
original order followed by the leaves of T2 in their
original order.
Concatenable Queue
SPLIT operation
• Split(a, S) splits the set S into two parts S1 and S2:
• S1 contains the elements of S from the start of S up to and including a;
• S2 contains the elements of S greater than a.
• That is, we split a 2-3 tree T into two 2-3 trees T1 and T2 such that all
leaves in T1 have values less than or equal to a and all leaves in T2 have
values greater than a.
B-TREE
• B-Tree is a self-balancing search tree. In most of the other self-balancing search
trees (like AVL and Red-Black Trees), it is assumed that everything is in main
memory.
• To understand the use of B-Trees, we must think of the huge amount of data
that cannot fit in main memory. When the number of keys is high, the data is
read from disk in the form of blocks. Disk access time is very high compared to
the main memory access time. The main idea of using B-Trees is to reduce the
number of disk accesses.
• Most of the tree operations (search, insert, delete, max, min, etc.) require O(h)
disk accesses, where h is the height of the tree. A B-tree is a "fat" tree: its
height is kept low by putting the maximum possible number of keys in each node.
• Generally, the B-Tree node size is kept equal to the disk block size. Since the
height of the B-tree is low, the total number of disk accesses for most operations
is reduced significantly compared to balanced binary search trees like AVL
trees, Red-Black trees, etc.
B-TREE
• Time Complexity of B-Tree:
1. Search O(log n)
2. Insert O(log n)
3. Delete O(log n)
PROPERTIES OF B-TREE
1. All leaves are at the same level.
2. A B-Tree is defined by the term minimum degree 't'. The value of t
depends upon the disk block size.
3. Every node except the root must contain at least t-1 keys. The root
may contain a minimum of 1 key.
4. All nodes (including the root) may contain at most 2t-1 keys.
5. The number of children of a node is equal to the number of keys in it
plus 1.
6. All keys of a node are sorted in increasing order. The child between
two keys k1 and k2 contains all keys in the range from k1 to k2.
7. A B-Tree grows and shrinks from the root, unlike a Binary Search
Tree. Binary Search Trees grow downward and also shrink from
downward.
8. Like other balanced Binary Search Trees, the time complexity to
search, insert and delete is O(log n).
B-TREE
• Following is an example of a B-Tree of minimum order 5. Note that in practical
B-Trees, the value of the minimum order is much more than 5.
• We can see in the diagram below that all the leaf nodes are at the same
level and all non-leaf nodes have no empty subtree and have one fewer key
than their number of children.
IMPORTANT FACT
• Traversal in B-Tree: Traversal is also similar to
Inorder traversal of Binary Tree. We start from
the leftmost child, recursively print the
leftmost child, then repeat the same process
for remaining children and keys. In the end,
recursively print the rightmost child.
B-TREE
• Search Operation in B-Tree:
Search is similar to the search in Binary Search Tree. Let the key to
be searched be k. We start from the root and recursively traverse
down. For every visited non-leaf node, if the node has the key, we
simply return the node. Otherwise, we recur down to the
appropriate child (The child which is just before the first greater
key) of the node. If we reach a leaf node and don’t find k in the
leaf node, we return NULL.
• Logic:
Searching a B-Tree is similar to searching a binary tree. The
algorithm is similar and proceeds by recursion. At each level, the
search is optimised: if the key value is not present in the range of the
parent, then the key must be present in another branch. As these
values limit the search, they are also known as limiting values or
separation values. If we reach a leaf node and don't find the
desired key, the search returns NULL.
Example: Searching for 120 in the given B-Tree.
(Figures: four slides showing the search path for 120, step by step.)
• In this example, we can see that the search was reduced just by limiting the
places where the key could be present. Similarly, if in the above example we
had to look for 180, the control would stop at step 2 because the program
would find that the key 180 is present in the current node. And if we had to
seek out 90, then since 90 < 100 the search would automatically go to the
left subtree, and the control flow would proceed as in the example above.
(A code sketch of this search follows.)
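A minimal B-tree search sketch in Python. The node layout (sorted keys plus a children list) is an assumption made for the example, and the tiny tree below is hypothetical; it only echoes a few key values mentioned in the discussion above.

# B-Tree search sketch.
# Node layout assumed here: `keys` sorted ascending, children[i] holds keys
# between keys[i-1] and keys[i]; a leaf has no children.
class BTreeNode:
    def __init__(self, keys, children=None):
        self.keys = keys
        self.children = children or []

def btree_search(node, k):
    i = 0
    while i < len(node.keys) and k > node.keys[i]:
        i += 1                                  # find the first key >= k
    if i < len(node.keys) and node.keys[i] == k:
        return node                             # key found in this node
    if not node.children:
        return None                             # reached a leaf without finding k
    return btree_search(node.children[i], k)    # recurse into the proper child

# Hypothetical tiny tree: root [100], left child [35, 65], right child [130, 180].
root = BTreeNode([100], [BTreeNode([35, 65]), BTreeNode([130, 180])])
print(btree_search(root, 180) is not None, btree_search(root, 120) is None)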
Binomial Heap
• The main application of a Binary Heap is to implement a priority
queue. A Binomial Heap is an extension of the Binary Heap that provides
a faster union (merge) operation together with the other operations
provided by a Binary Heap.
• A Binomial Heap is a collection of Binomial Trees.
• What is a Binomial Tree?
A Binomial Tree of order 0 has 1 node. A Binomial Tree of order k
can be constructed by taking two binomial trees of order k-1 and
making one the leftmost child of the other.
A Binomial Tree of order k has the following properties:
a) It has exactly 2^k nodes.
b) It has depth k.
c) There are exactly C(k, i) nodes at depth i, for i = 0, 1, . . . , k.
d) The root has degree k, and the children of the root are themselves
Binomial Trees of order k-1, k-2, ..., 0 from left to right.
Binomial Heap
• Binomial Heap:
A Binomial Heap is a set of Binomial Trees where each
Binomial Tree follows the Min-Heap property, and there can be
at most one Binomial Tree of any degree.
• Binary representation of a number and Binomial Heaps
A Binomial Heap with n nodes has a number of Binomial
Trees equal to the number of set bits in the binary
representation of n. For example, let n be 13; there are 3 set bits
in the binary representation of n (00001101), hence 3
Binomial Trees. We can also relate the degrees of these
Binomial Trees to the positions of the set bits. With this relation,
we can conclude that there are O(log n) Binomial Trees in a
Binomial Heap with n nodes. (A small sketch of this relation follows.)
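The set-bit relation can be checked directly; a small Python sketch:

# The orders of the binomial trees in a heap of n nodes are exactly the
# positions of the set bits of n (illustrative sketch).
def binomial_tree_orders(n):
    return [i for i in range(n.bit_length()) if (n >> i) & 1]

print(binomial_tree_orders(13))   # [0, 2, 3]: trees of 1, 4 and 8 nodes (13 total)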
Operations of Binomial Heap:
• The main operation in a Binomial Heap is union(); all other operations mainly use this
operation. The union() operation combines two Binomial Heaps into one. Let us first
discuss the other operations; we will discuss union later.
• insert(H, k): Inserts a key 'k' into Binomial Heap 'H'. This operation first creates a Binomial
Heap with the single key 'k', then calls union on H and the new Binomial Heap.
• getMin(H): A simple way to implement getMin() is to traverse the list of roots of the Binomial
Trees and return the minimum key. This implementation requires O(log n) time. It can be
optimized to O(1) by maintaining a pointer to the minimum-key root.
• extractMin(H): This operation also uses union(). We first call getMin() to find the minimum-
key Binomial Tree, then we remove that node and create a new Binomial Heap by connecting
all subtrees of the removed minimum node. Finally, we call union() on H and the newly
created Binomial Heap. This operation requires O(log n) time.
• delete(H): Like in a Binary Heap, the delete operation first reduces the key to minus infinity,
then calls extractMin().
• decreaseKey(H): decreaseKey() is also similar to that of a Binary Heap. We compare the
decreased key with its parent, and if the parent's key is larger, we swap the keys and recur
for the parent. We stop when we either reach a node whose parent has a smaller key or we
hit the root node. The time complexity of decreaseKey() is O(log n).
FIBONACCI HEAP
• A Fibonacci heap is a collection of trees satisfying
the minimum-heap property, that is, the key of a
child is always greater than or equal to the key of
the parent. This implies that the minimum key is
always at the root of one of the trees. Compared
with binomial heaps, the structure of a Fibonacci
heap is more flexible. The trees do not have a
prescribed shape and in the extreme case the
heap can have every element in a separate tree.
FIBONACCI HEAP
Facts about Fibonacci Heap
• The reduced time complexity of Decrease-Key is important in
Dijkstra's and Prim's algorithms. With a Binary Heap, the time complexity
of these algorithms is O(V log V + E log V). If a Fibonacci Heap is used,
the time complexity improves to O(V log V + E).
• Although the Fibonacci Heap looks promising time-complexity-wise, it
has been found slow in practice, as its hidden constants are high.
• Fibonacci heaps are mainly called so because Fibonacci numbers
are used in the running-time analysis. Also, every node in a
Fibonacci Heap has degree at most O(log n), and the size of a
subtree rooted at a node of degree k is at least F(k+2), where F(k) is the
k-th Fibonacci number.
Red-black Trees, Rotations,
Insertions
Red-black trees
This data structure requires an extra one-bit
color field in each node.
Red-black properties:
1. Every node is either red or black.
2. The root and leaves (NILs) are black.
3. If a node is red, then its parent is black.
4. All simple paths from any node x to a
descendant leaf have the same number of
black nodes = black-height(x).
Example of a red-black tree
(Figure: a red-black tree of height h = 4 with root 7; 7 has children 3 and 18,
18 has children 10 and 22, 10 has children 8 and 11, 22 has child 26, and all
external leaves are NIL. The figure is repeated with the black-height of each
node annotated, from bh = 2 near the top down to bh = 0 at the NIL leaves,
illustrating property 4: all simple paths from any node x to a descendant leaf
have the same number of black nodes = black-height(x).)
Height of a red-black tree
Theorem. A red-black tree with n keys has height
h ≤ 2 lg(n + 1).
Modifying operations
The operations INSERT and DELETE cause
modifications to the red-black tree:
• the operation itself,
• color changes,
• restructuring the links of the tree:
“rotations”.
Rotations
(Figure: a right rotation transforms a subtree rooted at B with left child A into
a subtree rooted at A with right child B; a left rotation is the inverse. Rotations
change only the link structure and preserve the binary-search-tree ordering.
A code sketch follows.)
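A minimal rotation sketch in Python on a plain BST node; the field names are illustrative, and a real red-black tree would also maintain parent pointers and colors.

# Left/right rotation sketch.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def left_rotate(x):
    y = x.right             # y becomes the new subtree root
    x.right = y.left        # y's left subtree becomes x's right subtree
    y.left = x              # x becomes y's left child
    return y                # caller re-links y where x used to hang

def right_rotate(y):        # inverse of left_rotate
    x = y.left
    y.left = x.right
    x.right = y
    return x

root = Node(10, Node(8), Node(22, Node(11), Node(26)))
root = left_rotate(root)    # 22 becomes the root, 10 its left child
print(root.key, root.left.key, root.left.right.key)   # 22 10 11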
Insertion into a red-black tree
IDEA: Insert x into the tree and color x red. Only red-black
property 3 might be violated. Move the violation up the tree by
recoloring until it can be fixed with rotations and recoloring.
Example (the figures show the tree after each step):
• Insert x = 15.
• Recolor, moving the violation up the tree.
• RIGHT-ROTATE(18).
• LEFT-ROTATE(7) and recolor.
(Figures: the example tree after each step; the final tree is rooted at 10, with
7 and 18 as its children.)
TRAVELLING SALESMAN PROBLEM
SOLUTION BY BRANCH AND BOUND:
Compute the weight (cost) matrix:
        1    2    3    4
  1     ∞   10   15   20
  2     5    ∞    9   10
  3     6   13    ∞   12
  4     8    8    9    ∞
Row reduction: subtract each row's minimum (10, 5, 6, 8), total 29:
        1    2    3    4
  1     ∞    0    5   10
  2     0    ∞    4    5
  3     0    7    ∞    6
  4     0    0    1    ∞
Column reduction: subtract each column's minimum (0, 0, 1, 5), total 6:
        1    2    3    4
  1     ∞    0    4    5
  2     0    ∞    3    0
  3     0    7    ∞    1
  4     0    0    0    ∞
Reduced cost (lower bound at the root, node 1) = 29 + 6 = 35.
Branching from city 1: for the child going to city 2 we strike out row 1 and
column 2, reduce the remaining matrix, and obtain
C(node 2) = reduced-matrix entry (1, 2) + C(node 1) + reduction cost of the
child's matrix = 0 + 35 + 0 = 35.
Similarly we calculate C for nodes 3 and 4 (C = 40 each) and put them in a
priority queue. Expanding the cheapest node first, the search reaches a
complete tour of cost 35 (node 7), so MinCost = 35.
After node 7, we find that the nodes remaining in the queue all have
C > MinCost, so we kill them; the tour of cost 35 is optimal.
(A sketch of the matrix-reduction step follows.)
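A small Python sketch of the row/column reduction used to compute the lower bound of 35; INF stands in for the ∞ entries, and this mirrors only the reduction step, not the whole branch-and-bound search.

# Row/column reduction for the TSP lower bound (illustrative sketch).
INF = float('inf')

def reduce_matrix(m):
    """Reduce each row and column to contain a zero; return (reduced, cost)."""
    m = [row[:] for row in m]
    cost = 0
    for i, row in enumerate(m):                       # row reduction
        least = min(row)
        if least not in (0, INF):
            cost += least
            m[i] = [x - least if x != INF else INF for x in row]
    for j in range(len(m)):                           # column reduction
        least = min(row[j] for row in m)
        if least not in (0, INF):
            cost += least
            for row in m:
                if row[j] != INF:
                    row[j] -= least
    return m, cost

w = [[INF, 10, 15, 20],
     [5, INF, 9, 10],
     [6, 13, INF, 12],
     [8, 8, 9, INF]]
print(reduce_matrix(w)[1])    # 35, the lower bound at the root node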