
Madan Mohan Malaviya University of Technology, Gorakhpur

Algorithms Design and Analysis (MCA-129)
[MCA IVth Sem, Session: 2020-21]

Anu Raj & Jyoti Srivastava
Department of Information Technology & Computer Applications
MMM University of Technology, Gorakhpur-273010
Email: anu.raj10@yahoo.com, sriv.jyoti1996@gmail.com

UNIT – III
Backtracking, Branch and Bound with examples such as Travelling Salesman
Problem, Graph Coloring, n-Queen Problem, Hamiltonian Cycles and Sum of
Subsets; Amortized Analysis. Advanced Data Structures: Red-Black Trees,
Augmenting Data Structures, B-Trees, Binomial Heaps, Fibonacci Heaps, Data
Structures for Disjoint Sets, Priority Queues, Mergeable Heaps, Concatenable
Queues.


BACKTRACKING
• Backtracking is a technique used to solve problems with a large search
  space by systematically trying and eliminating possibilities.
• The principal idea of backtracking is to construct solutions one component
  at a time and then evaluate such partially constructed solutions.
• It finds all (or some) solutions to certain computational problems, notably
  constraint satisfaction problems.
• It incrementally builds candidate solutions and abandons a candidate
  ("backtracks") as soon as it determines that the candidate cannot lead to a
  valid solution.
• It can be applied only to problems that admit the notion of a "partial
  candidate solution" and a relatively quick test of whether a partial
  candidate can be completed to a valid solution.


Backtracking Algorithm – Example

• Find a path through a maze:
  – Start at the beginning of the maze.
  – If at the exit, return true.
  – Else, for each possible step from the current location:
      • recursively find a path;
      • return with the first successful step.
  – Return false if all steps fail.
A small code sketch of this search is given below.
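The following is a minimal sketch (not from the slides) of the recursive maze
search described above, assuming the maze is a grid of 0 (open) and 1 (wall)
cells and that moves are made in the four axis directions.

def solve_maze(maze, start, exit_, path=None, visited=None):
    if path is None:
        path, visited = [], set()
    r, c = start
    rows, cols = len(maze), len(maze[0])
    # Reject cells that are off the grid, walls, or already visited.
    if not (0 <= r < rows and 0 <= c < cols) or maze[r][c] == 1 or start in visited:
        return None
    path.append(start)
    visited.add(start)
    if start == exit_:                      # reached the exit: success
        return path
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if solve_maze(maze, (r + dr, c + dc), exit_, path, visited) is not None:
            return path                     # first successful step
    path.pop()                              # dead end: backtrack
    return None

maze = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(solve_maze(maze, (0, 0), (0, 2)))
# [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2)]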


Backtracking Algorithm – Example

• A standard example of backtracking is going through a maze.
• At some point in the maze, you might have two options for which direction
  to go.

Backtracking Algorithm – Example

• The backtracking strategy says to try each choice, one after the other.
• If you ever get stuck, "backtrack" to the junction and try the next choice.
• If you try all choices and never find a way out, then there is no solution
  to the maze.


BRANCH AND BOUND

• Branch and Bound is a state-space search method in which all the children
  of a node are generated before any of those children is expanded.
• Start by considering the root node and applying lower-bounding and
  upper-bounding procedures to it.
• If the bounds match, an optimal solution has been found and the algorithm
  is finished.
• If they do not match, the algorithm runs on the child nodes.


Travelling Salesman Problem

• In this problem we are given an n-vertex network (either directed or
  undirected) and must find a cycle of minimum cost that includes all n
  vertices.
• Any cycle that includes all n vertices of a network is called a tour. In the
  travelling-salesperson problem, we are to find a least-cost tour.
• Definition: Find a tour of minimum cost starting from a node S, going
  through every other node exactly once, and returning to the starting
  point S.


Travelling Salesman Problem – Definition

• Consider a situation in which there are 5 cities, represented as nodes.
• There is a person at node 1.
• This person has to visit each node once and only once and come back to the
  original (starting) position.
• This has to be done with minimum cost, i.e., minimum distance travelled.
• Note that the tour can start at any node. For example:
  1-5-2-3-4-1
  2-3-4-1-5-2

Travelling Salesman Problem

• If there are n nodes, there are (n-1)! feasible solutions (tours).
• From these (n-1)! feasible solutions we have to find the OPTIMAL SOLUTION.
• This can be related to GRAPH THEORY.
• A graph is a collection of nodes and arcs (edges).

GRAPH COLORING
• Graph coloring is the procedure of assigning a color to each vertex of a
  graph G such that no two adjacent vertices get the same color. The smallest
  number of colors required to color a graph G is called the chromatic number
  of that graph.
• Method to Color a Graph
  The steps required to color a graph G with n vertices are as follows (a
  small sketch of this greedy method is given after the steps):
• Step 1 − Arrange the vertices of the graph in some order.
• Step 2 − Choose the first vertex and color it with the first color.
• Step 3 − Choose the next vertex and color it with the lowest-numbered color
  that has not been used on any vertex adjacent to it. If every color used so
  far appears on an adjacent vertex, assign a new color to it. Repeat this
  step until all the vertices are colored.
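The following is a minimal sketch (not from the slides) of the greedy method
above, assuming the graph is given as an adjacency-list dictionary and colors
are numbered 0, 1, 2, ...; note that the greedy method colors the graph with
few colors but does not always achieve the chromatic number.

def greedy_coloring(graph, order=None):
    # graph: dict mapping each vertex to an iterable of adjacent vertices
    order = list(graph) if order is None else order        # Step 1: fix an order
    color = {}
    for v in order:
        used = {color[u] for u in graph[v] if u in color}  # colors on colored neighbours
        c = 0
        while c in used:        # Steps 2/3: lowest-numbered color unused by a neighbour
            c += 1
        color[v] = c
    return color

# Example: a 4-cycle 1-2-3-4-1 with chord 1-3 (contains a triangle, so 3 colors are needed).
g = {1: [2, 3, 4], 2: [1, 3], 3: [1, 2, 4], 4: [1, 3]}
coloring = greedy_coloring(g)
print(coloring)                       # {1: 0, 2: 1, 3: 2, 4: 1}
print(max(coloring.values()) + 1)     # 3 colors used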


GRAPH COLORING

Example: the minimum number of colors required for the graph shown (figure
omitted) is 3.

Map Coloring
• Let G be a graph and m a given positive integer. We want to discover whether
  the nodes of G can be colored in such a way that no two adjacent nodes have
  the same color, yet only m colors are used. This technique is broadly used
  in "map coloring"; the four-color map is the main objective.
• Consider the following map; it can be easily decomposed into the planar
  graph shown beside it.

Graph Coloring

Exercise: Find the minimum number of colors required for the graph (figure
omitted).

Graph Coloring

Exercise: Find the minimum number of colors required for each of the graphs
(a), (b), and (c) (figures omitted).

N-QUEEN PROBLEM

Problem: The problem is to place n queens on an n-by-n chessboard so that no
two queens attack each other by being in the same row, in the same column, or
on the same diagonal.

Observation: Case 1: n = 4


4-QUEEN PROBLEM
• Case 1: To explain the n-queen problem, consider n = 4 on a 4-by-4
  chessboard, where 4 queens have to be placed in such a way that no two
  queens can attack each other.
[Figure: empty 4-by-4 board with rows and columns numbered 1 to 4.]


[Figure: state-space tree of the backtracking search for the 4-queens problem.
Nodes 1 to 8 show successive partial placements of queens (Q); crosses (x) mark
squares attacked by queens already placed, and a branch is abandoned
(backtracked) when no safe square remains in the current column.]

4-QUEEN PROBLEM
• Using the above mechanism we obtain the two solutions shown in the two
  consecutive figures. (For n = 4 the only solutions place the queens in
  columns 2, 4, 1, 3 and in columns 3, 1, 4, 2 of rows 1 to 4.)

Figure: Board for the four-queens problem – first solution (Queen-1 to Queen-4
in rows 1 to 4).

4-QUEEN PROBLEM

Figure: Board for the four-queens problem – second solution (Queen-1 to Queen-4
in rows 1 to 4).

N-QUEEN PROBLEM ALGORITHM

1) Start in the leftmost column.
2) If all queens are placed, return true.
3) Try all rows in the current column. Do the following for every tried row:
   a) If the queen can be placed safely in this row, then mark this
      [row, column] as part of the solution and recursively check whether
      placing the queen here leads to a solution.
   b) If placing the queen in [row, column] leads to a solution, then return
      true.
   c) If placing the queen does not lead to a solution, then unmark this
      [row, column] (backtrack) and go back to step (a) to try other rows.
4) If all rows have been tried and nothing worked, return false to trigger
   backtracking.
A short runnable sketch of this algorithm is given below.
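The following is a minimal sketch (not from the slides) of the column-by-column
backtracking algorithm above; board[c] stores the row chosen for the queen in
column c.

def solve_n_queens(n):
    board = [-1] * n                      # board[c] = row of the queen in column c

    def safe(row, col):
        for c in range(col):
            r = board[c]
            # same row, or same diagonal (|row difference| == |column difference|)
            if r == row or abs(r - row) == col - c:
                return False
        return True

    def place(col):
        if col == n:                      # all queens placed
            return True
        for row in range(n):              # try all rows in the current column
            if safe(row, col):
                board[col] = row          # mark [row, col] as part of the solution
                if place(col + 1):
                    return True
                board[col] = -1           # unmark and backtrack
        return False                      # nothing worked in this column

    return board if place(0) else None

print(solve_n_queens(4))   # [1, 3, 0, 2]: queens in rows 2, 4, 1, 3 of columns 1..4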


Hamiltonian Circuit Problem

Problem: This problem is concerned with finding a Hamiltonian circuit in a
given graph.

Hamiltonian circuit: A Hamiltonian circuit is defined as a cycle that passes
through all the vertices of the graph exactly once, except that the starting
and ending vertex is the same.

For example, consider the given graph and evaluate the mechanism:

Figure: (a) Graph. (b) State-space tree for finding a Hamiltonian circuit. The
numbers above the nodes of the tree indicate the order in which the nodes are
generated.

Hamiltonian Circuit Problem

Exercise: For the above graph, find a Hamiltonian circuit of the graph. A
backtracking sketch for this problem is given below.
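The following is a minimal backtracking sketch (not from the slides) for finding
a Hamiltonian circuit, assuming the graph is given as an adjacency-list
dictionary and that the circuit starts and ends at an arbitrary fixed vertex.

def hamiltonian_circuit(graph):
    vertices = list(graph)
    start = vertices[0]
    path = [start]

    def extend(v):
        if len(path) == len(vertices):            # all vertices used:
            return start in graph[v]              # does the circuit close back to the start?
        for u in graph[v]:
            if u not in path:                     # try an unused neighbour
                path.append(u)
                if extend(u):
                    return True
                path.pop()                        # dead end: backtrack
        return False

    return path + [start] if extend(start) else None

# Example: a 5-cycle 1-2-3-4-5-1 with chord 1-3.
g = {1: [2, 5, 3], 2: [1, 3], 3: [2, 4, 1], 4: [3, 5], 5: [4, 1]}
print(hamiltonian_circuit(g))   # [1, 2, 3, 4, 5, 1]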


Subset-Sum Problem

• Subset-Sum Problem: The problem is to find a subset of a given set
  S = {s1, s2, ..., sn} of n positive integers whose sum is equal to a given
  positive integer d.

• Observation: It is convenient to sort the set's elements in increasing
  order, s1 ≤ s2 ≤ ... ≤ sn. The solution subsets need not all be of the same
  size.

• Example: For S = {3, 5, 6, 7} and d = 15, the solution is shown below:
  Solution = {3, 5, 7}


Figure: Complete state-space tree of the backtracking algorithm applied to the
instance S = {3, 5, 6, 7} and d = 15 of the subset-sum problem. The root holds
sum 0; at each level the next element is either included ("with") or excluded
("w/o"). The number inside a node is the sum of the elements already included
in the subset represented by that node. The inequality below a leaf indicates
the reason for its termination (the partial sum plus the remaining elements is
less than 15, or the partial sum already exceeds 15); the leaf with sum
3 + 5 + 7 = 15 is the solution. A runnable sketch of this backtracking search
follows.
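The following is a minimal backtracking sketch (not from the slides) mirroring
the state-space tree above: at each level element s[i] is either included or
excluded, and a branch is cut when the partial sum exceeds d or cannot reach d
even if all remaining elements are included.

def subset_sum(s, d):
    s = sorted(s)
    suffix = [0] * (len(s) + 1)
    for i in range(len(s) - 1, -1, -1):       # suffix[i] = s[i] + s[i+1] + ... + s[n-1]
        suffix[i] = suffix[i + 1] + s[i]

    def extend(i, chosen, total):
        if total == d:
            return list(chosen)
        if i == len(s) or total > d or total + suffix[i] < d:
            return None                       # dead branch: backtrack
        chosen.append(s[i])                   # "with" s[i]
        found = extend(i + 1, chosen, total + s[i])
        if found is not None:
            return found
        chosen.pop()                          # "w/o" s[i]
        return extend(i + 1, chosen, total)

    return extend(0, [], 0)

print(subset_sum([3, 5, 6, 7], 15))   # [3, 5, 7]
print(subset_sum([3, 4, 5, 6], 9))    # [3, 6]    (Question 1 below)
print(subset_sum([1, 3, 4, 5], 8))    # [1, 3, 4] (Question 2 below)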


Question 1: Given a set S = {3, 4, 5, 6} and X = 9, obtain the subset sum using
the backtracking approach.

Question 2: Given a set S = {1, 3, 4, 5} and X = 8, obtain the subset sum using
the backtracking approach.


AMORTIZED ANALYSIS

• We consider not just one operation, but a sequence of operations on a given
  data structure.
• We look at the average cost over a sequence of operations.
• Probabilistic analysis: the average-case running time is averaged over all
  possible inputs for one algorithm (operation). If probability is used, it is
  called the expected running time.
• Amortized analysis:
  – No involvement of probability.
  – Average performance over a sequence of operations, even if some
    operations are expensive.
  – Guarantees the average performance of each operation in the sequence in
    the worst case.


Three Methods of Amortized Analysis

• Aggregate analysis: we show that a sequence of n operations takes worst-case
  time T(n) in total; the amortized cost per operation is then T(n)/n.
• In the aggregate method, all operations have the same amortized cost.
• Accounting method:
  – Assign each type of operation a (possibly different) amortized cost,
  – overcharge some operations,
  – store the overcharge as credit on specific objects,
  – then use the credit to compensate for some later operations.
• Potential method:
  – Same idea as the accounting method,
  – but the credit is stored as "potential energy" of the data structure as a
    whole.

Amortized Analysis
Example: Stack operations:
Push(S, x): pushes object x onto stack S.
Pop(S): pops the top of the stack S and returns the popped object.
Multipop(S, k): removes the k top objects of stack S.
The action of Multipop on a stack S is as follows:
Multipop(S, k)
  while not STACK-EMPTY(S) and k ≠ 0
    do POP(S)
       k ← k − 1

Aggregate Analysis
• Using the aggregate method of amortized analysis, we can obtain a tighter
  upper bound that considers the entire sequence of n operations.
• In fact, although a single Multipop operation can be expensive, any sequence
  of n Push, Pop, and Multipop operations on an initially empty stack costs at
  most O(n).
• Each object can be popped at most once for each time it is pushed.
  Therefore, the number of times that Pop can be called on a nonempty stack,
  including calls within Multipop, is at most the number of Push operations,
  which is at most n. For any value of n, any sequence of n Push, Pop, and
  Multipop operations takes a total of O(n) time.
• The amortized cost of an operation is the average: O(n)/n = O(1).
A small sketch demonstrating this on a concrete stack follows.
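The following is a minimal sketch (not from the slides) of the stack with
Multipop; a counter tracks the actual work (pushes plus individual pops) to
illustrate that n operations on an initially empty stack do O(n) total work.

class Stack:
    def __init__(self):
        self.items = []
        self.work = 0                 # total actual cost so far

    def push(self, x):
        self.items.append(x)
        self.work += 1

    def pop(self):
        self.work += 1
        return self.items.pop()

    def multipop(self, k):
        while self.items and k != 0:  # pops min(k, current size) objects
            self.pop()
            k -= 1

s = Stack()
ops = 0
for i in range(100):                  # 100 pushes
    s.push(i)
    ops += 1
s.multipop(250)                       # one "expensive" operation, but only 100 real pops
ops += 1
print(ops, s.work)                    # 101 operations, total actual work 200 = O(n)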

Amortized Analysis: Accounting Method

• Idea:
  – Assign differing charges to different operations.
  – The amount of the charge is called the amortized cost.
  – The amortized cost may be more or less than the actual cost.
  – When the amortized cost > actual cost, the difference is saved in specific
    objects as credit.
  – The credit can be used by later operations whose amortized cost < actual
    cost.
• As a comparison, in aggregate analysis all operations have the same
  amortized cost.


Amortized Analysis: Accounting Method

• Example 1: stack operations.
• The actual costs of the operations are:
  • Push: 1,
  • Pop: 1,
  • Multipop: min(k, s),
  where k is the argument supplied to Multipop and s is the stack size when it
  is called.
• We assign the following amortized costs:
  • Push: 2,
  • Pop: 0,
  • Multipop: 0.
• Here all three amortized costs are O(1), although in general the amortized
  costs of the operations under consideration may differ asymptotically.

THE POTENTIAL METHOD

• It differs from the accounting method:
  – The prepaid work is kept not as credit, but as "potential energy", or
    "potential".
  – The potential is associated with the data structure as a whole rather than
    with specific objects within the data structure.
  – Otherwise the idea is the same as in the accounting method: the prepaid
    work is used to pay for later, expensive operations.


Disjoint Sets Data Structure

• A disjoint-set structure is a collection S = {S1, S2, ..., Sk} of distinct
  dynamic sets.
• Each set is identified by a member of the set, called its representative.
• Disjoint-set operations:
• MAKE-SET(x): create a new set containing only x; assume x is not already in
  some other set.
• UNION(x, y): combine the two sets containing x and y into one new set. A new
  representative is selected.
• FIND-SET(x): return the representative of the set containing x.

Example: B = {3, 4}; B is assigned 4 as its representative.

Disjoint Sets Data Structure

• A disjoint-set data structure is a data structure that keeps track of a set
  of elements partitioned into a number of disjoint (non-overlapping) subsets.
• A union-find algorithm is an algorithm that performs two useful operations
  on such a data structure:
  Find: Determine which subset a particular element is in. This can be used
  for determining whether two elements are in the same subset.
  Union: Join two subsets into a single subset.
• A typical application is to check whether a given graph contains a cycle:
  the union-find algorithm can be used to check whether an undirected graph
  contains a cycle or not, as sketched below.
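The following is a minimal sketch (not from the slides) of a union-find
structure with path compression and union by rank, used to detect a cycle in an
undirected graph given as an edge list.

class DisjointSets:
    def __init__(self, elements):
        self.parent = {x: x for x in elements}   # each element starts as its own root
        self.rank = {x: 0 for x in elements}

    def find(self, x):                 # FIND-SET with path compression
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, x, y):             # UNION by rank; returns False if already joined
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

def has_cycle(vertices, edges):
    ds = DisjointSets(vertices)
    # An edge whose endpoints are already in the same set closes a cycle.
    return any(not ds.union(u, v) for u, v in edges)

print(has_cycle([1, 2, 3], [(1, 2), (2, 3)]))          # False (a tree)
print(has_cycle([1, 2, 3], [(1, 2), (2, 3), (3, 1)]))  # True (a triangle)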

Example: A = {1, 2} and B = {3, 4}. We want to do A UNION B; the result is the
single set {1, 2, 3, 4}, represented by one tree with one representative.

Disjoint-Set Implementation: Forests

• Rooted trees: each tree is a set, and the root is the representative. Each
  node points to its parent; the root points to itself.

Figure: the sets {c, h, e} and {f, d} represented as rooted trees; after UNION
the two trees are linked into a single tree containing {c, h, e, f, d}.

QUEUE

• A queue is a linear data structure used to organize data.
• It is used for temporary storage of data values.
• A new element is added at one end, called the rear end.
• An existing element is deleted from the other end, called the front end.
• It has the first-in, first-out (FIFO) property.


Simple Queue

• A simple queue supports the basic queue operations, in which insertion
  occurs at the rear of the list and deletion occurs at the front of the list.


Priority Queue

• A priority queue contains data items which have some preset priority. While
  removing an element from a priority queue, the data item with the highest
  priority is removed first.
• In a priority queue, insertion is performed in the order of arrival and
  deletion is performed based on the priority.


Priority Queue

• Elements in a priority queue are ordered.
• The operations of retrieving and removing the largest element are supported
  (removeMax).
• Priority queues are used in sorting algorithms. A heap is one possible
  realization of a priority queue. A buffer tree is an example of an
  external-memory realization of a priority queue.
Examples of priority queue data structures:
• min-max heap – a double-ended priority queue implemented as a modified
  version of a binary heap

Priority Queue

• binomial heap – an implementation of the mergeable-heap abstract data type,
  i.e., a priority queue supporting a merge operation
• Fibonacci heap – a heap data structure consisting of a collection of trees
Applications of the priority queue:
• Sorting, e.g. selection sort, insertion sort, heap sort. The idea is to
  insert the elements into the queue one by one and then remove them from the
  queue in decreasing order using removeMax, as sketched below.
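The following is a minimal sketch (not from the slides) of sorting with a
priority queue, using Python's heapq module (a binary min-heap); keys are
negated so that each pop returns the current maximum, i.e. a removeMax.

import heapq

def pq_sort_descending(items):
    heap = []
    for x in items:                 # insert elements one by one
        heapq.heappush(heap, -x)    # negate so the min-heap acts as a max-heap
    return [-heapq.heappop(heap) for _ in range(len(heap))]   # repeated removeMax

print(pq_sort_descending([5, 1, 9, 3, 7]))   # [9, 7, 5, 3, 1]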


Concatenable Queue
• From 2-3 trees we know how the insert(), delete(), and min() operations can
  be executed on a 2-3 tree with n nodes in at most O(log n) time.
• For the concatenable queue, we show how the concatenate() and split()
  instructions can also be executed in O(log n) time.
• It therefore supports insert, delete, find, concatenate, and split
  operations, each taking O(log n) time.


Concatenable Queue

• As in a 2-3 tree, we maintain L[v] and M[v] for each vertex v; to execute
  the concatenate() and split() operations we assume that L[v] and M[v] have
  already been computed.

Concatenable Queue

Concatenate(S1, S2)
• Takes two input sequences S1 and S2 such that every element of S1 is less
  than every element of S2, and produces the new concatenated sequence S1S2.
• If S1 is represented as a 2-3 tree T1 and S2 as a 2-3 tree T2, then we want
  to combine T1 and T2 into a single 2-3 tree T whose leaves are the leaves of
  T1 in their original order followed by the leaves of T2 in their original
  order.


Concatenable Queue

SPLIT operation
• Split(a, S) splits the set S into two partitions S1 and S2:
• S1 contains the elements of S from the start up to a;
• S2 contains the elements of S from a to the end.
• That is, we split a 2-3 tree T into two 2-3 trees T1 and T2 such that all
  leaves in T1 hold values less than or equal to a and all leaves in T2 hold
  values greater than a.

B-TREE
• A B-Tree is a self-balancing search tree. In most other self-balancing
  search trees (like AVL and Red-Black Trees), it is assumed that everything
  fits in main memory.
• To understand the use of B-Trees, we must think of the huge amounts of data
  that cannot fit in main memory. When the number of keys is high, the data is
  read from disk in the form of blocks. Disk access time is very high compared
  to main-memory access time. The main idea of using B-Trees is to reduce the
  number of disk accesses.
• Most tree operations (search, insert, delete, max, min, etc.) require O(h)
  disk accesses, where h is the height of the tree. A B-Tree is a "fat" tree:
  its height is kept low by putting the maximum possible number of keys in
  each node.
• Generally, the B-Tree node size is kept equal to the disk block size. Since
  the height of the B-Tree is low, the total number of disk accesses for most
  operations is reduced significantly compared to balanced Binary Search Trees
  like AVL Trees and Red-Black Trees.
B-TREE
• Time complexity of B-Tree operations:

  Sr. No.   Algorithm   Time Complexity
  1.        Search      O(log n)
  2.        Insert      O(log n)
  3.        Delete      O(log n)
PROPERTIES OF B-TREE
1. All leaves are at the same level.
2. A B-Tree is defined by the term minimum degree 't'. The value of t depends
   upon the disk block size.
3. Every node except the root must contain at least ⌈(t−1)/2⌉ keys. The root
   may contain a minimum of 1 key.
4. All nodes (including the root) may contain at most t − 1 keys.
5. The number of children of a node is equal to the number of keys in it
   plus 1.
6. All keys of a node are sorted in increasing order. The child between two
   keys k1 and k2 contains all keys in the range from k1 to k2.
7. A B-Tree grows and shrinks from the root, unlike a Binary Search Tree.
   Binary Search Trees grow downward and also shrink from downward.
8. Like other balanced Binary Search Trees, the time complexity to search,
   insert, and delete is O(log n).
B-TREE
• Following is an example of a B-Tree of minimum order 5. Note that in
  practical B-Trees, the value of the minimum order is much more than 5.
• We can see in the diagram (omitted) that all the leaf nodes are at the same
  level, and all non-leaf nodes have no empty subtree and have one fewer key
  than their number of children.
IMPORTANT FACT
• Traversal in a B-Tree: Traversal is similar to the inorder traversal of a
  Binary Tree. We start from the leftmost child, recursively print the
  leftmost child, then repeat the same process for the remaining children and
  keys. In the end, we recursively print the rightmost child.
B-TREE
• Search operation in a B-Tree:
  Search is similar to search in a Binary Search Tree. Let the key to be
  searched be k. We start from the root and recursively traverse down. For
  every visited non-leaf node, if the node has the key, we simply return the
  node. Otherwise, we recur down to the appropriate child (the child which is
  just before the first greater key) of the node. If we reach a leaf node and
  don't find k in the leaf node, we return NULL.
• Logic:
  Searching a B-Tree is similar to searching a binary tree. The algorithm is
  similar and proceeds by recursion. At each level, the search is optimised:
  if the key value is not present in the range of the parent, then the key is
  present in another branch. As these values limit the search, they are also
  known as limiting values or separation values. If we reach a leaf node and
  don't find the desired key, the search returns NULL. A small code sketch of
  this search is given below.
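The following is a minimal sketch (not from the slides) of the recursive B-Tree
search; each node is assumed to hold a sorted list of keys and, if it is not a
leaf, a list of len(keys) + 1 children.

class BTreeNode:
    def __init__(self, keys, children=None):
        self.keys = keys                  # sorted list of keys
        self.children = children          # None for a leaf, else len(keys) + 1 nodes

def btree_search(node, k):
    i = 0
    while i < len(node.keys) and k > node.keys[i]:
        i += 1                            # find the first key >= k
    if i < len(node.keys) and node.keys[i] == k:
        return node                       # key found in this node
    if node.children is None:
        return None                       # reached a leaf without finding k
    # descend to the child just before the first greater key
    return btree_search(node.children[i], k)

# Example (illustrative keys, not the tree from the slides): root [100] with two leaf children.
root = BTreeNode([100], [BTreeNode([30, 60, 90]), BTreeNode([120, 150, 180])])
print(btree_search(root, 120) is not None)   # True
print(btree_search(root, 70))                # None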
Example: Searching for 120 in the given B-Tree (worked step by step in the
figures, which are omitted here).
• In this example, we can see that the search was reduced by limiting the
  places where the key could be present. Similarly, if in the above example we
  had to look for 180, then the control would stop at step 2 because the
  program would find that the key 180 is present in the current node. And
  similarly, if it had to look for 90, then since 90 < 100 it would go to the
  left subtree automatically, and the control flow would proceed as in the
  above example.
Binomial Heap
• The main application of a Binary Heap is to implement a priority queue. A
  Binomial Heap is an extension of the Binary Heap that provides a faster
  union (merge) operation together with the other operations provided by the
  Binary Heap.
• A Binomial Heap is a collection of Binomial Trees.
• What is a Binomial Tree?
  A Binomial Tree of order 0 has 1 node. A Binomial Tree of order k can be
  constructed by taking two Binomial Trees of order k−1 and making one the
  leftmost child of the other.
  A Binomial Tree of order k has the following properties:
  a) It has exactly 2^k nodes.
  b) It has depth k.
  c) There are exactly C(k, i) nodes at depth i, for i = 0, 1, ..., k.
  d) The root has degree k, and the children of the root are themselves
     Binomial Trees of order k−1, k−2, ..., 0 from left to right.
Binomial Heap
• Binomial Heap:
  A Binomial Heap is a set of Binomial Trees where each Binomial Tree follows
  the min-heap property, and there can be at most one Binomial Tree of any
  degree.
• Binary representation of a number and Binomial Heaps:
  A Binomial Heap with n nodes has a number of Binomial Trees equal to the
  number of set bits in the binary representation of n. For example, let n be
  13; there are 3 set bits in the binary representation of n (00001101), hence
  3 Binomial Trees. We can also relate the degrees of these Binomial Trees to
  the positions of the set bits. From this relation we can conclude that there
  are O(log n) Binomial Trees in a Binomial Heap with n nodes. A small sketch
  of this relation follows.
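A minimal sketch (not from the slides) relating n to the degrees of the Binomial
Trees in a heap of n nodes: the positions of the set bits of n give the tree
degrees.

def binomial_tree_degrees(n):
    # Degrees of the binomial trees in a binomial heap holding n nodes.
    return [i for i in range(n.bit_length()) if (n >> i) & 1]

print(binomial_tree_degrees(13))   # [0, 2, 3]: trees B0, B2, B3 (1 + 4 + 8 = 13 nodes)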
Operations of a Binomial Heap:
• The main operation in a Binomial Heap is union(); all other operations
  mainly use this operation. The union() operation combines two Binomial Heaps
  into one. Let us first discuss the other operations; we will discuss union
  later.
• insert(H, k): Inserts a key 'k' into Binomial Heap 'H'. This operation first
  creates a Binomial Heap with the single key 'k', then calls union on H and
  the new Binomial Heap.
• getMin(H): A simple way to implement getMin() is to traverse the list of
  roots of the Binomial Trees and return the minimum key. This implementation
  requires O(log n) time. It can be optimized to O(1) by maintaining a pointer
  to the minimum-key root.
• extractMin(H): This operation also uses union(). We first call getMin() to
  find the Binomial Tree with the minimum key, then we remove that node and
  create a new Binomial Heap by connecting all subtrees of the removed minimum
  node. Finally, we call union() on H and the newly created Binomial Heap.
  This operation requires O(log n) time.
• delete(H): Like in a Binary Heap, the delete operation first reduces the key
  to minus infinity, then calls extractMin().
• decreaseKey(H): decreaseKey() is also similar to the Binary Heap operation.
  We compare the decreased key with its parent, and if the parent's key is
  greater, we swap the keys and recur for the parent. We stop when we either
  reach a node whose parent has a smaller key or we hit the root node. The
  time complexity of decreaseKey() is O(log n).
FIBONACCI HEAP
• A Fibonacci heap is a collection of trees satisfying
the minimum-heap property, that is, the key of a
child is always greater than or equal to the key of
the parent. This implies that the minimum key is
always at the root of one of the trees. Compared
with binomial heaps, the structure of a Fibonacci
heap is more flexible. The trees do not have a
prescribed shape and in the extreme case the
heap can have every element in a separate tree.
FIBONACCI HEAP
Facts about the Fibonacci Heap:
• The reduced time complexity of Decrease-Key is important in Dijkstra's and
  Prim's algorithms. With a Binary Heap, the time complexity of these
  algorithms is O(V log V + E log V). If a Fibonacci Heap is used, the time
  complexity improves to O(V log V + E).
• Although the Fibonacci Heap looks promising in terms of time complexity, it
  has been found to be slow in practice because the hidden constants are high.
• Fibonacci heaps are so called mainly because Fibonacci numbers are used in
  the running-time analysis. Also, every node in a Fibonacci Heap has degree
  at most O(log n), and the size of a subtree rooted at a node of degree k is
  at least F(k+2), where F(k) is the k-th Fibonacci number.
Red-Black Trees, Rotations, Insertions

Red-black trees
This data structure requires an extra one-bit color field in each node.
Red-black properties:
1. Every node is either red or black.
2. The root and the leaves (NILs) are black.
3. If a node is red, then its parent is black.
4. All simple paths from any node x to a descendant leaf have the same number
   of black nodes, called black-height(x).
Example of a red-black tree

[Figure: a red-black tree with root 7; 7 has children 3 and 18, 18 has children
10 and 22, 10 has children 8 and 11, and 22 has child 26; all missing children
are black NIL leaves, and the height of the tree is h = 4. The same tree is
shown four times, once for each red-black property:
1. Every node is either red or black.
2. The root and leaves (NILs) are black.
3. If a node is red, then its parent is black.
4. All simple paths from any node x to a descendant leaf have the same number
   of black nodes = black-height(x); in the example the black-heights range
   from bh = 2 at the root down to bh = 0 at the NIL leaves.]
Height of a red-black tree
Theorem. A red-black tree with n keys has height
h ≤ 2 lg(n + 1).
Modifying operations
The operations INSERT and DELETE cause modifications to the red-black tree:
• the operation itself,
• color changes,
• restructuring the links of the tree: "rotations".
Rotations

[Figure: RIGHT-ROTATE(B) transforms a tree with root B, left child A, and
subtrees α, β, γ into a tree with root A, right child B, and the same subtrees;
LEFT-ROTATE(A) is the inverse transformation.]

Rotations maintain the inorder ordering of keys:
• for a ∈ α, b ∈ β, c ∈ γ we have a ≤ A ≤ b ≤ B ≤ c.
A rotation can be performed in O(1) time. A small sketch of LEFT-ROTATE is
given below.
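The following is a minimal sketch (not from the slides) of LEFT-ROTATE on a
binary tree node with left/right/parent pointers; the red-black coloring is
untouched here, since a rotation only restructures links.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

def left_rotate(root, x):
    # Rotate left around x; x.right becomes the new parent of x. Returns the tree root.
    y = x.right
    x.right = y.left                 # turn y's left subtree (beta) into x's right subtree
    if y.left is not None:
        y.left.parent = x
    y.parent = x.parent              # link y to x's old parent
    if x.parent is None:
        root = y                     # x was the root, so y becomes the new root
    elif x is x.parent.left:
        x.parent.left = y
    else:
        x.parent.right = y
    y.left = x                       # put x on y's left
    x.parent = y
    return root

# Example: rotate left around the root 7 of the small tree 7 -> (3, 18).
a, b, c = Node(7), Node(3), Node(18)
a.left, a.right, b.parent, c.parent = b, c, a, a
new_root = left_rotate(a, a)
print(new_root.key, new_root.left.key, new_root.left.left.key)   # 18 7 3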
Insertion into a red-black tree

IDEA: Insert x in the tree. Color x red. Only red-black property 3 might be
violated. Move the violation up the tree by recoloring until it can be fixed
with rotations and recoloring.

Example (starting from the tree rooted at 7, with nodes 3, 18, 10, 8, 11, 22,
26):
• Insert x = 15.
• Recolor, moving the violation up the tree.
• RIGHT-ROTATE(10).
• LEFT-ROTATE(7) and recolor.

[Figures: the tree after each step; after the final rotation and recoloring the
tree is rooted at 10, with left subtree 7 (children 3 and 8) and right subtree
18 (children 11 and 22, where 11 has child 15 and 22 has child 26), and all
red-black properties hold again.]
Pseudocode
RB-INSERT(T, x)
  TREE-INSERT(T, x)
  color[x] ← RED                  ⊳ only RB property 3 can be violated
  while x ≠ root[T] and color[p[x]] = RED
    do if p[x] = left[p[p[x]]]
         then y ← right[p[p[x]]]  ⊳ y = aunt/uncle of x
              if color[y] = RED
                then ⟨Case 1⟩
                else if x = right[p[x]]
                       then ⟨Case 2⟩   ⊳ Case 2 falls into Case 3
                     ⟨Case 3⟩
         else ⟨the "then" clause with "left" and "right" swapped⟩
  color[root[T]] ← BLACK
EXAMPLE: TRAVELLING SALESMAN PROBLEM
• PROBLEM DESCRIPTION: Given a set of cities and the distance between every
  pair of cities, the problem is to find the shortest possible tour that the
  salesman must take to visit every city exactly once and return to the
  starting point.
Solution

[Figure: a 4-city instance; cities 1 to 4 are the nodes, and the edge costs are
given by the weight matrix below.]
SOLUTION BY BRANCH AND BOUND:
Compute the weight matrix:

        1    2    3    4
  1     ∞   10   15   20
  2     5    ∞    9   10
  3     6   13    ∞   12
  4     8    8    9    ∞

Compute the reduced matrix, which provides a lower bound on the shortest tour.
Start with city 1. Calculation of the reduced cost for node 1:

Row reduction (subtract the row minima 10, 5, 6, 8; total 29):

        1    2    3    4
  1     ∞    0    5   10
  2     0    ∞    4    5
  3     0    7    ∞    6
  4     0    0    1    ∞

Column reduction (subtract the column minima 0, 0, 1, 5; total 6):

        1    2    3    4
  1     ∞    0    4    5
  2     0    ∞    3    0
  3     0    7    ∞    1
  4     0    0    0    ∞

Reduced cost (lower bound for node 1) = 29 + 6 = 35.
CONTINUE:

Branching from node 1 (C = 35), we consider going from city 1 to city 2, 3,
or 4. For node 2 (tour so far 1 → 2), row 1 and column 2 are set to ∞ and entry
[2][1] is set to ∞:

        1    2    3    4
  1     ∞    ∞    ∞    ∞
  2     ∞    ∞    3    0
  3     0    ∞    ∞    1
  4     0    ∞    0    ∞

All row and column minima are 0, so the additional reduction cost is 0, and
C(node 2) = W[1][2] + C(node 1) + reduction cost = 0 + 35 + 0 = 35.
Similarly we calculate C for nodes 3 and 4 (C = 40 each) and put them in the
priority queue.

Expanding node 2 (the cheapest node, C = 35) gives node 5 (city 3) and node 6
(city 4). For node 5, row 2 and column 3 are set to ∞ and entry [3][1] is set
to ∞:

        1    2    3    4     row min
  1     ∞    ∞    ∞    ∞       –
  2     ∞    ∞    ∞    ∞       –
  3     ∞    ∞    ∞    1       1
  4     0    ∞    ∞    ∞       0

C(node 5) = W[2][3] + C(node 2) + reduction cost = 3 + 35 + 1 = 39.
Similarly we calculate C for node 6 (C = 35) and put both in the queue, with
MinCost = 35 so far.

Expanding node 6 gives node 7 (city 3) with C = 35, which completes a tour of
cost 35. After node 7, the nodes remaining in the queue all have C > MinCost,
so we kill (prune) them. The optimal tour is 1 → 2 → 4 → 3 → 1 with cost
10 + 10 + 9 + 6 = 35. A small sketch of the matrix-reduction step is given
below.
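The following is a minimal sketch (not from the slides) of the matrix reduction
used above: it subtracts the row minima and then the column minima and returns
the reduced matrix together with the reduction cost (the lower bound computed
for the root node).

INF = float("inf")

def reduce_matrix(m):
    m = [row[:] for row in m]
    cost = 0
    for i, row in enumerate(m):                     # row reduction
        lo = min(row)
        if lo not in (0, INF):
            cost += lo
            m[i] = [x - lo if x != INF else INF for x in row]
    for j in range(len(m)):                         # column reduction
        lo = min(row[j] for row in m)
        if lo not in (0, INF):
            cost += lo
            for row in m:
                if row[j] != INF:
                    row[j] -= lo
    return m, cost

w = [[INF, 10, 15, 20],
     [5, INF, 9, 10],
     [6, 13, INF, 12],
     [8, 8, 9, INF]]
reduced, bound = reduce_matrix(w)
print(bound)     # 35, the lower bound computed for node 1 above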
