Introduction of Asymptotic Notation: Dr. Munesh Singh

Introduction of Asymptotic Notation
In order to solve any problem in Computer Science, we generally write programs.
Before writing any program, it is always better to start with an informal description of the solution (also called an algorithm).
[Figure: problem → informal language (algorithm / pseudo code / flow chart) → program in C/Java]
If there is a problem, there may be many solutions (e.g., problem p1 with candidate algorithms A1, A2, A3, A4, A5).
Analysis of Algorithm
Before implementing any algorithm as a computer program, we analyze the algorithm from two perspectives:
Time
Space
Time designates the amount of time the program takes to compute the result.
Space designates the amount of memory the program takes.
The best algorithm takes less time and less memory.
Design and Analysis of Algorithms means:
Design: how many solutions or algorithms exist for a problem.
Analysis of Algorithms: which solution is best in terms of time and space.
Analysis of Algorithm
After analyzing the algorithms in terms of time and space, we choose the best algorithm as the solution to the problem.
[Figure: problem p1 with candidate algorithms A1 A2 A3 A4 A5; the chosen algorithm is implemented as a program in C/Java]
Notations
Before analyzing any algorithm, we should get familiar with some of the terminology used in algorithm analysis.
To represent the complexity of any algorithm, the following notations are used:
1 Big oh (O) (Worst case of the algorithm)
2 Big omega (Ω) (Best case of the algorithm)
3 Big theta (Θ) (Average case of the algorithm)
Big oh (O)
Worst Case Analysis
Let us say f(n) is a function that represents the rate of growth of time as the input size n increases.
We find another function cg(n) such that it is an upper bound on f(n).
It means f(n) <= cg(n) for some c > 0, n0 >= 1, and all n > n0.
If the above condition is satisfied, then we can write f(n) = O(g(n)).
[Figure: cg(n) is an upper bound on f(n): f(n) <= cg(n) for c > 0, n0 >= 1, n > n0]
Example
f(n) = 3n+2
g(n) = n
If f(n) = O(g(n)), then what does it have to satisfy?
f(n) <= cg(n) where c > 0, n0 >= 1, and n > n0
3n+2 <= cn
We can choose any value of c large enough that cn eventually dominates 3n+2 (i.e., c greater than the coefficient of the n term).
Let us say c = 4.
3n+2 <= 4n, when c = 4
Now we can substitute values of n and see whether the above condition is satisfied.
When n >= 2 the above condition is satisfied, so we can say that f(n) = O(g(n)) with c = 4 and n0 = 2, for all n >= n0.
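As a quick illustration, the following minimal C sketch (not part of the original slides) numerically checks the chosen witnesses c = 4 and n0 = 2 over a finite range of n; it only illustrates the inequality and is not a proof.

#include <stdio.h>

/* Illustrative check of the Big-O witnesses c = 4, n0 = 2 for
 * f(n) = 3n + 2 and g(n) = n, i.e. 3n + 2 <= 4n for n >= 2.   */
int main(void) {
    int c = 4, n0 = 2;
    for (int n = n0; n <= 1000; n++) {
        long f = 3L * n + 2;       /* f(n) = 3n + 2   */
        long bound = (long)c * n;  /* c * g(n) = 4n   */
        if (f > bound) {
            printf("bound fails at n = %d\n", n);
            return 1;
        }
    }
    printf("3n + 2 <= 4n for all tested n >= %d\n", n0);
    return 0;
}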
Big oh (O)
Conclusion
From the previous calculation, we come to the conclusion that:
If g(n) = n (with constant c) is an upper bound on f(n), then f(n) can further be upper bounded by any function growing faster than n, such as n^2, n^3, n^n, 2^n.
But we always take the tightest upper bound, i.e., n.
[Figure: time t versus input n; beyond n0, cg(n) upper bounds f(n), with faster-growing curves n^2, n^3, 2^n, n^n above it; f(n) <= cg(n) for c > 0, n0 >= 1, n > n0]
Big omega (Ω)
Best Case Analysis
Let us say f(n) is a function that represents the rate of growth of time as the input size n increases.
We find another function cg(n) such that it is a lower bound on f(n).
It means f(n) >= cg(n) for some c > 0, n0 >= 1, and all n > n0.
If the above condition is satisfied, then we can write f(n) = Ω(g(n)).
[Figure: cg(n) is a lower bound on f(n): f(n) >= cg(n) for c > 0, n0 >= 1, n >= n0]
Example
f(n) = 3n+2
g(n) = n
If f(n) = Ω(g(n)), then what does it have to satisfy?
f(n) >= cg(n) where c > 0, n0 >= 1, and n > n0
3n+2 >= cn
We can choose any value of c.
Let us say c = 1.
3n+2 >= n, when c = 1
Now we can substitute values of n and see whether the above condition is satisfied.
When n >= 1 the above condition is satisfied, so we can say that f(n) = Ω(g(n)) with c = 1 and n0 = 1, for all n >= n0.
Big omega (Ω)
Conclusion
From the previous calculation, we come to the conclusion that:
If g(n) = n (with constant c) is a lower bound on f(n), then f(n) can further be lower bounded by any function growing more slowly than n, such as log n or log log n.
But we always take the tightest lower bound, i.e., n.
[Figure: time t versus input n; f(n) stays above cg(n), with slower-growing curves log n and log log n below; f(n) >= cg(n) for c > 0, n0 >= 1, n >= n0]
Big theta (Θ)
Average Case Analysis
Let us say f(n) is a function that represents the rate of growth of time as the input size n increases.
We find another function c1g(n) such that it is an upper bound on f(n), and c2g(n) such that it is a lower bound on f(n).
It means c2g(n) <= f(n) <= c1g(n)
for c1 > 0, c2 > 0, n0 >= 1, and n > n0.
If the above condition is satisfied, then we can write f(n) = Θ(g(n)).
[Figure: time versus input n; beyond n0, f(n) lies between the lower bound c2g(n) and the upper bound c1g(n): c2g(n) <= f(n) <= c1g(n) for c1 > 0, c2 > 0, n0 >= 1, n > n0]
Big theta (Θ)
Example
f(n) = 3n+2
g(n) = n
If f(n) = Θ(g(n)), then what does it have to satisfy?
c2g(n) <= f(n) <= c1g(n) where c1 > 0, c2 > 0, n0 >= 1, and n > n0
Lower bound: f(n) >= c2g(n), i.e., 3n+2 >= c2n. We can choose any value of c2; let us say c2 = 1, so 3n+2 >= n, which holds for all n >= 1.
Upper bound: f(n) <= c1g(n), i.e., 3n+2 <= c1n. Let us say c1 = 4, so 3n+2 <= 4n, which holds for all n >= 2.
Therefore f(n) = Θ(g(n)) with c2 = 1, c1 = 4, and n0 = 2, for all n >= n0.
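A similar C sketch (again only an illustration, not from the slides) checks both Theta witnesses at once, c2 = 1 and c1 = 4 with n0 = 2, over a finite range of n.

#include <stdio.h>

/* Illustrative check of the Theta witnesses c2 = 1, c1 = 4, n0 = 2
 * for f(n) = 3n + 2 and g(n) = n: 1*n <= 3n + 2 <= 4*n for n >= 2. */
int main(void) {
    int c1 = 4, c2 = 1, n0 = 2;
    for (int n = n0; n <= 1000; n++) {
        long f = 3L * n + 2;
        if (f < (long)c2 * n || f > (long)c1 * n) {
            printf("a bound fails at n = %d\n", n);
            return 1;
        }
    }
    printf("c2*n <= 3n + 2 <= c1*n for all tested n >= %d\n", n0);
    return 0;
}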
Comparison of Asymptotic Notations
Big oh (O) gives the worst case analysis of an algorithm.
Big omega (Ω) gives the best case analysis of an algorithm.
Big theta (Θ) gives the average case analysis of an algorithm.
Generally we are more interested in the worst case analysis of an algorithm.
Average case analysis is usually performed only when an algorithm's worst case and best case are similar.
Let us take the example of searching for an element in an array (see the sketch after this list):
1 The best case of the algorithm: we find the element at the beginning of the array, Ω(1).
2 The worst case of the algorithm: we find the element at the end of the array, O(n).
3 The average case of the algorithm: we find the element at the middle of the array, Θ(n/2) = Θ(n).
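For concreteness, here is a minimal linear-search sketch in C (an assumed implementation, not given in the slides) whose running time exhibits exactly these three cases: one comparison when the key is first, n comparisons when it is last, and about n/2 on average.

#include <stdio.h>

/* Linear search: scan the array from left to right until the key is
 * found or the array is exhausted. Returns the index, or -1.        */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key)
            return i;   /* found: stop early */
    }
    return -1;          /* not found */
}

int main(void) {
    int a[] = {7, 3, 9, 1, 5};
    int n = 5;
    /* Best case:  key at index 0   -> 1 comparison,        Omega(1) */
    printf("best:  index %d\n", linear_search(a, n, 7));
    /* Worst case: key at the end   -> n comparisons,       O(n)     */
    printf("worst: index %d\n", linear_search(a, n, 5));
    /* Average:    key near middle  -> about n/2 comparisons, Theta(n) */
    printf("avg:   index %d\n", linear_search(a, n, 9));
    return 0;
}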