Algorithm Analysis

Algorithm
• A complete, detailed and precise step-by-step method for
solving a problem, independent of the software or hardware
of the computer.
• A well-defined computational procedure that takes some
value, or a set of values, as input and produces some value,
or a set of values, as output.
• A sequence of computational steps that transform the input
into the output.
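As a concrete illustration of these definitions (a sketch added
here, not part of the original slides), the following C function
is one such step-by-step procedure: it takes n values as input
and produces their maximum as output.

#include <stdio.h>

/* Illustrative algorithm: input is an array of n values,
   output is the largest value, computed by precise steps. */
int find_max(const int a[], int n) {
    int max = a[0];                 /* step 1: take the first value */
    for (int i = 1; i < n; i++)     /* step 2: scan the rest */
        if (a[i] > max)
            max = a[i];             /* step 3: keep the larger one */
    return max;                     /* step 4: output */
}

int main(void) {
    int data[] = {7, 2, 9, 4};
    printf("%d\n", find_max(data, 4));   /* prints 9 */
    return 0;
}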
Counting the steps of an algorithm yields a running-time function
of the input size n, for example f(n) = 2n^2 + 2n + 2 or
f(n) = 3n^2 + 3.

Algorithm Design Techniques
1. Recursive
2. Divide and Conquer
3. Greedy Approach
4. Dynamic Programming
5. Branch and Bound
6. Backtracking
7. Randomized
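To make the first two techniques concrete, here is a small sketch
(added for illustration, not from the slides) of recursive divide
and conquer: binary search halves the problem on every call. The
function name binary_search is illustrative.

#include <stdio.h>

/* Divide and conquer: halve the search range each call and
   solve one sub-problem recursively. Array must be sorted. */
int binary_search(const int a[], int lo, int hi, int key) {
    if (lo > hi) return -1;          /* empty range: not found */
    int mid = lo + (hi - lo) / 2;    /* divide */
    if (a[mid] == key) return mid;
    if (a[mid] < key)                /* conquer the relevant half */
        return binary_search(a, mid + 1, hi, key);
    return binary_search(a, lo, mid - 1, key);
}

int main(void) {
    int a[] = {1, 3, 5, 7, 9};
    printf("%d\n", binary_search(a, 0, 4, 7));  /* prints 3 */
    return 0;
}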
Efficiency of an Algorithm (assuming termination and correctness)
1. Space Complexity
2. Time Complexity
Growth of common functions:

n     1    n    log2 n   n log2 n   n^2    n^3
1     1    1    0        0          1      1
2     1    2    1        2          4      8
4     1    4    2        8          16     64
8     1    8    3        24         64     512
16    1    16   4        64         256    4096
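The table can be reproduced mechanically; a minimal C sketch
(added here for illustration) that regenerates the rows:

#include <stdio.h>
#include <math.h>

/* Regenerates the growth table above; compile with -lm. */
int main(void) {
    printf("%5s %3s %5s %7s %9s %7s %7s\n",
           "n", "1", "n", "log n", "n log n", "n^2", "n^3");
    for (long n = 1; n <= 16; n *= 2) {
        long lg = (long)(log2((double)n) + 0.5);  /* exact for powers of 2 */
        printf("%5ld %3d %5ld %7ld %9ld %7ld %7ld\n",
               n, 1, n, lg, n * lg, n * n, n * n * n);
    }
    return 0;
}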
Listed from slowest to fastest growth:
• 1
• log n
• n
• n log n
• n^2
• n^3
• 2^n
• n!
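To see why the ordering matters in practice: for n = 1,000 we get
log2 n ≈ 10, n log2 n ≈ 10,000, n^2 = 1,000,000 and n^3 = 10^9,
while 2^n already has over 300 digits and n! is larger still.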
Linear-time algorithm:

for (i = 0; i < n; i++)     // condition tested n+1 times
    { statement(s) }        // body executed n times

f(n) = (n + 1) + n = 2n + 1, which is O(n).
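A runnable check of this count (an added sketch; the counters
tests and body are illustrative names):

#include <stdio.h>

/* Counts the statements executed by the loop analysed above:
   the condition is tested n+1 times, the body runs n times. */
int main(void) {
    long n = 10;
    long tests = 0, body = 0;

    for (long i = 0; (tests++, i < n); i++)
        body++;                          /* the loop body */

    printf("tests = %ld, body = %ld, f(n) = %ld = 2n + 1\n",
           tests, body, tests + body);   /* 11, 10, 21 */
    return 0;
}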
Given the expressions for the best case, average case and worst
case, for each case we need to identify the upper bound, the
lower bound and the tight bound, using:
• Big O notation
• Big Omega notation
• Big Theta notation
Big-Oh notation
• Big-O notation, written O(g(n)), expresses the order of
magnitude of an algorithm.
• Big-O notation is a way of ranking how much time it takes
for an algorithm to execute.
• How many operations will be done when the program is
executed?
• Big-O notation is concerned with what happens for a large
number of elements - the asymptotic order.
• Big-O notation provides an asymptotic upper bound for f(n).
• This means f(n) can do better, but not worse, than the
specified value. Here f(n) is the number of statements
executed in the program for n data elements.
Big-Oh notation
By definition, given two functions f(n) and g(n) of a positive
integer n, f(n) = O(g(n)) iff there exist positive constants c
and n0 such that f(n) <= c·g(n) for all integers n >= n0.
Hence g provides an upper bound. The constant c absorbs factors
such as:
• programming language used
• quality of compiler or interpreter
• CPU speed
• size of main memory and
• the algorithm itself.
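For instance (an added illustration): f(n) = 3n + 2 is O(n),
since 3n + 2 <= 4n for all n >= 2; here c = 4 and n0 = 2.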
Big-Omega notation
By definition, given two functions f(n) and g(n) of a positive
integer n, f(n) = Ω(g(n)) iff there exist positive constants c
and n0 such that f(n) >= c·g(n) for all integers n >= n0.
Hence g provides a lower bound. The constant c absorbs the same
factors:
• programming language used
• quality of compiler or interpreter
• CPU speed
• size of main memory and
• the algorithm itself.
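For instance (an added illustration): 3n + 2 = Ω(n), since
3n + 2 >= 3n for all n >= 1; here c = 3 and n0 = 1.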
Big-Theta notation
By definition, f(n) = Θ(g(n)) iff there exist positive constants
c1, c2 and n0 such that c1·g(n) <= f(n) <= c2·g(n) for all
integers n >= n0. Hence g provides a tight bound: f(n) is both
O(g(n)) and Ω(g(n)).
How do we find the upper bound, lower bound and tight bound of
a given f(n)?
Example: find the upper bound, tight bound and lower bound of
f(n) = 2n + 3.

For the upper bound, by definition f(n) <= c·g(n).
Here assume c·g(n) = 5n. Then 2n + 3 <= 5n for all n >= 1, so
c = 5 and g(n) = n, giving f(n) = O(n). (A looser bound such as
2n + 3 <= 3n^2 for n >= 2 also satisfies the definition, but O(n)
is the closest upper bound.)

For the lower bound, by definition f(n) >= c·g(n).
Here assume c·g(n) = 2n.
(Note: keep the coefficient of the leading term the same for the
lower bound.)
Now, putting in the values of f(n) and c·g(n), we get
2n + 3 >= 2n, so c = 2 and g(n) = n, giving f(n) = Ω(n).
For n = 1: 5 >= 2, true.

Since f(n) is both O(n) and Ω(n) with g(n) = n, the tight bound
is f(n) = Θ(n).
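The constants derived above (c = 2 for the lower bound, c = 5 for
the upper bound, n0 = 1) can be spot-checked numerically; a small
C sketch added for illustration:

#include <stdio.h>

/* Spot-checks 2n <= 2n + 3 <= 5n for n >= 1, i.e. the constants
   c = 2 (lower) and c = 5 (upper) chosen in the example above. */
int main(void) {
    for (long n = 1; n <= 1000000; n *= 10) {
        long f = 2 * n + 3;
        printf("n = %7ld: 2n = %7ld <= f(n) = %7ld <= 5n = %7ld  %s\n",
               n, 2 * n, f, 5 * n,
               (2 * n <= f && f <= 5 * n) ? "ok" : "FAIL");
    }
    return 0;
}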
Methods for solving recurrence relations:
• Master Method
• Iteration Method
• Recursion Tree Method
Master Method
• The problem is divided into a subproblems, each of size n/b,
and f(n) work is needed to break up the problem and combine the
solutions.
• We can apply this method if the recurrence is of the form
T(n) = aT(n/b) + f(n), where a >= 1, b > 1 and f(n) >= 0.
• There are three cases, comparing f(n) against n^(log_b a):
1. If f(n) = O(n^(log_b a - ε)) for some ε > 0, then
T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
3. If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and
a·f(n/b) <= c·f(n) for some c < 1 and all large n, then
T(n) = Θ(f(n)).
• In the following recurrences, f(n) differs from n^(log_b a) by
only a logarithmic factor, so none of the three basic cases
applies directly (a contrasting worked instance follows the list):
• T(n) = 2T(n/2) + n log n
• T(n) = 4T(n/2) + n^2 log n
• T(n) = 2T(n/2) + n / log n
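By contrast, a standard recurrence such as merge sort's
T(n) = 2T(n/2) + n fits case 2 directly: a = 2, b = 2,
n^(log_b a) = n^(log_2 2) = n, and f(n) = n = Θ(n), so
T(n) = Θ(n log n).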
Recursion Tree Method
• We sum the costs within each level of the tree to obtain a set
of per-level costs, and then sum all the per-level costs to
determine the total cost of all levels of the recursion.
Example: T(n) = 2T(n/2) + n

Level 0: one subproblem of size n                 cost n
Level 1: 2 subproblems of size n/2                cost 2·(n/2) = n
Level 2: 4 subproblems of size n/4                cost 4·(n/4) = n
...
Level k: 2^k subproblems of size n/2^k            cost n

The leaves are reached when n/2^k = 1, i.e. 2^k = n, i.e.
k = log2 n.
Total time = kn = n log2 n
T(n) = Θ(n log2 n)
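The same recurrence can be evaluated directly to confirm the
bound; a C sketch added for illustration, assuming the base case
T(1) = 1:

#include <stdio.h>
#include <math.h>

/* Evaluates T(n) = 2T(n/2) + n for powers of two and compares
   with n log2 n; compile with -lm. */
long T(long n) {
    if (n <= 1) return 1;        /* assumed base case */
    return 2 * T(n / 2) + n;
}

int main(void) {
    for (long n = 2; n <= (1L << 20); n *= 2)
        printf("n = %8ld  T(n) = %10ld  n log2 n = %10.0f\n",
               n, T(n), n * log2((double)n));
    return 0;
}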
Example 1
Consider T(n) = T(n/3) + T(2n/3) + n. We have to obtain the
asymptotic bound using the recursion tree method.

Level 0: n                                        cost n
Level 1: n/3, 2n/3                                cost n
Level 2: n/9, 2n/9, 2n/9, 4n/9                    cost n
...

Every complete level again costs n. The shortest root-to-leaf
path repeatedly takes the n/3 branch and ends when n/3^k = 1,
i.e. 3^k = n, i.e. k = log3 n; counting only the complete levels
gives at least n log3 n, so T(n) = Ω(n log3 n).
The longest path repeatedly takes the 2n/3 branch and ends when
n/(3/2)^k = 1, i.e. (3/2)^k = n, i.e. k = log3/2 n.
Max. total time = kn = n log3/2 n, so T(n) = O(n log3/2 n).
Since log3 n and log3/2 n differ only by constant factors,
T(n) = Θ(n log n).
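This conclusion can also be checked numerically; a C sketch added
for illustration, using integer division and the assumed base
case T(1) = 1, prints a ratio that settles near a constant:

#include <stdio.h>
#include <math.h>

/* Evaluates T(n) = T(n/3) + T(2n/3) + n; the ratio to n log2 n
   staying near a constant is consistent with Θ(n log n).
   Compile with -lm. */
long T(long n) {
    if (n <= 1) return 1;        /* assumed base case */
    return T(n / 3) + T(2 * n / 3) + n;
}

int main(void) {
    for (long n = 10; n <= 1000000; n *= 10) {
        long t = T(n);
        printf("n = %8ld  T(n) = %10ld  ratio = %.3f\n",
               n, t, t / (n * log2((double)n)));
    }
    return 0;
}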
Master Theorem
Let T(n) = aT(n/b) + f(n), where a >= 1 and b > 1 are constants
and f(n) is a function. The recurrence can be interpreted as: a
problem of size n is divided into a subproblems of size n/b each,
and f(n) is the cost of dividing the problem and combining the
results of the subproblems.
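Two added illustrations of the remaining cases:
T(n) = 8T(n/2) + n^2 has n^(log_2 8) = n^3 and
f(n) = n^2 = O(n^(3-ε)), so case 1 gives T(n) = Θ(n^3);
T(n) = 2T(n/2) + n^2 has f(n) = n^2 = Ω(n^(1+ε)), and the
regularity condition holds since 2·(n/2)^2 = n^2/2 <= (1/2)·n^2,
so case 3 gives T(n) = Θ(n^2).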