
Algorithm-Analysis-Module-1-Important-Topics
For more notes visit

https://rtpnotes.vercel.app

Algorithm-Analysis-Module-1-Important-Topics
1. Methods to solve Recurrence Equation
2. Iteration Method
Example 1
Example 2
Example 3
3. Recursion Tree method
Example Problem
Tree for Equation 1
Tree for Equation 2
Tree for Equation 3
Merging the 3 trees together

4. Substitution Method
How to do the substitution method?
Example
Step 1: Guess the form of the answer
Step 2
Step 3
Step 4
Step 5

5. Master theorem
What is Master's Theorem?
Statement
Master theorem Example 1
Master theorem Example 2
Master theorem Example 3
6. Asymptotic Notations
Theta Notation
Definition
Graph
Problems based on Theta Notations
Problem 1
Problem 2
Problem 3
Problem 4

Big Oh Notation
Definition
Graph
Problems based on Big Oh Notations
Problem 1
Problem 2

Big Omega
Definition
Problems based on Big Omega
Problem 1

Little oh notation
Definition
Problems based on Little oh notation
Problem 1
Problem 2

Little Omega Notation


Definition
Problems based on Little Omega
Problem 1

7. Time complexity, Space complexity


Space Complexity
Example
Time Complexity
8. Analysis of algorithms
Linear Search
Algorithm
Best input
Worst case input
Average Case Input

Insertion sort
Algorithm
Total time taken for execution of insertion sort
Best case analysis of insertion sort
Worst case
Average case

9. Best, Worst and Average Case Complexities


Example: Linear search

1. Methods to solve Recurrence Equation


If the function on the left side occurs on the right side with a smaller argument, then it is a
recurrence equation.
For example

The methods to solve recurrence equation are


Iteration Method
Recursion Tree Method
Substitution method
Master's theorem

2. Iteration Method
Example 1

Consider the following Recurrence Relation


T (n) = 1 + T (n − 1)

We have to solve this equation using iterative method


Find T (n − 1) , T (n − 2) etc...
T (n) = 1 + [1 + T (n − 2)] = 2 + T (n − 2)

T (n) = 2 + [1 + T (n − 3)] = 3 + T (n − 3)

Repeat this k times


T (n) = k + T (n − k)

Let's assume n − k = 1, so k = n − 1
T (n) = n − 1 + T (1)

Expression in terms of Time Complexity


T (n) = n − 1 + T (1) = O(n) + O(1)
The time complexity is O(n)
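The unrolling above can be checked numerically. The sketch below assumes a base case T(1) = 1, which the notes leave unspecified.

```python
# Evaluate T(n) = 1 + T(n - 1) by direct unrolling and compare with the
# closed form T(n) = (n - 1) + T(1). Base case T(1) = 1 is an assumption.

def T(n):
    total = 1              # T(1), assumed base case
    for _ in range(n - 1):
        total += 1         # one unit of work per unrolling step
    return total

for n in (1, 5, 100):
    assert T(n) == (n - 1) + 1   # matches the closed form, i.e. O(n)
```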

Example 2

Repeating k times
Subbing these values, we get

Example 3

3. Recursion Tree method


Example Problem
Lets go over the equations one by one

Tree for Equation 1

Root node is the LHS, i.e. T(n)


From the equation we can understand
Time Complexity
Total input size
No of subproblems/divisions
Size of division
Work incurred in division and getting final solution

No of Child Nodes = No of Problem divisions


From the above equation, No of problem divisions = 2
So we need to create 2 child nodes
Let's take 2 child nodes
The tree can also be expressed in terms of cost

Tree for Equation 2

The given equation

No of divisions = 2
Size of division = n/4
Cost = Cn/2
Input size = n/2

Tree for Equation 3

No of divisions = 2
Size of division = n/8
Cost = cn/4

Merging the 3 trees together


The height of this tree can be calculated by the formula

No of nodes at the last level

We can calculate the total cost by


Total cost = (log n + 1) × cn
= cn log n + cn
Ignoring the lower-order cn term and the constant c, the complexity is O(n log n)

4. Substitution Method
How to do the substitution method?

1. Guess the form of the solution


2. Prove that the guess is correct by performing a valid substitution
3. To guess a solution we can use Recursion tree method or iteration method
4. To prove the validity of the guess use induction proof method

Example

Step 1: Guess the form of the answer


Let's draw the tree

Now to calculate the complexity, we have to use this formula


T(n) = (number of levels) × cn
= (count of levels 0, 1, 2, ..., log n) × cn
log n is taken because we know that
the height of the tree is log n

= (log n + 1) × cn
= cn log n + cn
= O(n log n)

Step 2

From the above, our guess for the complexity is

Step 3

From the above guess, write the form of equation for T(n/2)

Step 4
Substitute the previous equation into equation 1 (Step 1)
Equation 1

Equation 2

Subbing 2 in 1

Ignoring 2

Step 5

We are getting T(n) <= cn log n


Now we have to verify the base cases T(1), T(2), T(3)
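As a sanity check, the bound can be tested numerically. The sketch below assumes the recurrence in this example is T(n) = 2T(n/2) + n with T(1) = 1 (the notes' equation is in a figure), and takes c = 2 as the guessed constant.

```python
import math

def T(n):
    """Recurrence assumed for this example: T(n) = 2T(n/2) + n, T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Check the guess T(n) <= c * n * log2(n) for powers of two, with c = 2.
c = 2
for k in range(1, 12):
    n = 2 ** k
    assert T(n) <= c * n * math.log2(n)
```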
5. Master theorem
What is Master's Theorem?

It's a cookbook method


It's associated with the divide and conquer paradigm (DAC)

General form of DAC is

f(n) is the work involved in splitting the problem and combining the results

Statement
Where n / b is either a floor or ceil operation

T(n) can be the following, based on the conditions (compare f(n) with n^(log_b a)):

1. Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a))
2. Case 2: if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n)
3. Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and a·f(n/b) <= c·f(n) for some c < 1 (the regularity condition), then T(n) = Θ(f(n))

Master theorem Example 1

First let's write the general form: T(n) = aT(n/b) + f(n)

Let's compare it with the question
a = 9
b = 3
f(n) = n

Now let's find n^(log_b a)

n^(log_b a) = n^(log_3 9) = n^2

Let's compare f(n) and n^(log_b a)

f(n) is smaller → Case 1
Both are equal → Case 2
f(n) is larger → Case 3

In this case n < n^2
Which means it's Case 1, f(n) is smaller

Applying Case 1 of the master theorem: T(n) = Θ(n^(log_3 9)) = Θ(n^2)
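The comparison above can be expressed as a small helper. This is a sketch restricted to driving functions of the form f(n) = n^d, so comparing f(n) with n^(log_b a) reduces to comparing exponents (an assumption; the full Case 3 also needs the regularity condition, which Example 3 checks).

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + n^d against n^(log_b a)."""
    crit = math.log(a, b)  # exponent of n^(log_b a)
    if d < crit:
        return 1           # T(n) = Theta(n^(log_b a))
    if d == crit:
        return 2           # T(n) = Theta(n^d * log n)
    return 3               # T(n) = Theta(n^d), regularity assumed to hold

# Example 1 above: T(n) = 9T(n/3) + n -> log_3 9 = 2, f(n) = n is smaller
assert master_case(9, 3, 1) == 1   # case 1, so T(n) = Theta(n^2)
```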

Master theorem Example 2

First let's write the general form: T(n) = aT(n/b) + f(n)

Let's compare it with the question
a = 1
b = 3/2
f(n) = 1

Now let's find n^(log_b a)

n^(log_{3/2} 1) = n^0 = 1

f(n) and n^(log_b a) are equal (= 1)
Applying Case 2 of the master theorem: T(n) = Θ(n^0 log n) = Θ(log n)

Master theorem Example 3

First let's write the general form: T(n) = aT(n/b) + f(n)

Let's compare it with the question
a = 3
b = 4
f(n) = n log n

Now let's find n^(log_b a)

n^(log_4 3) ≈ n^0.79

Here f(n) is larger than n^(log_b a)

Checking Case 3 of the master theorem

By the condition, we need to add an epsilon and verify: f(n) = n log n = Ω(n^(log_4 3 + ε)) holds for a small ε > 0

We have another condition (regularity): a·f(n/b) <= c·f(n)
Here 3·(n/4)·log(n/4) <= (3/4)·n·log n, so c = 3/4

The c value (3/4) is less than 1, which satisfies Case 3, so T(n) = Θ(n log n)


6. Asymptotic Notations
Theta Notation

Let's consider 2 functions


f(n) and g(n)
The argument n is the input size
n0 → the input size boundary

Definition

θ(g(n)) = {f | f is a non negative function and there exist positive constants C1, C2 and n0, such that
0 <= C1 g(n) <= f(n) <= C2 g(n) ∀ n >= n0
}
Graph
Problems based on Theta Notations

Problem 1

Prove that f(n) =10n^3 + 5n^2+17 ∈ θ(n^3)

1. First write the below statement


1. C1 g(n) <= f(n) <= C2 g(n)
2. Our task is to find C1,C2 and n0
3. We know that 10n^3 <=f(n)
4. So we can equate it to C1 g(n)
1. So we get C1= 10
5. Similarly for f(n) <=C2 g(n)
1. Give the highest degree to all terms
2. 10n^3+5n^2+17 -> 10n^3+5n^3+17n^3
3. After adding we get 32n^3
4. So C2= 32
6. Set n0 = 1
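The constants found above can be verified directly; a minimal check over a range of n:

```python
# Verify C1 = 10, C2 = 32, n0 = 1 satisfy C1*n^3 <= f(n) <= C2*n^3
# for f(n) = 10n^3 + 5n^2 + 17.

def f(n):
    return 10 * n**3 + 5 * n**2 + 17

C1, C2, n0 = 10, 32, 1
for n in range(n0, 1000):
    assert C1 * n**3 <= f(n) <= C2 * n**3
```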

Problem 2

Prove that f(n) =2n^3 + 3n+79 ∈ θ(n^3)

1. First write the below statement


1. C1 g(n) <= f(n) <= C2 g(n)
2. Our task is to find C1,C2 and n0
3. We know that 2n^3 <=f(n)
4. So we can equate it to C1 g(n)
1. So we get C1= 2
5. Similarly for f(n) <=C2 g(n)
1. Give the highest degree to all terms
2. 2n^3+3n+79 -> 2n^3+3n^3+79n^3
3. After adding we get 84n^3
4. So C2= 84
6. Set n0 = 1

Problem 3

Prove that f(n) =10n^3 + nlogn ∈ θ(n^3)

1. First write the below statement


1. C1 g(n) <= f(n) <= C2 g(n)
2. Our task is to find C1,C2 and n0
3. We know that 10n^3 <=f(n)
4. So we can equate it to C1 g(n)
1. So we get C1= 10
5. Similarly for f(n) <=C2 g(n)
1. Give the highest degree to all terms
1. nlogn -> n^3
2. 10n^3 + nlogn -> 10n^3 + n^3
3. After adding we get 11n^3
4. So C2= 11
6. Set n0 = 1

Problem 4

1. Get the C1 value
2. Get the C2 value
   1. Don't consider the 20n^2 term

Big Oh Notation

Let's consider 2 functions


f(n) and g(n)
The argument n is the input size
n0 → the input size boundary

Definition

O(g(n)) = {f | f is a non negative function and there exist positive constants C2 and n0, such that
0 <= f(n) <= C2 g(n) ∀ n >= n0
}

Graph

Problems based on Big Oh Notations

Problem 1

3n^2 ∈ O(n^2)

Use the equation


1. f(n) <=C2 g(n)
2. 3n^2 <=C2 g(n)
3. C2 = 3

Problem 2

100n^2 + 20n+ 5 ∈ O(n^2)


1. f(n) <=C2 g(n)
2. 100n^2 + 20 n^2 + 5 n^2<=C2 g(n)
1. Apply highest degree everywhere
3. 125 n^2 = C2 g(n)
4. C2 = 125

Big Omega

Definition

Ω(g(n)) = {f | f is a non negative function and there exist positive constants C1 and n0, such that
0 <= C1 g(n) <= f(n) ∀ n >= n0
}

Problems based on Big Omega

Problem 1

100n^2 + 20n + 5 ∈ Ω(n^2)

Use the equation


1. C1 g(n) <= f(n)
1. 100n^2 <=f(n)
2. 100n^2 = C1g(n)
3. C1 = 100

Little oh notation
Here the upper bound is not tight as in Big Oh
In Big Oh notation the upper bound need only hold for some constant C

Whereas in little oh notation it must hold for every constant C > 0

Definition

Problems based on Little oh notation

Problem 1

Let f(n) = 5n
Let g(n) = n^2

We need to use this formula here: f ∈ o(g) if lim (n→∞) f(n)/g(n) = 0

When we apply this formula, we get lim (n→∞) 5n / n^2

Cancelling the n and dropping the coefficient, this leaves lim (n→∞) 1/n

Applying the limit, 1/n → 0, so 5n ∈ o(n^2)
Problem 2

We apply the same formula as before


But we will get ∞/∞, which is indeterminate

So we have to use L'Hôpital's rule

When applying L'Hôpital's rule


Little Omega Notation

Definition

Problems based on Little Omega

For Little Omega problems we need to use this formula: f ∈ ω(g) if lim (n→∞) f(n)/g(n) = ∞

Problem 1
Apply the formula, and we will get infinity, which is what we need.

7. Time complexity, Space complexity


Space Complexity

The space complexity of an algorithm is the amount of memory it needs to run to
completion
Space complexity = Fixed part + Variable part
Fixed part
Independent of the characteristics of inputs and outputs
Variable part
Dependent on the characteristics of inputs and outputs

Example

Algorithm mAdd(m, n, a, b, c)
{
    for i := 1 to m do
        for j := 1 to n do
            c[i,j] := a[i,j] + b[i,j]
}

Space Complexity = Space for parameters and space for local variables

m -> 1
n -> 1
a[] -> mn
b[] -> mn
c[] -> mn
i -> 1
j -> 1
Here the space complexity = 3mn + 4
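The pseudocode above translates directly to Python; a sketch (the m × n result matrix c accounts for one of the mn terms in the variable part):

```python
def m_add(a, b, m, n):
    """Add two m x n matrices elementwise; the result uses m*n extra cells,
    which belongs to the variable part of the space complexity."""
    return [[a[i][j] + b[i][j] for j in range(n)] for i in range(m)]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
assert m_add(a, b, 2, 2) == [[6, 8], [10, 12]]
```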

Time Complexity

The time complexity of an algorithm is the amount of computer time it needs to run to
completion
Compilation time is excluded

8. Analysis of algorithms
Linear Search

Algorithm

Note

Except do-while, all other loops execute n+1 times (the extra iteration is the final condition check that fails)

1. n <- length(A)          -- 1
2. for i <- 1 to n do      -- n+1
   1. if A[i] = key then   -- n
      1. found, break      -- 1
3. end for                 -- 1

Best input

When Key (The value we need to search) is the first element


No of comparisons required = 1
Best case complexity O(1)

Worst case input

Key is the last element in list, or not found in list


No of comparisons = n
Worst case complexity O(n)

Average Case Input

All positions have equal probability (1/n each)


Involves the probability distribution of the underlying input
To find the average case complexity, sum probability × comparisons over all positions:
= 1/n × 1 + 1/n × 2 + 1/n × 3 + ... + 1/n × n
= 1/n (1 + 2 + 3 + ... + n)
= 1/n × n(n+1)/2 = n/2 + 1/2
In n/2 + 1/2, the largest term is n, so our complexity is O(n)
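The average of n/2 + 1/2 comparisons can be confirmed by counting; a sketch:

```python
# Count comparisons made by linear search for every possible key position,
# then check the average equals (1 + 2 + ... + n)/n = n/2 + 1/2.

def comparisons(arr, key):
    for count, value in enumerate(arr, start=1):
        if value == key:
            return count           # best case: count = 1, worst: count = n
    return len(arr)                # key absent: still n comparisons

n = 100
arr = list(range(n))
avg = sum(comparisons(arr, k) for k in arr) / n
assert avg == (n + 1) / 2          # i.e. O(n) on average
```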

Insertion sort

Algorithm

Number  Code                              Cost  Count
1       n <- length(A)                    C1    1
2       for j = 2 to n do                 C2    n
3       key = A[j]                        C3    n − 1
4       i = j − 1                         C4    n − 1
5       while (i > 0 and A[i] > key) do   C5    Σ (j=2..n) t_j
6       A[i+1] = A[i]                     C6    Σ (j=2..n) (t_j − 1)
7       i = i − 1                         C7    Σ (j=2..n) (t_j − 1)
8       A[i+1] = key                      C8    n − 1
9       end for

(t_j is the number of times the while-loop test in line 5 runs for a given j)

Total time taken for execution of insertion sort

Sum of Cost × Count


T(n) = C1 + C2·n + C3(n−1) + C4(n−1) + C5·Σ (j=2..n) t_j + C6·Σ (j=2..n) (t_j − 1) + C7·Σ (j=2..n) (t_j − 1) + C8(n−1)

Taking n as common
= n(C2 + C3 + C4 + C8) + C5·Σ (j=2..n) t_j + (C6 + C7)·Σ (j=2..n) (t_j − 1) + (C1 − C3 − C4 − C8)

On further simplification (in the worst case t_j = j) we get an expression of the form


C·n^2 + C'·n + k
The largest term is n^2
= O(n^2)

Best case analysis of insertion sort

Best case occurs when the input list is already sorted in ascending order
For each iteration of the for loop in step 2, the while condition in step 5 does not
hold true, so statements 6 and 7 never execute (t_j = 1)

Hence the total running time, T(n)

Taking statement 5:


Σ (j=2..n) t_j = Σ (j=2..n) 1 = n − 1

T(n) = an + b
This is a linear polynomial, so the best case is O(n)

Worst case

Worst case happens when the list is sorted in reverse order


Statements 5, 6, 7 execute the maximum number of times (t_j = j)
T(n) = Cn^2 + C'n + C''

Average case

This involves the distribution of the underlying input list.


On average, half of the elements in the sorted portion are greater than the key and are
moved to make room for it
Statement 5 becomes

Σ (j=2..n) t_j with t_j ≈ j/2

Σ (j=2..n) j/2 = 1/2 × (n(n+1)/2 − 1)

which is still quadratic in n
The average case complexity is O(n^2)
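The three cases can be observed by counting the while-loop shifts (the t_j − 1 executions of statements 6 and 7); a sketch:

```python
# Count shifts done by insertion sort: a sorted input does 0 shifts (linear
# total work), a reversed input does n(n-1)/2 shifts (quadratic).

def shifts(arr):
    a, count = list(arr), 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:   # statement 5
            a[i + 1] = a[i]            # statement 6
            i -= 1                     # statement 7
            count += 1
        a[i + 1] = key                 # statement 8
    return count

n = 50
assert shifts(range(n)) == 0                            # best case: sorted
assert shifts(range(n - 1, -1, -1)) == n * (n - 1) // 2  # worst: reversed
```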

9. Best, Worst and Average Case Complexities


Best case: It is the minimum number of steps that can be executed for a given parameter
Worst case: It is the maximum number of steps that can be executed for a given
parameter
Average case: It is the average number of steps that can be executed for a given
parameter

Example: Linear search

Best Case: Search data will be in first location of the array


Worst case: Search data does not exist in the array
Average Case: Search data is in the middle of the array
