Algorithm
Published By:
Physics Wallah
ISBN: 978-93-94342-39-2
Website: www.pw.live
Email: support@pw.live
Rights
All rights are reserved by the publisher. No part of this book may be used or reproduced in any manner
whatsoever without written permission from the author or publisher.
In the interest of the student community:
Circulating soft copies of this book in PDF or any other equivalent format through any social media channel,
email, or any other channel via mobiles, laptops or desktops is a criminal offence. Anybody circulating,
downloading or storing a soft copy of this book on any device is in breach of the Copyright Act. Further,
photocopying this book or any of its material is also illegal. Do not download or forward any such soft-copy
material if you come across it.
Disclaimer
A team of PW experts and faculty members with an understanding of the subject has worked hard on these books.
While the author and publisher have used their best efforts in preparing them and the content has been
checked for accuracy, the book is intended for educational purposes and the author shall not be responsible for
any errors contained in it.
The publication is designed to provide accurate and authoritative information with regard to the subject matter
covered.
This book and the individual contribution contained in it are protected under copyright by the publisher.
(This Module shall only be Used for Educational Purpose.)
INDEX
1. Asymptotic Notation ........................................................................................................... 6.1 – 6.18
1 ASYMPTOTIC NOTATION
Note:
To find the time complexity of an algorithm, locate its loops and consider the dominant (largest) loops.
Space complexity depends on two things: the input size and any extra space used (stack space, linked-list space, etc.).
Worst Case
The input class for which the algorithm does the maximum work and hence takes the maximum time.
Best Case
The input class for which the algorithm does the minimum work and hence takes the minimum time.
Average Case
The average case is computed over all inputs, ranging from the best case to the worst case.
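As a small illustration (a sketch added here; the function linear_search and the sample array are our own, not from the text), linear search shows all three cases: best case when the key is at the first position, worst case when it is absent or at the last position.

    #include <stdio.h>

    /* Returns the index of key in a[0..n-1], or -1 if not present.
       Best case:  key at a[0]         -> 1 comparison  -> O(1)
       Worst case: key absent / at end -> n comparisons -> O(n)
       Average case: key equally likely anywhere -> ~n/2 comparisons -> O(n) */
    int linear_search(const int a[], int n, int key)
    {
        for (int i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }

    int main(void)
    {
        int a[] = {7, 3, 9, 1, 5};
        printf("%d\n", linear_search(a, 5, 7));  /* best case: index 0 */
        printf("%d\n", linear_search(a, 5, 4));  /* worst case: not found, -1 */
        return 0;
    }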
1.7.2 Ω-Notation
f(n) = Ω(g(n))
if f(n) ≥ C·g(n) for some constant C > 0 and all n ≥ n₀.
Example: 3n = Ω(2n)
1.7.3 Θ-Notation
If f(n) ≤ C₂·g(n)
and
f(n) ≥ C₁·g(n)   (for all n ≥ n₀),
∴ f(n) = Θ(g(n))
Example:
f(n) = 2n², g(n) = n + 10
Here f(n) > g(n) for large n,
so f(n) = Ω(g(n)), or equivalently g(n) = O(f(n)).
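For reference, the usual formal definitions behind the three notations above can be written as follows (standard textbook definitions, summarised here in LaTeX; the constants c, c₁, c₂ and the threshold n₀ are the customary ones):

f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ \text{such that}\ 0 \le f(n) \le c\,g(n)\ \ \forall n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ \text{such that}\ f(n) \ge c\,g(n)\ \ \forall n \ge n_0
f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 \ \text{such that}\ c_1\,g(n) \le f(n) \le c_2\,g(n)\ \ \forall n \ge n_0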
= √1 + √2 + √3 + ... + √n
≈ (2/3)·(n^(3/2) − 1)        [bounding the sum by ∫ √x dx]
= (2/3)·n^(3/2) − 2/3
= O(n^1.5)
= O(n·√n)
Example 3. Arrange the following functions in increasing order of growth.
f1 = n log n, f2 = n, f3 = 2^n, f4 = 3^n, f5 = n!, f6 = n^n, f7 = log n, f8 = 100·n log n
→ f7 < f2 < f1 = f8 < f3 < f4 < f5 < f6
(Note: 2^n < 3^n < 4^n < n! < n^n)
Question.
Which of the following are TRUE?
(1) 2^(log₂ n) = O(n²)          TRUE
(2) n²·2^(3·log₂ n) = O(n⁵)     TRUE
(3) 2^n = O(2^(2n))             TRUE
(4) log n = O(log log n)        FALSE
(5) log log n = O(n log n)      TRUE
Solution:
(1) 2^(log₂ n) = n^(log₂ 2)
= n
Since n = O(n²), statement (1) is TRUE.
(2) n²·2^(3·log₂ n) = n²·n^(3·log₂ 2)
= n²·n³
= n⁵
and n⁵ = O(n⁵), so statement (2) is TRUE.
(3) 2^(2n) = 2^n·2^n
Since 2^n ≤ 2^(2n),
2^n = O(2^(2n)), which is TRUE.
Solution.
Here we have 1 multiplication, 1 division and 1 addition,
∴ O(1) [no loops, no recursion]
1.8.2. Iterative Algorithm Analysis
Example 1:
for (i = 1; i <= n; i = i * 2)
    printf("Sushil");
Solution.
i = 1, 2, 2², 2³, ..., 2^k
The loop runs while 2^k ≤ n
⇒ k·log 2 ≤ log n
⇒ k ≤ log n / log 2
∴ k ≤ log₂ n, i.e. k = ⌊log₂ n⌋
So the body executes log₂ n + 1 times and the complexity is O(log₂ n).
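As a quick sanity check (a small test program of our own; n = 1000 is an arbitrary choice), counting the iterations of the i = i*2 loop matches ⌊log₂ n⌋ + 1:

    #include <stdio.h>

    int main(void)
    {
        int n = 1000;
        int count = 0;

        /* Same loop as Example 1: i doubles on every iteration. */
        for (int i = 1; i <= n; i = i * 2)
            count++;

        /* For n = 1000, floor(log2 1000) = 9, so this prints 10. */
        printf("iterations = %d\n", count);
        return 0;
    }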
Example 2:
for (i = 1; i <= n; i = i * 3)
    printf("Aaveg");
Solution.
So this executes log₃ n + 1 times and the complexity is O(log₃ n).
➢ Multiplying: i = 1 → 2 → 4 → 8 → 16 → ... → n takes about log₂ n steps.
   Dividing:   i = n → n/2 → n/4 → n/8 → ... → 1 also takes about log₂ n steps.
Example 3:
for (i = 1; i <= n; i++)
{
    for (j = 1; j <= 10; j++)
    {
        printf("Dhananjay");
    }
}
Solution.
This executes 10·n times, so the complexity is O(n).
Example 4:
for (i = 1; i <= n; i = i * 3)
    for (j = 1; j <= n; j++)
        printf("Prapti");
Solution.
The inner loop runs n times for each of the (log₃ n + 1) iterations of the outer loop,
so in total it executes n·(log₃ n + 1) times and the complexity = O(n·log₃ n).
T(n) = C;              n = 0   (constant)
T(n) = T(n − 1) + C;   n > 0
Example 2:
void fun(int n)                              // T(n)
{
    if (n > 0)                               // C1 time
    {
        for (int i = 1; i <= n; i = i + 1)   // n times
            printf("Hello");
        fun(n - 1);                          // T(n - 1)
    }
}
Solution.
T(n) = C1 + n + T(n − 1)
= T(n − 1) + n;    n > 0
T(0) = C;          n = 0
Example 3:
void fun(int n)                              // T(n)
{
    if (n > 0)                               // C1
    {
        for (int i = 1; i <= n; i = i * 2)   // log₂ n times
            printf("Divyajyoti");
        fun(n - 1);                          // T(n - 1)
    }
}
T(n) = T(n − 1) + C
= [T(n − 2) + C] + C
T(n) = T(n − 2) + 2C
= T(n − 3) + 3C
...
T(n) = T(n − k) + k·C
Put n − k = 1:
T(n) = T(1) + (n − 1)·C
= C + (n − 1)·C
T(n) = O(n)
Example (2)
T(n) = T(n − 1) + C·n,   T(1) = C
Solution.
T(n) = T(n − 1) + C·n
= [T(n − 2) + C·(n − 1)] + C·n
= [T(n − 3) + C·(n − 2)] + C·(n − 1) + C·n
= T(n − 3) + (n − 2)·C + (n − 1)·C + n·C
...
= T(n − k) + C·(n − k + 1) + C·(n − k + 2) + ... + C·(n − k + k)
Put n − k = 1:
T(n) = T(1) + C·2 + C·3 + C·4 + ... + C·(n − 1) + C·n
= C + 2C + 3C + 4C + ... + (n − 1)·C + n·C
= C·[1 + 2 + 3 + ... + n]
= C·n(n + 1)/2
= O(n²)
Example (3)
T (n) = T (n/2) + C
T (1) = 1
Solution.
T(n) = T(n/2) + C
= [T(n/2²) + C] + C
= T(n/2²) + 2C
= T(n/2³) + 3C
...
T(n) = T(n/2^k) + k·C
Put n/2^k = 2, i.e. k = log₂ n − 1:
T(n) = T(2) + (log₂ n − 1)·C
= 1 + (log₂ n − 1)·C        (treating T(2) as a constant)
= O(log n)
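Recursive binary search is a standard algorithm whose running time satisfies exactly this recurrence T(n) = T(n/2) + C; the C sketch below is only our illustration (the function name bsearch_rec is our own), not code taken from the text:

    #include <stdio.h>

    /* Search key in the sorted array a[lo..hi].
       Each call does constant work and recurses on half the range,
       so T(n) = T(n/2) + C  =>  O(log n). */
    int bsearch_rec(const int a[], int lo, int hi, int key)
    {
        if (lo > hi)
            return -1;                      /* empty range: constant work */
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key)
            return mid;
        if (a[mid] < key)
            return bsearch_rec(a, mid + 1, hi, key);
        return bsearch_rec(a, lo, mid - 1, key);
    }

    int main(void)
    {
        int a[] = {1, 3, 5, 7, 9, 11};
        printf("%d\n", bsearch_rec(a, 0, 5, 9));   /* prints 4 */
        return 0;
    }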
Example (4)
T(1) = 1,   T(n) = 2T(n/2) + C
Solution.
T(n) = 2[2T(n/2²) + C] + C
= 2²·T(n/2²) + 2C + C
= 2²[2T(n/2³) + C] + 2C + C
= 2³·T(n/2³) + 2²C + 2C + C
...
= 2^k·T(n/2^k) + 2^(k−1)·C + 2^(k−2)·C + ... + 2¹·C + C
Put n/2^k = 1, i.e. n = 2^k:
→ T(n) = n·T(1) + 2^(k−1)·C + 2^(k−2)·C + ... + 2C + C
= 2^k + C·(2^(k−1) + 2^(k−2) + ... + 2¹ + 2⁰)
= 2^k + C·(2^k − 1)/(2 − 1)
= 2^k + C·(2^k − 1)
= 2^k + 2^k·C − C
= n + n·C − C
= O(n)
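A typical algorithm producing the recurrence T(n) = 2T(n/2) + C is finding the maximum of an array by splitting it into two halves; the text does not name a specific algorithm here, so the following C sketch is purely an illustration:

    #include <stdio.h>

    /* Maximum of a[lo..hi] by divide and conquer.
       Two recursive calls on halves plus constant combine work:
       T(n) = 2T(n/2) + C  =>  O(n). */
    int max_rec(const int a[], int lo, int hi)
    {
        if (lo == hi)
            return a[lo];                       /* T(1) = constant */
        int mid = lo + (hi - lo) / 2;
        int left  = max_rec(a, lo, mid);
        int right = max_rec(a, mid + 1, hi);
        return (left > right) ? left : right;   /* constant combine step */
    }

    int main(void)
    {
        int a[] = {4, 9, 2, 7, 6};
        printf("%d\n", max_rec(a, 0, 4));       /* prints 9 */
        return 0;
    }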
If a > b^k, i.e. log_b a > k, then
T(n) = Θ(n^(log_b a))
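Because the solved questions below also use a parameter p, the extended form of the master theorem assumed in this section is stated here for reference (a standard statement, written in LaTeX; it is not reproduced verbatim from the handbook):

T(n) = a\,T\!\left(\tfrac{n}{b}\right) + \Theta\!\left(n^{k}\log^{p} n\right), \qquad a \ge 1,\ b > 1
\begin{aligned}
&\text{Case 1 } (\log_b a > k): && T(n) = \Theta\!\left(n^{\log_b a}\right)\\
&\text{Case 2 } (\log_b a = k): && p > -1 \Rightarrow \Theta\!\left(n^{k}\log^{p+1} n\right),\quad p = -1 \Rightarrow \Theta\!\left(n^{k}\log\log n\right),\quad p < -1 \Rightarrow \Theta\!\left(n^{k}\right)\\
&\text{Case 3 } (\log_b a < k): && p \ge 0 \Rightarrow \Theta\!\left(n^{k}\log^{p} n\right),\quad p < 0 \Rightarrow O\!\left(n^{k}\right)
\end{aligned}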
Question 1. T(n) = 2T(n/2) + n⁰·log n
Solution. a = 2, b = 2, k = 0, p = 1
a > b^k: 2 > 2⁰, i.e. 2 > 1, so Case 1 applies.
T(n) = Θ(n^(log₂ 2)) = Θ(n)
Question 2. T(n) = 2T(n/2) + n
Solution. a = 2, b = 2, k = 1, p = 0
Here log_b a = k, so T(n) = Θ(n^k·log^(p+1) n) = Θ(n·log n)
Question 3. T(n) = 2T(n/2) + n·log n
Solution. a = 2, b = 2, k = 1, p = 1
Here log_b a = k and p > −1, so T(n) = Θ(n^k·log^(p+1) n) = Θ(n·log² n)
Question 4. T(n) = T(n/2) + C
Solution. a = 1, b = 2, k = 0, p = 0
Here log_b a = 0 = k and p > −1, so T(n) = Θ(n⁰·log n) = Θ(log n)
(1) T(n) = 2T(n/2) + C   (recursion tree method)
n/2^k = 1 → n = 2^k → k = log₂ n
Total work done = C + 2C + 2²C + 2³C + ... + 2^k·C
= C·(1 + 2 + 2² + ... + 2^k)
= C·(2^(k+1) − 1)/(2 − 1)
= C·(2·2^k − 1)
= C·(2n − 1)
= O(n)
(2) T(n) = 2T(n/2) + n
n/2^k = 1;  n = 2^k;  k = log₂ n
Each of the k levels contributes n work:
∴ Total work = n + n + n + ... + n = k·n = n·log₂ n
= O(n·log₂ n)
(3) T(n) = 4T(n/2) + n
n = 2^k,  k = log₂ n
Total work = n·(1 + 2 + 2² + 2³ + ... + 2^k)
= n·(2^(k+1) − 1)/(2 − 1)
= n·(2·2^k − 1)
= n·(2n − 1)
= O(n²)
(4) T(n) = T(n/3) + T(2n/3) + n,   T(1) = 1
Along the shortest branch, n/3^i = 1 ⇒ n = 3^i ⇒ i = log₃ n levels;
along the longest branch there are log_(3/2) n levels.
Each complete level contributes n work, so
T(n) ≥ n + n + ... + n  (log₃ n times) = Ω(n·log₃ n), and
T(n) ≤ n + n + ... + n  (log_(3/2) n times) = O(n·log_(3/2) n)
∴ T(n) = Θ(n·log n)
int n, A[n];
Algorithm RSum(A, n)
{
    if (n = 1) return A[1];
    else
        return (A[n] + RSum(A, n − 1));
}
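A runnable C translation of RSum (our own rendering of the pseudocode above; C arrays are 0-based, so a[n − 1] plays the role of A[n]):

    #include <stdio.h>

    /* Recursive sum of a[0..n-1] (n >= 1), mirroring RSum above.
       T(n) = T(n-1) + C  =>  O(n) time, O(n) activation records on the stack. */
    int rsum(const int a[], int n)
    {
        if (n == 1)
            return a[0];                  /* base case: single element */
        return a[n - 1] + rsum(a, n - 1);
    }

    int main(void)
    {
        int a[] = {1, 2, 3, 4, 5};
        printf("%d\n", rsum(a, 5));       /* prints 15 */
        return 0;
    }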
Algorithm A(n)
{
    if (n = 1) return;
    else
    {
        A(n/2);
    }
}
Recurrence relation
T(n) = C;            n = 1
T(n) = T(n/2) + C;   n > 1
Time Complexity = O(log n)
Space Complexity
• Space complexity depends on the number of activation records pushed onto the stack.
Suppose n = 16. The activation records pushed are:
A(16), A(8), A(4), A(2), A(1)
For n = 2^K we push on the order of K activation records.
Space Complexity:
n = 2^K
log n = K·log₂ 2
K = log₂ n
Space Complexity = O(log n)
Example 3
Algorithm A(n)
{
    if (n = 2) return;
    else
        return A(√n);
}
Solution:
T(n) = 1;            n = 2
T(n) = T(√n) + C;    n > 2
Time Complexity = O(log log n)
Space Complexity
Suppose n = 16. The activation records pushed are:
A(16), A(4), A(2)
In general the argument shrinks as n, n^(1/2), n^(1/4), ..., n^(1/2^K), and the recursion stops when
n^(1/2^K) = 2
⇒ (1/2^K)·log₂ n = log₂ 2 = 1
⇒ log₂ n = 2^K
⇒ K = log₂ log₂ n
Space Complexity = O(log log n)
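A small C sketch (our own; it uses sqrt from <math.h> and a double argument purely for illustration, so link with -lm if required) that counts the recursion depth of A(n) and confirms the O(log log n) behaviour:

    #include <stdio.h>
    #include <math.h>

    /* Counts the calls made by A(n): n -> sqrt(n) -> ... until n <= 2.
       T(n) = T(sqrt(n)) + C  =>  O(log log n) calls. */
    int depth(double n)
    {
        if (n <= 2.0)
            return 1;
        return 1 + depth(sqrt(n));
    }

    int main(void)
    {
        /* 65536 = 2^16: log2(log2 65536) = 4, plus the base call -> prints 5. */
        printf("%d\n", depth(65536.0));
        return 0;
    }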
Recurrence Relation:
T(n) = 1;               if n = 1 or n = 2
T(n) = 2T(n/2) + 1;     if n > 2
Time Complexity:
T(n) = O(n)
Space Complexity:
Recurrence relation:
T(n) = 1;              if n = 1
T(n) = T(n/2) + 1;     if n > 1
Time Complexity:
T(n) = T(n/2) + C
T(n) = O(log n)
Space Complexity:
Time Complexity:
Space Complexity:
Moves = m + n (always)
Note:
The best case and worst case arise only in the number of comparisons; they have no effect on the number of
moves (a merge sketch illustrating this follows the notes below).
Note:
• In the GATE exam, if merge sort is given, always consider the out-of-place version.
• If the array size is very large, merge sort is preferable.
• If the array size is very small, then prefer insertion sort.
• Merge sort is a stable sorting technique.
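The merge sketch referred to above (our own C version; it illustrates why the number of moves is always m + n while only the number of comparisons depends on the input):

    #include <stdio.h>

    /* Merge sorted a[0..m-1] and b[0..n-1] into out[0..m+n-1].
       Every element is moved exactly once  ->  m + n moves, always.
       Comparisons vary: best case ~min(m, n), worst case m + n - 1. */
    void merge(const int a[], int m, const int b[], int n, int out[])
    {
        int i = 0, j = 0, k = 0;
        while (i < m && j < n)
            out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];   /* one comparison, one move */
        while (i < m) out[k++] = a[i++];                   /* remaining moves, no comparisons */
        while (j < n) out[k++] = b[j++];
    }

    int main(void)
    {
        int a[] = {1, 4, 9}, b[] = {2, 3, 10, 12}, out[7];
        merge(a, 3, b, 4, out);
        for (int k = 0; k < 7; k++) printf("%d ", out[k]);  /* 1 2 3 4 9 10 12 */
        printf("\n");
        return 0;
    }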
Example 1: In quick sort, for sorting n elements, the (n/16)th smallest element is selected as the pivot. What is
the worst-case time complexity?
Solution.
T(n) = T(n/16) + T(15n/16) + O(n)
= O(n·log n)     (solve by the recursion tree method)
Example 2: The median of n elements can be found in O(n) time. What is the time complexity of the quick sort
algorithm in which the median is selected as the pivot?
Solution.
T(n) = O(n) + C + O(n) + T(n/2) + T(n/2)
= 2T(n/2) + O(n)
= O(n·log n)
For quick sort in the worst case (a badly chosen pivot at every step):
Time Complexity: T(n) = O(n²)
Space Complexity = O(n)
Algorithm       | Method                                                                      | Best       | Average    | Worst      | Stable | In-place
Quick sort      | Choose a pivot element and place it in its correct position                | Θ(n log n) | Θ(n log n) | Θ(n²)      | No     | Yes
Merge sort      | Divide into two equal parts, recursively sort each sub-part and merge them | Θ(n log n) | Θ(n log n) | Θ(n log n) | Yes    | No
Heap sort       | Build a (max) heap, repeatedly delete the max and place it                 | Θ(n log n) | Θ(n log n) | Θ(n log n) | No     | Yes
Selection sort  | Find the position of the min element in the unsorted part [i to n]         | Θ(n²)      | Θ(n²)      | Θ(n²)      | No     | Yes
Insertion sort  | Insert a[i + 1] into its correct position                                  | Θ(n)       | Θ(n²)      | Θ(n²)      | Yes    | Yes
❑❑❑
3 GREEDY TECHNIQUE
3.1 Greedy Technique
• The greedy method is an algorithm design strategy used for solving problems whose solutions are viewed as the
result of making a sequence of decisions.
• A problem may have more than one solution.
3.2 Terminology
Maximum edges = V(V − 1)/2
E ≤ V(V − 1)/2
E ≤ C·V²,  where C is a constant
Note:
E = O(V²)
log E = O(log V)
3.8.2 Graph Representation
• For more edges (dense graph, higher density), the adjacency matrix is better.
• For fewer edges (sparse graph), the adjacency list is better.
Operation                                             | Adjacency Matrix    | Adjacency List
(1) Finding the degree of a vertex                    | O(V), every case    | O(1) best case, O(V − 1) worst case
(2) Finding the total number of edges                 | O(V²), every case   | O(V + 2E), every case
(3) Checking whether 2 vertices are adjacent (or not) | O(1), every case    | O(1) best case, O(V − 1) worst case
• Using adjacency list & unsorted array = O(V²)
3.9.2 Bellman-Ford
• Time Complexity = O(EV)
• If the graph contains a negative edge-weight cycle, then the computed answer is incorrect for some vertices
(shortest distances are not well defined).
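A minimal Bellman-Ford sketch in C (the edge list, struct name and INF handling are our own simplifications), showing the V − 1 relaxation passes behind the O(EV) bound and the extra pass used to detect a negative-weight cycle:

    #include <stdio.h>
    #include <limits.h>

    struct Edge { int u, v, w; };

    #define INF INT_MAX

    /* Relax all E edges V-1 times: O(EV).
       Returns 1 if a negative-weight cycle is reachable (distances unreliable). */
    int bellman_ford(int V, int E, struct Edge ed[], int src, long long dist[])
    {
        for (int i = 0; i < V; i++) dist[i] = INF;
        dist[src] = 0;

        for (int pass = 1; pass <= V - 1; pass++)
            for (int e = 0; e < E; e++)
                if (dist[ed[e].u] != INF && dist[ed[e].u] + ed[e].w < dist[ed[e].v])
                    dist[ed[e].v] = dist[ed[e].u] + ed[e].w;

        /* One more pass: any further improvement means a negative cycle. */
        for (int e = 0; e < E; e++)
            if (dist[ed[e].u] != INF && dist[ed[e].u] + ed[e].w < dist[ed[e].v])
                return 1;
        return 0;
    }

    int main(void)
    {
        struct Edge ed[] = {{0, 1, 4}, {0, 2, 5}, {1, 2, -3}, {2, 3, 2}};
        long long dist[4];
        int neg = bellman_ford(4, 4, ed, 0, dist);
        printf("negative cycle: %d, dist[3] = %lld\n", neg, dist[3]);  /* 0, 3 */
        return 0;
    }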
❑❑❑
4 DYNAMIC PROGRAMMING
4.1 Dynamic Programming
In dynamic programming, each distinct subproblem (function call) is computed only once and its result is reused
to build the optimal solution.
4.2 Terminology
3. For i ← 0 to n − 1
       For j ← 0 to m − 1
           If (p[i] = q[j]) then
               L[i, j] = 1 + L[i − 1, j − 1];
           else
               L[i, j] = max{L[i, j − 1], L[i − 1, j]}
}
• Time complexity of step 1 = O(n)
• Time complexity of step 2 = O(m)
• Time complexity of step 3 = O(mn)
• Total time complexity = O(n) + O(m) + O(mn) = O(mn)
• Space complexity = O((m + 1)·(n + 1)) = O(mn)
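A compact C implementation corresponding to the tabulation above (a sketch only; it uses 1-based DP indices so that L[i][0] = L[0][j] = 0 covers the base cases, and a fixed MAXN table size for simplicity):

    #include <stdio.h>
    #include <string.h>

    #define MAXN 100

    /* Length of the longest common subsequence of p (length n) and q (length m).
       Filling the (n+1) x (m+1) table takes O(mn) time and O(mn) space. */
    int lcs_length(const char *p, const char *q)
    {
        int n = (int)strlen(p), m = (int)strlen(q);
        int L[MAXN + 1][MAXN + 1];

        for (int i = 0; i <= n; i++)
            for (int j = 0; j <= m; j++) {
                if (i == 0 || j == 0)
                    L[i][j] = 0;                              /* empty prefix */
                else if (p[i - 1] == q[j - 1])
                    L[i][j] = 1 + L[i - 1][j - 1];            /* characters match */
                else
                    L[i][j] = (L[i][j - 1] > L[i - 1][j])     /* take the better drop */
                              ? L[i][j - 1] : L[i - 1][j];
            }
        return L[n][m];
    }

    int main(void)
    {
        printf("%d\n", lcs_length("ABCBDAB", "BDCABA"));   /* prints 4 (e.g. BCBA) */
        return 0;
    }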