algorithms assignment 1
2. Algorithm Specification
o Specify the algorithm using pseudocode, flowcharts, or programming
languages.
3. Correctness Verification
o Ensure the algorithm correctly solves the problem for all inputs.
4. Performance Analysis
o Evaluate time complexity (speed) and space complexity (memory
usage).
5. Implementation & Testing
o Code the algorithm and test it with different inputs.
For example, an array of 5 elements has input size 5.
Input size can be measured in terms of:
The number of elements in the input (e.g., the size of an array, or the position of the searched element).
Common orders of growth include:
O(1) – Constant time (e.g., accessing an array element).
O(log n) – Logarithmic time (e.g., binary search).
O(n) – Linear time (e.g., sequential search).
O(n log n) – Linearithmic time (e.g., efficient sorting such as merge sort).
O(n²) – Quadratic time (e.g., simple sorting).
O(2ⁿ) – Exponential time (e.g., recursive computation of Fibonacci numbers).
Asymptotic Notations
Asymptotic notations describe the efficiency of an algorithm in terms of time
and space complexity as the input size n grows. The three main notations are:
1. Big-O Notation (O)
Represents the upper bound of an algorithm’s running time.
It defines the worst-case complexity.
A function t(n) ∈ O(g(n)) if there exist positive constants c and n₀ such that:
t(n) ≤ c · g(n) for all n ≥ n₀
Example: 3n + 2 ∈ O(n), since 3n + 2 ≤ 4n for all n ≥ 2.
2. Big-Omega Notation (Ω)
Represents the lower bound of an algorithm’s running time.
It defines the best-case complexity.
A function t(n) ∈ Ω(g(n)) if there exist positive constants c and n₀ such that:
t(n) ≥ c · g(n) for all n ≥ n₀
Example: n² ∈ Ω(n), since n² ≥ n for all n ≥ 1.
3. Big-Theta Notation (Θ)
Represents the tight bound of an algorithm’s running time (both upper and lower bound).
A function t(n) ∈ Θ(g(n)) if there exist positive constants c₁, c₂ and n₀ such that:
c₁ · g(n) ≤ t(n) ≤ c₂ · g(n) for all n ≥ n₀
Example: 3n + 2 ∈ Θ(n).
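The constants c and n₀ in these definitions can be checked numerically. A small Python sketch (function names are illustrative; it only tests a finite range, so it is a sanity check, not a proof):

def satisfies_big_o(t, g, c, n0, n_max=1000):
    """Check t(n) <= c * g(n) for all n in [n0, n_max] (finite sanity check only)."""
    return all(t(n) <= c * g(n) for n in range(n0, n_max + 1))


if __name__ == "__main__":
    # 3n + 2 is in O(n): the constants c = 4 and n0 = 2 witness the definition.
    print(satisfies_big_o(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2))  # True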
ALGORITHM Factorial(n)
// INPUT:
// n → A non-negative integer
// OUTPUT:
// Returns the factorial of `n`
if n = 0 then
    return 1 // Base case: 0! = 1
else
    return Factorial(n - 1) * n // Recursive case: n! = (n - 1)! * n
Time Complexity Analysis:
The basic operation is multiplication.
The recurrence relation is: T(n) = T(n−1) + O(1)
Expanding it:
T(n) = T(n−1) + 1 = T(n−2) + 2 = ... = T(1) + (n−1)
This simplifies to T(n) = O(n).
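The pseudocode above translates directly into code. A minimal Python sketch (the function name is an illustrative choice):

def factorial(n: int) -> int:
    """Return n! for a non-negative integer n, recursively as in the pseudocode."""
    if n == 0:
        return 1                      # Base case: 0! = 1
    return factorial(n - 1) * n       # Recursive case: n! = (n - 1)! * n


if __name__ == "__main__":
    print(factorial(5))   # 120; one multiplication per level, so T(n) = O(n)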
ALGORITHM Hanoi(n, src, aux, dest)
// INPUT:
// n → Number of disks
// src → Source peg
// aux → Auxiliary peg
// dest → Destination peg
// OUTPUT:
// Sequence of moves to transfer `n` disks from `src` to `dest`
if n = 1 then
    print "Move disk 1 from " src " to " dest
    return
Hanoi(n - 1, src, dest, aux) // Move n-1 disks from src to aux
print "Move disk " n " from " src " to " dest
Hanoi(n - 1, aux, src, dest) // Move n-1 disks from aux to dest
Time Complexity Analysis:
The recurrence relation is: T(n) = 2T(n−1) + O(1)
Expanding it:
T(n) = 2(2T(n−2) + 1) + 1 = 4T(n−2) + 3
T(n) = 8T(n−3) + 7 = ...
T(n) = 2ⁿ − 1
This simplifies to T(n) = O(2^n), indicating exponential complexity.
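As a cross-check of the 2ⁿ − 1 move count, here is a minimal Python sketch of the same recursive procedure (parameter order mirrors the pseudocode above; names are illustrative):

def hanoi(n: int, src: str, aux: str, dest: str, moves: list) -> None:
    """Move n disks from src to dest using aux; record each move."""
    if n == 1:
        moves.append(f"Move disk 1 from {src} to {dest}")
        return
    hanoi(n - 1, src, dest, aux, moves)   # Move n-1 disks from src to aux
    moves.append(f"Move disk {n} from {src} to {dest}")
    hanoi(n - 1, aux, src, dest, moves)   # Move n-1 disks from aux to dest


if __name__ == "__main__":
    moves = []
    hanoi(3, "A", "B", "C", moves)
    print(len(moves))   # 7 = 2^3 - 1, matching T(n) = 2^n - 1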
2. Identify the Basic Operation
o The operation contributing the most to the running time (e.g., a key comparison in searching).
3. Count the Number of Basic Operations
o Identify how many times the basic operation runs.
ALGORITHM MaxElement(A[0..n − 1])
// INPUT:
// A[0..n − 1] → Array of `n` elements
// OUTPUT:
// maxval → Maximum element in the array
maxval ← A[0]
for i ← 1 to n − 1 do
    if A[i] > maxval then
        maxval ← A[i] // Update max if a larger element is found
return maxval
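A direct Python version of this scan, showing the single basic operation (one comparison) per loop iteration; the function name is illustrative:

def max_element(a: list):
    """Return the maximum of a non-empty list using n - 1 comparisons."""
    maxval = a[0]
    for i in range(1, len(a)):
        if a[i] > maxval:      # Basic operation: one comparison per iteration
            maxval = a[i]
    return maxval


if __name__ == "__main__":
    print(max_element([3, 9, 2, 7]))   # 9; the loop runs n - 1 = 3 times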
5. What is the brute force method? Apply selection sort to the word
EXAMPLE to arrange its letters in alphabetical order.
Ans:
1. Brute Force Method (5 Marks)
The brute force method is a straightforward approach to solving a problem by
systematically checking all possible solutions. It is often the simplest but least
efficient method.
1.1 Characteristics of Brute Force Approach
Simple and easy to implement.
No optimization – evaluates all possible outcomes.
High time complexity due to exhaustive calculation.
1.2 Examples of Brute Force Approach in Algorithms
1. Sorting Algorithms: Selection Sort, Bubble Sort.
2. Searching Algorithms: Sequential Search, Brute Force String Matching.
3. Mathematical Computations: Matrix Multiplication, Factorial Calculation.
ALGORITHM SelectionSort(A[0..n − 1])
// INPUT:
// A[0..n − 1] → Array of `n` elements
// OUTPUT:
// A[0..n − 1] → Sorted array in non-decreasing order
for i ← 0 to n − 2 do
    minIndex ← i // Assume the current element is the smallest of the unsorted part
    for j ← i + 1 to n − 1 do
        if A[j] < A[minIndex] then
            minIndex ← j
    swap A[i] and A[minIndex]
Time Complexity: O(n²) – the algorithm always performs n(n − 1)/2 comparisons.
Space Complexity: O(1) – the sort is done in place.
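For the second part of the question, a minimal Python sketch that applies selection sort to the letters of the word EXAMPLE (function and variable names, and the printed trace format, are illustrative):

def selection_sort(a: list) -> list:
    """In-place selection sort, mirroring the pseudocode above."""
    n = len(a)
    for i in range(n - 1):
        min_index = i                      # Assume current element is smallest
        for j in range(i + 1, n):
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]
        print(f"Pass {i + 1}: {''.join(a)}")
    return a


if __name__ == "__main__":
    selection_sort(list("EXAMPLE"))        # Final order: A E E L M P X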
6. With the algorithm, derive the worst-case efficiency of Bubble Sort.
Apply bubble sort on these elements: 45, 20, 40, 5, 15.
Ans:
1. Bubble Sort Algorithm (5 Marks)
Bubble Sort is a brute force sorting algorithm that repeatedly swaps adjacent
elements if they are in the wrong order.
ALGORITHM BubbleSort(A[0..n − 1])
// Sorts a given array A[0..n − 1] using bubble sort
INPUT:
A[0..n − 1] → Array of `n` elements
OUTPUT:
A[0..n − 1] → Sorted array in non-decreasing order
for i ← 0 to n − 2 do
    for j ← 0 to n − 2 − i do
        if A[j + 1] < A[j] then
            swap A[j] and A[j + 1]
Working Mechanism
Pass 1: The largest element bubbles to the last position.
Pass 2: The second largest element settles into the second-to-last position, and so on; after n − 1 passes the array is sorted.
2. Worst-Case Efficiency (5 Marks)
In the worst case (a reverse-sorted array), the inner loop performs
(n − 1) + (n − 2) + ... + 1 = n(n − 1)/2 comparisons, each possibly followed by a swap.
Final Complexity
C_worst(n) = n(n − 1)/2 ∈ O(n²)
Bubble Sort is inefficient for large inputs, making it useful only for small datasets.
3. Applying Bubble Sort to Given List: [45, 20, 40, 5, 15] (5 Marks)
Initial List: [45, 20, 40, 5, 15]
Pass Comparisons & Swaps Result after Pass
1 45↔20, 45↔40, 45↔5, 45↔15 [20, 40, 5, 15, 45]
2 40↔5, 40↔15 [20, 5, 15, 40, 45]
3 20↔5, 20↔15 [5, 15, 20, 40, 45]
4 No swaps [5, 15, 20, 40, 45]
Final Sorted List: [5, 15, 20, 40, 45]
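A short Python sketch that reproduces the pass-by-pass behaviour on the given list (the printout format is illustrative):

def bubble_sort(a: list) -> list:
    """Bubble sort with a printout after each pass, as in the trace above."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j + 1] < a[j]:
                a[j], a[j + 1] = a[j + 1], a[j]   # Swap adjacent out-of-order pair
        print(f"After pass {i + 1}: {a}")
    return a


if __name__ == "__main__":
    bubble_sort([45, 20, 40, 5, 15])   # Ends with [5, 15, 20, 40, 45]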
ALGORITHM SequentialSearch(A[0..n − 1], K)
INPUT:
A[0..n − 1] → Array of `n` elements
K → Search key
OUTPUT:
Index of `K` in `A` OR `-1` if not found
i ← 0
while i < n and A[i] ≠ K do
    i ← i + 1
if i < n then return i
else return −1
Working Mechanism:
1. Start from the first element and check each item sequentially.
2. If a match is found, return the index.
3. If the end of the list is reached, return -1 (not found).
Worst Case: the key K is the last element of the list or is not present in the list.
The loop runs n times, leading to T(n) = O(n).
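A minimal Python sketch of the same search (the function name is illustrative):

def sequential_search(a: list, key) -> int:
    """Return the index of key in a, or -1 if the key is not found."""
    i = 0
    while i < len(a) and a[i] != key:   # Basic operation: one comparison per step
        i += 1
    return i if i < len(a) else -1


if __name__ == "__main__":
    print(sequential_search([45, 20, 40, 5, 15], 5))    # 3
    print(sequential_search([45, 20, 40, 5, 15], 99))   # -1 (worst case: n comparisons)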
ALGORITHM Merge(A, low, mid, high)
// Merges the two sorted subarrays A[low..mid] and A[mid + 1..high]
n1 ← mid - low + 1
n2 ← high - mid
for i ← 0 to n1 - 1 do
    L[i] ← A[low + i]
for j ← 0 to n2 - 1 do
    R[j] ← A[mid + 1 + j]
i ← 0, j ← 0, k ← low
while i < n1 and j < n2 do
    if L[i] ≤ R[j] then
        A[k] ← L[i]
        i ← i + 1
    else
        A[k] ← R[j]
        j ← j + 1
    k ← k + 1
while i < n1 do
    A[k] ← L[i]
    i ← i + 1
    k ← k + 1
while j < n2 do
    A[k] ← R[j]
    j ← j + 1
    k ← k + 1
2.2 Working of Merge Sort
1. Divide: Recursively split the array into two halves.
2. Conquer: Sort both halves using recursive calls.
3. Combine: Merge the sorted halves into a single sorted array.
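A compact Python sketch of the divide / conquer / combine steps described above (written non-in-place for brevity, unlike the index-based pseudocode; names are illustrative):

def merge_sort(a: list) -> list:
    """Return a sorted copy of a using merge sort."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])             # Divide + conquer the left half
    right = merge_sort(a[mid:])            # Divide + conquer the right half
    merged, i, j = [], 0, 0                # Combine: merge the sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])                # Copy any leftovers
    merged.extend(right[j:])
    return merged


if __name__ == "__main__":
    print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]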
ALGORITHM InsertionSort(A[0..n − 1])
for i ← 1 to n - 1 do
    key ← A[i]
    j ← i - 1
    while j ≥ 0 and A[j] > key do
        A[j + 1] ← A[j]
        j ← j - 1
    A[j + 1] ← key
Working Mechanism:
1. Start from the second element (A[1]) and compare it with previous
elements.
2. Shift larger elements one position right to make space for the key
element.
3. Insert the key element in the correct position.
4. Repeat the process for all elements until the array is sorted.
2.2 Worst-Case Complexity (O(n²))
A reverse-ordered list requires the maximum work (every preceding element must be shifted).
Total comparisons and shifts: 1 + 2 + ... + (n − 1) = n(n − 1)/2 ∈ O(n²).
2.3 Average-Case Complexity (O(n²))
Randomly ordered list requires shifting for about half the elements.
3. Applying Insertion Sort on Given List: [89, 45, 68, 90, 29, 34, 17] (5 Marks)
Initial List: [89, 45, 68, 90, 29, 34, 17]
Pass Key Comparisons & Shifts Result after Pass
1 45 89 → Shift [45, 89, 68, 90, 29, 34, 17]
2 68 89 → Shift [45, 68, 89, 90, 29, 34, 17]
3 90 No shift [45, 68, 89, 90, 29, 34, 17]
4 29 90 → 89 → 68 → 45 → Shift [29, 45, 68, 89, 90, 34, 17]
5 34 90 → 89 → 68 → 45 → Shift [29, 34, 45, 68, 89, 90, 17]
6 17 90 → 89 → 68 → 45 → 34 → 29 → Shift [17, 29, 34, 45, 68, 89, 90]
Final Sorted List: [17, 29, 34, 45, 68, 89, 90]
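The same trace can be reproduced with a short Python sketch of insertion sort (names and printout format are illustrative):

def insertion_sort(a: list) -> list:
    """In-place insertion sort with a printout after each pass."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # Shift larger elements one position right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                 # Insert the key in its correct position
        print(f"Pass {i}: {a}")
    return a


if __name__ == "__main__":
    insertion_sort([89, 45, 68, 90, 29, 34, 17])  # Ends with [17, 29, 34, 45, 68, 89, 90]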
10. Design an algorithm for quick sort. Apply quick sort on
these elements: 25, 75, 40, 10, 20, 5, 15.
Ans:
1. Quick Sort Algorithm (5 Marks)
Quick Sort is a Divide and Conquer sorting algorithm that:
1. Picks a pivot element from the array.
2. Partitions the array such that elements smaller than the pivot go left and
larger elements go right.
3. Recursively sorts the left and right subarrays.
ALGORITHM QuickSort(A[low..high])
// Sorts an array A[low..high] using quick sort
if low < high then
    p ← Partition(A, low, high)
    QuickSort(A[low..p − 1])
    QuickSort(A[p + 1..high])

ALGORITHM Partition(A[low..high])
// Partitions A[low..high] around a pivot (the last element)
pivot ← A[high]
i ← low - 1
for j ← low to high - 1 do
    if A[j] < pivot then
        i ← i + 1
        swap(A[i], A[j])
swap(A[i + 1], A[high])
return i + 1 // Final position of the pivot
3. Applying Quick Sort on Given List: [25, 75, 40, 10, 20, 5, 15] (5 Marks)
Step-by-Step Execution
Initial List: [25, 75, 40, 10, 20, 5, 15]
Step Pivot Partitioned List Left Subarray Right Subarray
1 15 [10, 5, 15, 25, 75, 40, 20] [10, 5] [25, 75, 40, 20]
2 5 [5, 10] Sorted Unsorted
3 20 [5, 10, 15, 20, 25, 75, 40] [Sorted] [25, 40, 75]
4 40 [5, 10, 15, 20, 25, 40, 75] [Sorted] [Sorted]
Final Sorted List: [5, 10, 15, 20, 25, 40, 75]
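A Python sketch using the Lomuto partition scheme (last element as pivot, as in the pseudocode); intermediate partitions may differ slightly from the hand trace above depending on the partition variant:

def partition(a: list, low: int, high: int) -> int:
    """Partition a[low..high] around a[high]; return the pivot's final index."""
    pivot = a[high]
    i = low - 1
    for j in range(low, high):
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[high] = a[high], a[i + 1]
    return i + 1


def quick_sort(a: list, low: int, high: int) -> None:
    """Recursively sort a[low..high] in place."""
    if low < high:
        p = partition(a, low, high)
        quick_sort(a, low, p - 1)
        quick_sort(a, p + 1, high)


if __name__ == "__main__":
    data = [25, 75, 40, 10, 20, 5, 15]
    quick_sort(data, 0, len(data) - 1)
    print(data)   # [5, 10, 15, 20, 25, 40, 75]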
Strassen’s algorithm is a divide-and-conquer method for multiplying two n × n matrices.
Strassen’s algorithm reduces the number of multiplications required per recursive step from 8 to 7 (compared with the conventional method).
1.2 Matrix Partitioning
Given two n × n matrices A and B, each is divided into four (n/2) × (n/2) submatrices:
A = [ A₁₁ A₁₂ ; A₂₁ A₂₂ ], B = [ B₁₁ B₁₂ ; B₂₁ B₂₂ ]
where A₁₁, A₁₂, A₂₁, A₂₂ and B₁₁, B₁₂, B₂₁, B₂₂ are the (n/2) × (n/2) blocks of A and B.
The recurrence relation is: T(n) = 7T(n/2) + O(n²), where:
7T(n/2): Recursively multiplying 7 submatrices of size (n/2) × (n/2).
O(n²): The additions and subtractions of submatrices.
Applying the Master Theorem to T(n) = aT(n/b) + O(n^d), where:
a = 7, b = 2, d = 2.
log₂ 7 ≈ 2.81 and d = 2.
Since log₂ 7 > d, the complexity is:
T(n) ∈ Θ(n^(log₂ 7)) ≈ O(n^2.81)
2.3 Comparison with Conventional Matrix Multiplication
Algorithm Multiplications Additions/Subtractions Time Complexity
Standard Method 8 per recursive step (n³ total) O(n²) per step O(n³)
Strassen’s Method 7 per recursive step O(n²) per step O(n^2.81)
Thus, Strassen’s algorithm is asymptotically faster than conventional matrix
multiplication for large n.
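To make the reduction from 8 to 7 multiplications concrete, here is a minimal Python sketch of a single Strassen step on 2 × 2 matrices (in the full algorithm each entry would be an (n/2) × (n/2) block handled recursively; names are illustrative):

def strassen_2x2(A, B):
    """One Strassen step: multiply 2x2 matrices with 7 multiplications."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)   # 7 products M1..M7 replace the usual 8
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    c11 = m1 + m4 - m5 + m7          # Recombine with additions/subtractions only
    c12 = m3 + m5
    c21 = m2 + m4
    c22 = m1 - m2 + m3 + m6
    return [[c11, c12], [c21, c22]]


if __name__ == "__main__":
    print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]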
12. What is a topological sequence? Explain the DFS and source removal
methods with a suitable example to find a topological sequence.
Ans:
1. Topological Sorting (5 Marks)
A topological sequence (or topological ordering) of a Directed Acyclic Graph
(DAG) is a linear ordering of its vertices such that for every directed edge (u →
v), vertex u appears before vertex v in the ordering.
1.1 Properties of Topological Sorting
✅ Only applicable to DAGs (graphs without cycles).
✅ Multiple valid topological orderings may exist.
✅ Used in scheduling tasks, dependency resolution, and precedence
constraints.
1.2 Example DAG & Topological Ordering
Consider a directed graph:
A→B→D
↓ ↓
C→E
A valid topological ordering of vertices:
A, C, B, E, D
2. DFS Method (5 Marks)
2.1 Working Principle
When visiting a node, all its adjacent vertices are recursively visited first.
The node is then pushed onto a stack, ensuring it appears after its dependencies in the final ordering.
2.2 Steps for DFS-based Topological Sorting
1. Start DFS traversal from any unvisited vertex.
2. Visit all adjacent vertices recursively.
3. Push the vertex to a stack once all adjacent vertices are visited.
4. Repeat for all vertices until all are visited.
5. Pop vertices from the stack to get a valid topological order.
2.3 Example
For the given DAG:
A→B→D
↓ ↓
C→E
DFS Visit Order:
1. Start from A → Visit B → Visit D → Push D to stack.
2. Visit E from B → Push E to stack.
3. Push B to stack.
4. Visit C → Push C to stack.
5. Push A to stack.
Final Topological Order (from stack):
A, C, B, E, D
2.4 Time Complexity
DFS traversal takes O(V + E) (where V is vertices and E is edges).
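To complement the hand trace, a minimal Python sketch of DFS-based topological sorting on the example DAG (the adjacency-list encoding and function name are illustrative):

def dfs_topological_sort(graph: dict) -> list:
    """Return one topological order of a DAG using DFS and a stack."""
    visited, stack = set(), []

    def dfs(v):
        visited.add(v)
        for w in graph[v]:            # Visit all adjacent vertices first
            if w not in visited:
                dfs(w)
        stack.append(v)               # Push v after its adjacent vertices

    for v in graph:
        if v not in visited:
            dfs(v)
    return stack[::-1]                # Popping the stack gives the ordering


if __name__ == "__main__":
    dag = {"A": ["B", "C"], "B": ["D", "E"], "C": ["E"], "D": [], "E": []}
    print(dfs_topological_sort(dag))  # e.g. ['A', 'C', 'B', 'E', 'D']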
3. Source Removal Method (5 Marks)
The source removal method repeatedly selects a vertex with in-degree 0 (a source), adds it to the ordering, and deletes it together with its outgoing edges.
For the example DAG, the initial in-degrees are: A: 0, B: 1, C: 1, D: 1, E: 2.
1. Remove A (the only source).
o Updated in-degrees: B: 0, C: 0, D: 1, E: 2
2. Remove B → updated in-degrees: C: 0, D: 0, E: 1.
3. Remove C → updated in-degrees: D: 0, E: 0.
4. Remove D, then E.
Resulting topological order: A, B, C, D, E (one of several valid orderings).
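A corresponding Python sketch of the source removal method (Kahn’s algorithm) on the same example DAG; the dictionary-based adjacency list is an illustrative encoding:

from collections import deque


def source_removal_sort(graph: dict) -> list:
    """Topological sort by repeatedly removing vertices with in-degree 0."""
    indegree = {v: 0 for v in graph}
    for v in graph:
        for w in graph[v]:
            indegree[w] += 1
    queue = deque(v for v in graph if indegree[v] == 0)   # Current sources
    order = []
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:            # Delete v's outgoing edges
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)
    return order


if __name__ == "__main__":
    dag = {"A": ["B", "C"], "B": ["D", "E"], "C": ["E"], "D": [], "E": []}
    print(source_removal_sort(dag))   # e.g. ['A', 'B', 'C', 'D', 'E']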
The knapsack problem: given n items with weights w[i] and values v[i], and a knapsack of capacity W, choose a subset of the items with maximum total value whose total weight does not exceed the weight limit.
1.1 Given Instance of Knapsack Problem
Item 1 2 3 4
Weight (w[i]) 2 1 3 5
Value (v[i]) 12 10 20 5
Subset Total Weight Total Value Feasible?
{} (Empty) 0 0 ✅
{1} 2 12 ✅
{2} 1 10 ✅
{3} 3 20 ✅
{4} 5 5 ✅
{1, 2} 3 22 ✅
{2, 3} 4 30 ✅
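A brute-force (exhaustive search) sketch in Python for this instance. The knapsack capacity is not stated in the excerpt above, so W = 5 below is only an assumed value for illustration:

from itertools import combinations


def exhaustive_knapsack(weights, values, capacity):
    """Try every subset of items; return (best_value, best_subset)."""
    n = len(weights)
    best_value, best_subset = 0, ()
    for r in range(n + 1):
        for subset in combinations(range(n), r):          # All 2^n subsets
            w = sum(weights[i] for i in subset)
            v = sum(values[i] for i in subset)
            if w <= capacity and v > best_value:          # Feasible and better
                best_value, best_subset = v, subset
    return best_value, tuple(i + 1 for i in best_subset)  # 1-based item numbers


if __name__ == "__main__":
    weights = [2, 1, 3, 5]     # w[1..4] from the table above
    values = [12, 10, 20, 5]   # v[1..4] from the table above
    W = 5                      # Assumed capacity (not given in the excerpt)
    print(exhaustive_knapsack(weights, values, W))   # With W = 5: (32, (1, 3))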
14. What is string matching? With the algorithm derive the worst-
case efficiency.
Ans:
String matching is the process of finding occurrences of a pattern (P) of length
m in a text (T) of length n.
1.1 Definition
Given:
A text string T[0...n-1] of length n.
A pattern string P[0...m-1] of length m.
The brute-force algorithm checks the pattern against the text at every possible shift and returns the index of the first match, or reports that the pattern is not found.
3.1 Worst-Case Example
o T = a long text in which the pattern almost matches at every shift (e.g., "AAAA...A").
o P = "AAAA".
o Here, for each shift, the algorithm checks all m characters before
failing.
3.2 Time Complexity Analysis
For each shift, the worst case requires O(m) comparisons, and there are up
to (n - m + 1) possible shifts.
Total worst-case comparisons: m(n − m + 1), so the worst-case efficiency is O(nm).
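A minimal Python sketch of brute-force string matching, which makes the (n − m + 1) · m worst-case count easy to see (the function name is illustrative):

def brute_force_match(text: str, pattern: str) -> int:
    """Return the first index where pattern occurs in text, or -1 if not found."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):                           # Up to n - m + 1 shifts
        j = 0
        while j < m and text[i + j] == pattern[j]:       # Up to m comparisons per shift
            j += 1
        if j == m:
            return i                                     # Full match at shift i
    return -1                                            # Pattern not found


if __name__ == "__main__":
    print(brute_force_match("ABAAABCD", "ABC"))      # 4
    print(brute_force_match("AAAAAAAAAA", "AAAB"))   # -1 (near worst case)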