DAA Module-1
PART A
4. Apply a quick sort algorithm and simulate it for the following data
sequence: 3 5 9 7 1 4 6 8 2.
5. Identify the tracing steps of merge sort and quicksort and analyse the
time complexity for the following data: 33, 44, 2, 10, 25
Quick Sort:
Time Complexity:
Using the traditional method, two matrices (X and Y) can be multiplied if the
orders of these matrices are p × q and q × r respectively. Following is the algorithm.
Algorithm:
Matrix-Multiplication (X, Y, Z)
for i = 1 to p do
for j = 1 to r do
Z[i,j] := 0
for k = 1 to q do
Z[i,j] := Z[i,j] + X[i,k] × Y[k,j]
There are three for loops in this algorithm, nested one inside another.
Hence, the algorithm takes O(n³) time to execute.
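The triple-loop algorithm above can be sketched directly in Python (a minimal version; the function name is illustrative and no dimension checking is done):

```python
def matrix_multiply(X, Y):
    # X is p x q, Y is q x r; the result Z is p x r
    p, q, r = len(X), len(Y), len(Y[0])
    Z = [[0] * r for _ in range(p)]
    for i in range(p):
        for j in range(r):
            for k in range(q):
                Z[i][j] += X[i][k] * Y[k][j]
    return Z

# matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]])
# -> [[19, 22], [43, 50]]
```

The three nested loops each run over at most n indices for n × n inputs, which is where the O(n³) bound comes from.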
Strassen's algorithm improves on this by computing the product with only seven
half-size multiplications instead of eight. Addition and subtraction of two
matrices take O(N²) time, so the time complexity of Strassen's method can be
written as
T(N) = 7T(N/2) + O(N²)
which solves to O(N^log₂7) ≈ O(N^2.81).
PART B
1. Define the various asymptotic notations used for best-case, average-case and
worst-case analysis of algorithms.
Following are the commonly used asymptotic notations to calculate the running
time complexity of an algorithm.
● Ο Notation: The notation Ο(n) is the formal way to express the upper bound of
an algorithm's running time.
Ο(f(n)) = { g(n) : there exist c > 0 and n0 such that g(n) ≤ c.f(n) for all n > n0 }
● Ω Notation: The notation Ω(n) is the formal way to express the lower bound of
an algorithm's running time.
Ω(f(n)) = { g(n) : there exist c > 0 and n0 such that g(n) ≥ c.f(n) for all n > n0 }
● θ Notation: The notation θ(n) is the formal way to express both the lower
bound and the upper bound of an algorithm's running time.
θ(f(n)) = { g(n) : g(n) = Ο(f(n)) and g(n) = Ω(f(n)) }
Algorithm
Time Complexity
4. Explain quick sort algorithm and simulate it for the following data: 20, 35,
10, 16, 54, 21, 25
Algorithm
quickSort(array, leftmostIndex, rightmostIndex)
  if (leftmostIndex < rightmostIndex)
    pivotIndex <- partition(array, leftmostIndex, rightmostIndex)
    quickSort(array, leftmostIndex, pivotIndex - 1)
    quickSort(array, pivotIndex + 1, rightmostIndex)

partition(array, leftmostIndex, rightmostIndex)
  set element[rightmostIndex] as pivotElement
  storeIndex <- leftmostIndex - 1
  for i <- leftmostIndex to rightmostIndex - 1
    if element[i] < pivotElement
      storeIndex++
      swap element[i] and element[storeIndex]
  swap pivotElement and element[storeIndex + 1]
  return storeIndex + 1
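A runnable Python sketch of quicksort with a Lomuto-style partition (rightmost element as pivot; function and variable names are illustrative):

```python
def partition(array, lo, hi):
    pivot = array[hi]          # rightmost element as pivot
    store = lo - 1             # boundary of the "smaller than pivot" region
    for i in range(lo, hi):
        if array[i] < pivot:
            store += 1
            array[store], array[i] = array[i], array[store]
    # place the pivot just after the smaller-than-pivot region
    array[store + 1], array[hi] = array[hi], array[store + 1]
    return store + 1

def quick_sort(array, lo=0, hi=None):
    if hi is None:
        hi = len(array) - 1
    if lo < hi:
        p = partition(array, lo, hi)
        quick_sort(array, lo, p - 1)   # pivot is already in its final place
        quick_sort(array, p + 1, hi)
    return array
```

Running it on the data from the question, `quick_sort([20, 35, 10, 16, 54, 21, 25])` returns `[10, 16, 20, 21, 25, 35, 54]`.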
Same as the 3rd Question.
6. Illustrate merge sort algorithms and discuss time complexity in both worst
case and average cases.
7. Explain the advantage of Strassen’s matrix multiplication when compared
to normal matrix multiplication for any two 16 × 16 matrices.
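For the 16 × 16 case, the saving can be seen by counting scalar multiplications: the naive method needs n³ of them, while Strassen's recursion replaces 8 half-size products with 7 at every level of halving. A small Python sketch of the counting argument (assuming n is a power of two; function names are illustrative):

```python
def naive_mult_count(n):
    # one scalar multiplication per (i, j, k) triple
    return n ** 3

def strassen_mult_count(n):
    # Strassen computes a product of n x n matrices
    # using 7 products of (n/2) x (n/2) matrices
    if n == 1:
        return 1
    return 7 * strassen_mult_count(n // 2)

# naive_mult_count(16)    -> 16^3 = 4096
# strassen_mult_count(16) -> 7^4  = 2401
```

So for two 16 × 16 matrices Strassen's method performs 2401 scalar multiplications against 4096 for the normal method, at the cost of extra O(n²) additions and subtractions.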
Typical data structures whose operations are analysed using Amortized
Analysis are Hash Tables, Disjoint Sets and Splay Trees.
In a hash table, the search time complexity is O(1) most of the time, but it
occasionally takes O(n) operations. Searching for or inserting an element in a
hash table is a constant-time task in most cases, but when a collision occurs,
collision resolution can require O(n) operations.
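The O(1)-typical / O(n)-worst behaviour can be illustrated with a minimal chained hash table in Python (the class and method names are illustrative, not a standard API):

```python
class ChainedHashTable:
    def __init__(self, buckets=8):
        # each bucket is a list (chain) of (key, value) pairs
        self.buckets = [[] for _ in range(buckets)]

    def _index(self, key):
        return hash(key) % len(self.buckets)

    def insert(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(bucket):
            if k == key:          # key already present: update in place
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def search(self, key):
        # usually the chain is short, so this is O(1) on average;
        # if many keys collide into one bucket, it degrades to O(n)
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        return None
```

When all n keys happen to hash to the same bucket, every search walks one chain of length n, which is exactly the O(n) collision-resolution cost described above.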
10. Sort the following list of numbers using merge sort: 78, 32, 42, 62, 98, 12.
13. Define the pseudocode conventions for specifying algorithms, and give a
recursive and an iterative algorithm to compute n!.
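For the n! part of the question, the recursive and iterative versions can be sketched in Python (function names are illustrative):

```python
def factorial_recursive(n):
    # base case: 0! = 1! = 1
    if n <= 1:
        return 1
    return n * factorial_recursive(n - 1)

def factorial_iterative(n):
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

# both return 120 for n = 5
```

The recursive version mirrors the mathematical definition n! = n · (n-1)!, while the iterative version avoids the O(n) call stack.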
14. Explain the frequency counts for all statements in the following algorithm
segment: i = 1; while (i <= n) do { x = x + 1; i = i + 1; }
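Assuming x starts at 0, the frequency counts for this segment can be checked empirically in Python (the counter names are illustrative instrumentation, not part of the algorithm):

```python
n = 5
x = 0
tests = 0   # how often the condition i <= n is evaluated
body = 0    # how often the loop body executes
i = 1       # the assignment i = 1 executes exactly once
while True:
    tests += 1
    if not (i <= n):
        break
    x = x + 1
    i = i + 1
    body += 1

# the condition is tested n + 1 times (the final test fails),
# and each statement in the body executes n times
```

So the total frequency count is 1 + (n + 1) + n + n = 3n + 2, which is O(n).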
15. What is a stable sorting method? Is merge sort a stable sorting method?
Justify
A sorting algorithm is said to be stable if two objects with equal keys appear in
the same order in the sorted output as they appear in the unsorted input array.
Some sorting algorithms, such as Insertion Sort, Merge Sort and Bubble Sort, are
stable by nature. Others, such as Quick Sort and Heap Sort, are not stable.
Merge sort is stable because, during the merge step, when two elements compare
equal the element from the left half is taken first, so equal keys never cross
each other.
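Merge sort's stability can be demonstrated with a small Python sketch: using <= when merging keeps the left element first on ties, so records with equal keys retain their input order (the record values here are illustrative):

```python
def merge_sort(items, key=lambda v: v):
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid], key)
    right = merge_sort(items[mid:], key)
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        # <= takes the left element on ties: this is what makes it stable
        if key(left[i]) <= key(right[j]):
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

records = [("b", 2), ("a", 2), ("c", 1)]
# sorted by the number: ("b", 2) stays before ("a", 2)
# -> [("c", 1), ("b", 2), ("a", 2)]
```

If the comparison were < instead of <=, equal keys from the right half could jump ahead of equal keys from the left half and stability would be lost.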
16. What is the Bubble sorting method? Is bubble sort a stable sorting
method? Justify
17. What is Quick sort? Is quicksort the best sorting method? Justify
QuickSort is a Divide and Conquer algorithm. It picks an element as pivot and
partitions the given array around the picked pivot.
The time complexity of Quicksort is O(n log n) in the best case, O(n log n) in
the average case, and O(n²) in the worst case. But because it has the best
performance in the average case for most inputs, Quicksort is generally
considered the “fastest” sorting algorithm. A further advantage:
1. It is in-place, i.e. it does not need extra memory when sorting a huge list
(apart from the recursion stack).
18. What is the Bubble sorting method? Is bubble sort a stable sorting
method? Justify
continuation….
5. Little omega (ω) gives only a rough (strict, non-tight) lower bound on the
order of growth, whereas Big Omega (Ω) may represent the exact order of growth.
20. Differentiate between time complexity and space complexity. Justify.
PART C
A. Refer to the answer to Question 9 in Part B.
5. Find The best case and worst case analysis for linear search.
A. Linear Search:
● Best case: O(1)
● Average case: O(n)
● Worst case: O(n)
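A minimal Python sketch of linear search illustrating the three cases (the data values are illustrative):

```python
def linear_search(items, target):
    # scans left to right; makes index + 1 comparisons when found
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1  # not found after n comparisons

data = [7, 3, 9, 1, 5]
# best case:  target is the first element -> 1 comparison, O(1)
# worst case: target is the last element or absent -> n comparisons, O(n)
```

On average, a successful search inspects about n/2 elements, which is still O(n).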
7. Define the recurrence equation for the worst case behaviour of merge sort.
A. If T(n) is the time required by merge sort for sorting an array of size n,
then T(n) = 2T(n/2) + O(n) for n > 1, with T(1) = O(1).
A. O(n log n)
A. A correct algorithm is one for which every valid input instance produces the
correct output. Correctness must be proved mathematically.
10.Define best case, average case and worst case efficiency of an algorithm.
Best-case efficiency: the minimum number of steps that an algorithm can take
for any collection of n data values.
Average-case efficiency: the expected number of steps over all possible inputs
of size n.
Worst-case efficiency: the maximum number of steps that an algorithm can take
for any collection of n data values.
A. To estimate the running time, find the maximum number of nested loops
that go through a significant portion of the input.
17. What is meant by divide and conquer? Give the recurrence relation for
divide and conquer.
19. Find out any two drawbacks of the binary search algorithm.
2. Even if the array is already sorted, merge sort still goes through the
entire divide-and-merge process.