
DEBREMARKOS UNIVERSITY BURIE CAMPUS

DEPARTMENT OF COMPUTER SCIENCE

ALGORITHM ANALYSIS GROUP


ASSIGNMENT
MEMBERS' NAME                                  ID

1. KASSAHUN GEBRIE ………….. 230/11
2. YADELEW LEWAYE …………… 257/11
3. MESERET MHIRTU …………… 246/11
4. KELEMIE GITAHUN ………….. 232/11
5. TANTEYGEGNE KASIE …………. 276/12
1. Write the algorithm of a queue and find its time complexity.

Queue Representation
As we now understand, in a queue we access both ends for different reasons: elements are inserted at the rear and removed from the front.
[Diagram omitted: representation of a queue as a data structure]

As in stacks, a queue can also be implemented using arrays, linked lists, pointers and structures. For the sake of simplicity, we shall implement the queue using a one-dimensional array.
Cont’ …

 A few more functions are required to make the above-mentioned queue operations efficient. These are −

peek() − Gets the element at the front of the queue without removing it.

isfull() − Checks if the queue is full.

isempty() − Checks if the queue is empty.

In a queue, we always dequeue (or access) the data pointed to by the front pointer, and while enqueuing (or storing)

data in the queue we take the help of the rear pointer.

Let's first learn about the supporting functions of a queue −


peek()
This function helps to see the data at the front of the queue. The algorithm of peek() function is as follows −

Algorithm
begin procedure peek
return queue[front]
end procedure
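
As an illustration (not part of the original slides), a minimal C sketch of peek() for an array-based queue could look as follows; the variable names, MAXSIZE value and linear (non-circular) layout are assumptions of this sketch:

/* Illustrative array-based queue state. */
#define MAXSIZE 100
int queue[MAXSIZE];
int front = 0;    /* index of the first (oldest) element */
int rear  = -1;   /* index of the last (newest) element  */

/* peek: return the element at the front without removing it.
   A single array access, so it runs in O(1) time.
   (Callers are assumed to check isempty() first.) */
int peek(void)
{
    return queue[front];
}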
Cont’…
isfull()
As we are using a single-dimension array to implement the queue, we just check whether the rear pointer has reached
MAXSIZE to determine that the queue is full. If we maintain the queue in a circular linked list,
the algorithm will differ. The algorithm of the isfull() function −

Algorithm
begin procedure isfull
   if rear equals MAXSIZE
      return true
   else
      return false
   endif
end procedure
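
A matching C sketch of isfull(), reusing the illustrative variables from the peek() sketch above; note that with 0-based C indexing the last valid slot is MAXSIZE - 1, whereas the pseudocode compares rear with MAXSIZE directly:

/* isfull: the queue is full when rear has reached the last slot.
   A single comparison, so O(1) time. */
int isfull(void)
{
    return rear == MAXSIZE - 1;
}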
Cont’…

isempty()
Algorithm of isempty() function −

Algorithm
begin procedure isempty
   if front is less than MIN OR front is greater than rear
      return true
   else
      return false
   endif
end procedure
If the value of front is less than MIN (or 0), it tells us that the queue is not yet initialized and is hence empty.
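
To complete the picture, here is a hedged C sketch of isempty() together with enqueue() and dequeue(), again reusing the illustrative variables from the peek() sketch. Each operation touches only a constant number of array cells, so every basic queue operation (enqueue, dequeue, peek, isfull, isempty) runs in O(1) time:

/* isempty: no elements remain between front and rear. O(1). */
int isempty(void)
{
    return front > rear;
}

/* enqueue: insert a value at the rear. O(1) per insertion. */
int enqueue(int value)
{
    if (isfull())
        return 0;              /* overflow: queue is full   */
    queue[++rear] = value;
    return 1;
}

/* dequeue: remove the value at the front. O(1) per removal. */
int dequeue(int *value)
{
    if (isempty())
        return 0;              /* underflow: queue is empty */
    *value = queue[front++];
    return 1;
}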
2. Write the algorithm of quick sort and find its time complexity.

 Quicksort is a divide-and-conquer algorithm. It works by selecting a 'pivot' element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. For this reason, it is sometimes called partition-exchange sort.
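
For illustration only, a minimal C sketch of the classical quicksort described above, using the last element as pivot (the pivot choice and the function names are assumptions of this sketch, not the mean-pivot variant discussed later):

/* Partition arr[low..high] around the last element as pivot and
   return the final index of the pivot. */
static int partition(int arr[], int low, int high)
{
    int pivot = arr[high];
    int i = low - 1;                          /* boundary of the "< pivot" side */
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
        }
    }
    int tmp = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = tmp;
    return i + 1;
}

/* Recursively sort the two sub-arrays on either side of the pivot. */
void quicksort(int arr[], int low, int high)
{
    if (low < high) {
        int p = partition(arr, low, high);
        quicksort(arr, low, p - 1);
        quicksort(arr, p + 1, high);
    }
}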
Cont’…
Complexity Analysis
The best-case and average-case time complexities of this Quick Sort algorithm are O(n log n); the worst-case time complexity of the classical algorithm is O(n²), although the pivot-selection variant analysed below achieves O(n log n) even in the worst case. The analysis of these complexities is described below:
Time Complexity

The time taken by quicksort, in general, can be written as follows:

T(n) = T(k) + T(n−k−1) + Θ(n)

Here, the first two terms are for the two recursive calls and the last term is for partitioning the n elements. The time
taken by this algorithm depends on the input array and the partition process.
Best Case Analysis
The best case occurs when the partitioning always selects the median element as the pivot, so each call splits the array into two equal halves and the recursion depth is minimised. The following recurrence describes the best case:
 T(n) = 2T(n/2) + Θ(n)

The solution of the above recurrence is O(n log n).
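
As a quick check (not part of the original slides), the best-case recurrence can be unrolled level by level; each of the log n levels of the recursion tree contributes Θ(n) work:

\begin{aligned}
T(n) &= 2T(n/2) + cn \\
     &= 4T(n/4) + 2cn \\
     &= \dots \\
     &= 2^{k}\,T\!\left(n/2^{k}\right) + k\,cn && \text{with } k = \log_2 n \\
     &= n\,T(1) + cn\log_2 n = O(n\log n)
\end{aligned}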


Average Case Analysis

In the average case analysis, we need to consider all possible permutations of the array and calculate the time taken by
each permutation. The average case can be approximated by considering a partition that puts O(n/10) of the elements in
one part and O(9n/10) in the other. The following recurrence describes this case:
T(n) = T(n/10) + T(9n/10) + O(n)
Worst Case Analysis
Although the worst-case time complexity of the classical Quick Sort is O(n²), the proposed algorithm gives a better running time than the classical quick sort algorithm. The pivot selection

procedure is repeated for each iteration of the quick sort until the size of the array becomes less than or equal to three. In

that case, we switch to a manual sort where we compare the elements directly. A worst-case partitioning would normally be

required when the array is already sorted in ascending or descending order. In the proposed algorithm, however, the mean is

calculated as the pivot, and it always falls between the extreme values, so the partitioning splits the list into at worst an

8-to-2 ratio. Thus, the time taken for the proposed algorithm is
T(n) = T(8n/10) + T(2n/10) + cn,
which solves to O(n log n).
Worst Case Time Complexity [ Big-O ] of the classical Quick Sort: O(n²)
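
For comparison (not part of the original slides), the classical worst case arises when the pivot always lands at one extreme of an already sorted array, so one sub-array is empty and the other holds n−1 elements; unrolling that recurrence gives the quadratic bound:

\begin{aligned}
T(n) &= T(n-1) + cn \\
     &= T(n-2) + c(n-1) + cn \\
     &= \dots \\
     &= c\sum_{i=1}^{n} i = O(n^{2})
\end{aligned}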
3. Write the merge sort algorithm and find its time complexity.

Merge Sort follows the rule of Divide and Conquer to sort a given set of numbers/elements recursively, and it runs in O(n*log n) time in all cases.

Divide and Conquer
If we can break a single big problem into smaller sub-problems, solve the smaller sub-problems and combine their solutions to find
the solution of the original big problem, it becomes easier to solve the whole problem.
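
A minimal C sketch of the divide-and-merge procedure described above (the array, scratch buffer and function names are illustrative assumptions):

/* Merge the two sorted halves arr[l..m] and arr[m+1..r] using tmp. */
static void merge(int arr[], int tmp[], int l, int m, int r)
{
    int i = l, j = m + 1, k = l;
    while (i <= m && j <= r)          /* take the smaller head element;       */
        tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];  /* "<=" keeps the sort stable */
    while (i <= m) tmp[k++] = arr[i++];   /* copy any leftovers from either half */
    while (j <= r) tmp[k++] = arr[j++];
    for (k = l; k <= r; k++)
        arr[k] = tmp[k];
}

/* Split the range in half, sort each half recursively, then merge. */
void merge_sort(int arr[], int tmp[], int l, int r)
{
    if (l < r) {
        int m = l + (r - l) / 2;
        merge_sort(arr, tmp, l, m);
        merge_sort(arr, tmp, m + 1, r);
        merge(arr, tmp, l, m, r);
    }
}

It would be called as merge_sort(a, tmp, 0, n - 1) with a scratch buffer tmp of the same length as a: each of the log n recursion levels does O(n) merging work, which is where the O(n*log n) bound below comes from.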
Complexity Analysis of Merge Sort 

Merge Sort is quite fast, and has a time complexity of O(n*log n).


It is also a stable sort, which means the "equal" elements are ordered in the same order in the sorted list.
Worst Case Time Complexity [ Big-O ]: O(n*log n)
Best Case Time Complexity [ Big-omega ]: Ω(n*log n)

 
Cont…
4. Write the heap sort algorithm and find its time complexity.

 A Heap is a special tree-based data structure that satisfies the following heap properties:

 Shape Property: A heap is always a Complete Binary Tree, which means all levels of the tree are fully filled, except possibly the last level, which is filled from left to right.

 Heap Property: Every node is either greater than or equal to, or less than or equal to, each of its children. If the parent nodes are greater than their child nodes, the heap is called a Max-Heap; if the parent nodes are smaller than their child nodes, it is called a Min-Heap.
Cont…

 Complexity Analysis of Heap Sort

 Worst Case Time Complexity: O(n*log n)

 Best Case Time Complexity: O(n*log n)

 Average Time Complexity: O(n*log n)

 Space Complexity : O(1)

 Heap sort is not a stable sort, and it requires only constant extra space for sorting a list.

 Heap Sort is very fast and is widely used for sorting.
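
As a closing illustration (not part of the original slides), a hedged C sketch of heap sort based on the max-heap property described above; the function names are assumptions:

/* Sift the element at index i down until the subtree rooted at i
   satisfies the max-heap property (parent >= both children). */
static void max_heapify(int arr[], int n, int i)
{
    int largest = i;
    int left  = 2 * i + 1;
    int right = 2 * i + 2;
    if (left  < n && arr[left]  > arr[largest]) largest = left;
    if (right < n && arr[right] > arr[largest]) largest = right;
    if (largest != i) {
        int tmp = arr[i]; arr[i] = arr[largest]; arr[largest] = tmp;
        max_heapify(arr, n, largest);
    }
}

/* Build a max-heap, then repeatedly swap the maximum to the end. */
void heap_sort(int arr[], int n)
{
    for (int i = n / 2 - 1; i >= 0; i--)      /* build the heap: O(n)           */
        max_heapify(arr, n, i);
    for (int i = n - 1; i > 0; i--) {         /* n-1 extractions, O(log n) each */
        int tmp = arr[0]; arr[0] = arr[i]; arr[i] = tmp;
        max_heapify(arr, i, 0);
    }
}

Building the heap costs O(n) and each of the n−1 extractions costs O(log n), which matches the O(n*log n) bounds above; the sort works in place, hence the O(1) space.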
