Unit Test 1 Need of DS: Data Structure
NEED OF DS
● Organised data (more efficient programs)
● Enhances logical thinking
● Optimises code
● Handles large amounts of data
ORGANIZATION
A collection of records can be searched, processed in any order, or modified
The choice of data structure can make a program run in seconds or in days
– A solution is said to be efficient when it solves the problem within its resource constraints:
Space
Time
CLASSIFICATION OF DS
1) Primitive DS
2) Non-Primitive DS
ARRAY REPRESENTATION
(Draw diagram in exam)
– Arrays store a fixed-size sequential collection of elements of the same type
STACK
(Draw Diagram In Exam)
– Stacks store data in a particular order
– They perform only 2 operations: Push and Pop
– Push: inserts an element; Pop: removes the last element
– LIFO: the Last In, First Out rule is followed
QUEUE
(Draw diagram in exam)
– Linear structure that follows a particular order
– FIFO: First In, First Out
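A minimal sketch of the FIFO rule using Python's collections.deque (the variable names are illustrative):

```python
from collections import deque

# FIFO queue sketch: enqueue at the rear, dequeue from the front.
q = deque()
for item in [1, 2, 3]:
    q.append(item)        # enqueue at the rear

first_out = q.popleft()   # dequeue from the front: first in, first out
```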
GRAPHS
(Draw diagram in exam)
– It contains a finite set of vertices, called NODES
– and a finite set of ordered pairs of the form (u, v), called EDGES
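The definition above can be sketched as an adjacency list built from a set of (u, v) edge pairs (the edge data is made up for illustration):

```python
# A directed graph: nodes plus edges of the form (u, v),
# stored as an adjacency list.
edges = [(0, 1), (0, 2), (1, 2)]
nodes = {v for edge in edges for v in edge}   # finite set of vertices

adj = {v: [] for v in nodes}
for u, v in edges:
    adj[u].append(v)                          # edge from u to v
```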
TREES
– Hierarchical DS
– Topmost node = root
– Elements directly under an element are called its children
– The element directly above an element is called its parent
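A tiny sketch of those terms, with each node tracking its parent and children (class and variable names are illustrative):

```python
# Hierarchical tree node: keeps a list of children, and each child
# records its parent.
class TreeNode:
    def __init__(self, value):
        self.value = value
        self.parent = None      # root has no parent
        self.children = []

    def add_child(self, child):
        child.parent = self     # element directly above = parent
        self.children.append(child)

root = TreeNode("root")         # topmost node = root
a = TreeNode("a")
root.add_child(a)               # 'a' is directly under root -> its child
```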
OPERATIONS OF DS
Analysis Of Algorithms
. Logarithmic Loops
The loop variable is either multiplied or divided by a constant during each iteration of the loop.
for(i = 1; i < 100; i *= 2) — n = 100, no. of iterations = log₂ 100 ≈ 7
for(i = 100; i >= 1; i /= 2) — f(n) = log n for both cases
(The multiplying loop must start at i = 1: starting at 0 would loop forever, since 0 × 2 = 0.)
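The two loops above can be checked by counting their iterations (helper names are illustrative):

```python
# Count iterations of a doubling loop and a halving loop to see log n growth.
def doubling_iterations(n):
    count, i = 0, 1        # start at 1: 0 * 2 would stay 0 forever
    while i < n:
        i *= 2
        count += 1
    return count

def halving_iterations(n):
    count, i = 0, n
    while i >= 1:
        i //= 2
        count += 1
    return count
```

Both counts come out near log₂ 100 ≈ 7 for n = 100, matching f(n) = log n.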
. Nested Loops
We need to determine the number of iterations each loop completes.
The total is then obtained as the product of the number of iterations in the inner loop and the number of iterations in the outer loop.
Linear logarithmic loop: for(i = 0; i < 10; i++) for(j = 1; j < 10; j *= 2) — f(n) = n log n
Quadratic loop: for(i = 0; i < 10; i++) for(j = 0; j < 10; j++) — f(n) = n²
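A quick iteration count over both nested-loop shapes confirms the products (function names are illustrative):

```python
# Compare a quadratic nested loop with a linear-logarithmic one by
# counting total inner-loop iterations.
def quadratic_count(n):
    count = 0
    for i in range(n):
        for j in range(n):       # inner loop runs n times per outer pass
            count += 1
    return count                 # n * n total

def linear_log_count(n):
    count = 0
    for i in range(n):
        j = 1
        while j < n:             # inner loop doubles: about log2(n) passes
            j *= 2
            count += 1
    return count                 # roughly n * log2(n) total
```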
ASYMPTOTIC NOTATIONS
. BIG O NOTATION
● It is expressed as O(n); O = Order of, meaning what happens for very large values of n
● It describes an upper bound for the worst-case input combinations (possibly greater than the worst case)
● Best case example: for many sorting algorithms, an already sorted array
LIMITATIONS
– Many algorithms are hard to analyse mathematically
– We must have sufficient information to calculate the behaviour of an algorithm in the average case
– Big O analysis only tells how the algorithm grows with the input size, not its actual efficiency
– It ignores important constants
. Omega Notation
● It describes a lower bound for the best-case complexity of an algorithm
● Ex: f(n) = 3n + 2 ≥ 3n for all n ≥ 1, hence f(n) = Ω(n)
LIMITATIONS
– The function can never do better than the specified value, but it may do worse, as Ω is only a lower bound
– It can never get better than the specified value, as it describes the lower bound for the worst-case input combination
– It could possibly be greater than the best-case scenario
– If we simply write Ω, it means the same as the best case
. Theta Notation
● It provides an asymptotically tight bound for f(n)
● f(n) grows at a rate that is both upper- and lower-bounded by the same function
● It means f(n) is squeezed between two constant multiples of that function as n becomes very large
● Worst-case Θ notation is not usually written out; Θ describes asymptotic bounds for a given combination of input values
● If we simply write Θ, it means the same as the worst case
2. Binary Search
● Works efficiently with a sorted array/list
● Give the example of a dictionary
● With each comparison the segment to be searched is reduced to half, so the total number of comparisons is about log₂ n
● If the key is at the middle index, it is the best case; if the key is greater than the middle element, pick the segment to the right, and vice versa
● When a match is found, return the index value; otherwise return -1
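The steps above can be sketched as an iterative binary search (function and variable names are illustrative):

```python
# Iterative binary search over a sorted list: returns the index of the
# key, or -1 when it is absent.
def binary_search(arr, key):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == key:
            return mid            # match found: return the index
        elif key > arr[mid]:
            low = mid + 1         # key is greater: search the right half
        else:
            high = mid - 1        # key is smaller: search the left half
    return -1                     # not found
```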
Sorting
Arranging data in ascending and descending
order
Without it everything would be a mess of unsorted stuff; fortunately the sorting concept exists
It arranges data in a sequence, which makes searching easier
Sorting efficiency
2 main criteria to judge an algorithm: 1) time 2) memory; beyond that there are many different sorting algorithms:
Bubble, selection, insertion, radix, heap, merge
Bubble Sort
– Works by repeatedly moving the largest element to the highest index position of the array
– Consecutive adjacent pairs are compared together
– If the element at the lower index is greater than the element at the higher index, they are interchanged so that the smaller element is placed before the bigger one
– With every complete iteration the largest element bubbles up towards the last place, just like water rising to the surface
– The complexity of bubble sort comes from its n-1 passes in total, which goes up to O(n²), where n = total elements in the array
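The passes described above can be sketched as follows (function name is illustrative):

```python
# Bubble sort: n-1 passes; each pass compares adjacent pairs and bubbles
# the largest remaining element to the highest unsorted index.
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):                 # n-1 passes in total
        for j in range(n - 1 - i):         # last i elements already placed
            if arr[j] > arr[j + 1]:        # lower index holds bigger value
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```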
Insertion Sort
– It is less efficient compared to other sorting algorithms like quick, heap, and merge sort
– Efficient only for small data sets, and an efficient implementation on data sets that are already sorted
– Performs better than selection and bubble sort
– Roughly twice as fast as bubble sort and almost 40% faster than selection sort
– O(1) space required (less memory space)
– Worst-case time complexity [Big O] = O(n²)
– Best-case time complexity [Omega] = Ω(n)
– Average-case time complexity [Theta] = Θ(n²)
– Space, as already noted, is O(1)
– Stable sorting technique (doesn't change the relative order of equal elements)
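A minimal sketch of the algorithm those points describe (function name is illustrative):

```python
# Insertion sort: each element is inserted into place among the already
# sorted elements to its left; O(n) on an already sorted input, O(1) space.
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:    # shift larger elements right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                  # insert the key into the gap
    return arr
```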
Merge Sort
– Divide and conquer policy
– Divide = split into 2 sub-arrays of n/2 elements
– Conquer = merge the 2 sorted sub-arrays of n/2 to produce the sorted array of elements
– Takes less time to sort a large list
– An array of length 0 or 1 is already sorted
– Divide the unsorted array into 2 halves
– Use the merge sort algorithm recursively to sort each sub-array
– In the end merge the sub-arrays to get the original array, sorted
– Time complexity = O(n log n)
– Worst case [Big O] = O(n log n) = best case = average case
– Space complexity O(n)
– It requires linear time to merge the 2 halves
– It requires an amount of additional space equal to the unsorted array
– Used for sorting Linked List
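The divide-and-conquer steps above can be sketched recursively (function name is illustrative; this array version uses O(n) extra space):

```python
# Merge sort: divide into halves of n/2, sort each recursively, then
# merge the two sorted halves in linear time.
def merge_sort(arr):
    if len(arr) <= 1:                 # length 0 or 1: already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # divide and recurse
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0           # conquer: merge the sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```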
Radix Sort
○ Sorting is done on each of the digits in the numbers
○ Sorting goes from the least significant digit to the most significant
○ If there are n numbers and k is the number of digits in the largest number, the radix sort pass is made a total of k times and the inner loop executes n times, so in total it takes O(kn) time to execute
○ When radix sort is applied on a data set of finite size (a very small set of numbers), the algorithm runs in O(n) asymptotic time
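A sketch of the least-significant-digit passes for non-negative integers, using stable buckets per digit (function name is illustrative):

```python
# LSD radix sort for non-negative integers: one stable bucketing pass per
# digit position, least significant first -> k passes over n numbers, O(kn).
def radix_sort(arr):
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:            # one pass per digit position
        buckets = [[] for _ in range(10)]
        for num in arr:
            buckets[(num // exp) % 10].append(num)   # stable: keeps order
        arr = [num for bucket in buckets for num in bucket]
        exp *= 10
    return arr
```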
Heap Sort
○ Complete binary tree
○ Each node is greater than / less than each of its children
○ If parent nodes are greater than child nodes = max heap (descending order)
○ If parent nodes are smaller than child nodes = min heap (ascending order)
○ Sorted array is obtained by removing largest/
smallest element from heap and inserting into
the array & is reconstructed after each
removal
○ Worst case = best case = average case =
O(n*log n)
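An in-place sketch of the procedure above: build a max heap over the array, then repeatedly move the largest element to the end and re-heapify (function names are illustrative):

```python
# Heap sort: build a max heap, then repeatedly swap the root (largest
# element) to the end of the array and restore the heap over the rest.
def sift_down(arr, root, end):
    while 2 * root + 1 < end:
        child = 2 * root + 1
        if child + 1 < end and arr[child + 1] > arr[child]:
            child += 1                    # pick the larger child
        if arr[root] >= arr[child]:
            return                        # heap property holds
        arr[root], arr[child] = arr[child], arr[root]
        root = child

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):   # build the max heap
        sift_down(arr, i, n)
    for end in range(n - 1, 0, -1):       # move max to its final place
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(arr, 0, end)            # re-heapify the remainder
    return arr
```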
Stack
◆ Push Operation
■ Used to insert an element onto the stack
■ Before inserting, check whether TOP = MAX - 1; if yes, it is Stack Overflow, so we can't insert any value
■ If false, increment the value of TOP and store the new element at the position given by stack[TOP]
◆ Pop Operation
■ Used to delete an element from the stack
■ Before deleting, check whether TOP = -1 (empty); if yes, it is Stack Underflow, so we can't delete an element
■ If false, read the value pointed to by TOP and then decrement TOP
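The push and pop checks can be sketched over a fixed-size array, following the TOP = -1 (empty) and TOP = MAX - 1 (full) convention from the notes (function names are illustrative):

```python
# Array-backed stack with overflow/underflow checks.
MAX = 5

def push(stack, top, value):
    if top == MAX - 1:
        raise OverflowError("Stack Overflow")   # full: cannot insert
    top += 1                                    # increment TOP
    stack[top] = value                          # store at stack[TOP]
    return top

def pop(stack, top):
    if top == -1:
        raise IndexError("Stack Underflow")     # empty: cannot delete
    value = stack[top]                          # value pointed to by TOP
    top -= 1                                    # decrement TOP
    return value, top
```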
Applications Of Stack
Recursion
Conversion of an infix expression into a postfix
expression
Conversion of an infix expression into a prefix
expression
Evaluation of a Postfix and Prefix expression
Recursion
● Formula: n! = n × (n-1)!
– Advantages Of Recursion
◆ Recursive solutions often tend to be shorter
and simpler than non-recursive ones.
◆ Code Is Easier to Use
◆ Divide and Conquer Rule
◆ It works similarly to the original formula used to solve the problem
– Disadvantages Of Recursion
◆ For programmers and readers, it is considered a difficult concept
◆ As space is limited by the system stack, going to deeper recursion levels may be difficult
◆ It is difficult to find bugs
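The factorial formula above maps directly onto a recursive function:

```python
# Recursive factorial following n! = n * (n-1)!, with 0! = 1 as base case.
def factorial(n):
    if n == 0:                      # base case terminates the recursion
        return 1
    return n * factorial(n - 1)     # recursive case mirrors the formula
```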
Recursion vs Iteration
1. Recursion: top-down approach | Iteration: bottom-up approach
2. Recursion: infinite recursion can crash the system | Iteration: safer on CPU cycles
3. Recursion: shorter code | Iteration: longer code
4. Recursion: terminates when a base case is recognised | Iteration: terminates when the loop condition fails
5. Recursion: uses more memory than iteration | Iteration: uses less memory than recursion
Queues
Applications Of Queue
– Online Banking Fund Transfer Requests
– FCFS Job Scheduling
– Call Centre Phone System
– Serving requests on a single shared resource,
like a printer, CPU task
Circular Queues
Do Its Algorithm
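A sketch of the algorithm over a fixed-size array, with FRONT and REAR wrapping around via modulo (class and variable names are illustrative):

```python
# Circular queue: FRONT and REAR wrap around the array using modulo,
# so slots freed at the front can be reused.
class CircularQueue:
    def __init__(self, capacity):
        self.items = [None] * capacity
        self.capacity = capacity
        self.front = -1                   # -1 means the queue is empty
        self.rear = -1

    def enqueue(self, value):
        if (self.rear + 1) % self.capacity == self.front:
            raise OverflowError("queue is full")
        if self.front == -1:              # first insertion
            self.front = 0
        self.rear = (self.rear + 1) % self.capacity
        self.items[self.rear] = value

    def dequeue(self):
        if self.front == -1:
            raise IndexError("queue is empty")
        value = self.items[self.front]
        if self.front == self.rear:       # queue became empty
            self.front = self.rear = -1
        else:
            self.front = (self.front + 1) % self.capacity
        return value
```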
Deque
○ Also known as a Double-Ended Queue
○ Has 2 ends, front and rear, with elements aligned accordingly
○ New items can be added at either the front or
the rear. Likewise, existing items can be
removed from either end.
○ this hybrid linear structure provides all the
capabilities of stacks and queues in a single
data structure.
○ 2 pointers, LEFT and RIGHT, are maintained, and the structure wraps from right to left like a circular queue
Deque Operations
Applications Of Deque
● Storing a software application's list of undo
operations.
● Palindrome checker
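The palindrome checker can be sketched with Python's collections.deque, removing from both ends until they meet:

```python
from collections import deque

# Palindrome checker: compare and remove items from the front and the
# rear of a deque until one or zero characters remain.
def is_palindrome(text):
    chars = deque(text)
    while len(chars) > 1:
        if chars.popleft() != chars.pop():   # front vs rear mismatch
            return False
    return True
```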
Priority Queues
– Each element is assigned a priority, used to determine the order in which elements are processed
– When an element is to be removed, the one with the higher priority is removed first
– The same operations (insert, delete, traverse) apply for both ascending and descending priority queues
– Basically works on 2 rules
1. An element with higher priority is
processed before an element with a lower priority.
2. Two elements with the same priority are
processed on a first-come-first-served (FCFS) basis.
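Both rules can be sketched with Python's heapq min heap, where a lower number means higher priority and an insertion counter breaks ties FCFS (class and method names are illustrative):

```python
import heapq

# Priority queue: a min heap ordered by (priority, arrival order), so
# higher-priority elements leave first and ties are served FCFS.
class PriorityQueue:
    def __init__(self):
        self.heap = []
        self.counter = 0                      # arrival order for FCFS ties

    def insert(self, priority, item):
        heapq.heappush(self.heap, (priority, self.counter, item))
        self.counter += 1

    def delete(self):
        return heapq.heappop(self.heap)[2]    # highest priority removed first
```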