Algorithm Analysis
Prepared By
Mehak Usmani
Reference:
Data Structures and Algorithm Analysis in Java by Mark A. Weiss
Objectives
You will learn about:
• Algorithm
• Properties of Algorithm
• Complexity
• Asymptotic Analysis
• Big Oh
• Recursion
Algorithm
Any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output.
Input → Algorithm → Output
Algorithm
A well-ordered collection of unambiguous and effectively computable operations that, when executed, produces a result and halts in a finite amount of time.
Properties of an Algorithm
• Input - An algorithm has input values from a specified set.
• Correctness - An algorithm should produce the correct output values for each set of input values.
• Finiteness - An algorithm should produce the desired output after a finite (but perhaps large) number of steps for any input in the set.
Algorithm Analysis
• The task of determining how much computing time and storage an
algorithm requires.
• How good is the algorithm?
– Correctness
– Time efficiency
– Space efficiency
• How to analyze algorithms?
– Theoretical (mathematical) analysis, which requires mathematical skills
– Empirical analysis
• Best case, average case, and worst case.
Correctness
• An algorithm is said to be correct if, for every input instance, it halts with
the correct output.
• A correct algorithm solves the given computational problem.
• Partial correctness requires that if an answer is returned, it will be correct.
• Total correctness additionally requires that the algorithm terminates.
• An incorrect algorithm might not halt at all on some input instance, or it
might halt with other than the desired answer.
Space Complexity
• Space complexity is the amount of memory used by the algorithm
(including the input values to the algorithm) to execute and produce the
result.
• An algorithm uses memory space for three reasons:
– Instruction Space
– Environmental Stack
– Data Space
• Different data types use different amounts of memory
• How much memory will the algorithm require?
• We usually consider only Data Space for space complexity
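To illustrate the Data Space component, here is a minimal sketch (method names are illustrative, not from the slides) contrasting a method whose extra memory grows with the input against one that uses only a constant number of variables:

// Uses an auxiliary array of size N: O(N) extra data space.
public static int[] squares(int[] input) {
    int[] result = new int[input.length];      // grows with the input size
    for (int i = 0; i < input.length; i++)
        result[i] = input[i] * input[i];
    return result;
}

// Uses only a few scalar variables: O(1) extra data space.
public static int sumOf(int[] input) {
    int sum = 0;                               // constant extra memory
    for (int value : input)
        sum += value;
    return sum;
}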
Time Complexity
• The time complexity of an algorithm signifies the total time required by the program to run to completion.
• Time efficiency is analyzed by determining the number of repetitions of
the basic operation as a function of input size.
• Basic operation: the operation that contributes most towards the running time of the algorithm (see the sketch after this list)
• Time Efficiency is speed
• How long does an algorithm take to produce its result?
• We usually use the worst-case time complexity
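For example (a minimal illustrative sketch), in a method that finds the largest element of an array, the basic operation is the comparison inside the loop:

// Finds the largest value in a non-empty array.
// Basic operation: the comparison in the loop body, executed N - 1 times,
// so the running time grows linearly with the input size N.
public static int findMax(int[] a) {
    int max = a[0];
    for (int i = 1; i < a.length; i++)
        if (a[i] > max)        // basic operation: one comparison per element
            max = a[i];
    return max;
}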
Cases
• Best case is the function which performs the minimum number of steps
on input data of n elements.
• Worst case is the function which performs the maximum number of steps
on input data of size n.
• Average case is the function which performs an average number of steps
on input data of n elements.
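As a concrete illustration, consider a sequential search sketch: the number of comparisons depends on where, or whether, the key appears.

// Sequential search: returns the index of key in a, or -1 if it is absent.
// Best case:    key is at index 0           -> 1 comparison.
// Worst case:   key is last or not present  -> N comparisons.
// Average case: key equally likely anywhere -> about N / 2 comparisons.
public static int sequentialSearch(int[] a, int key) {
    for (int i = 0; i < a.length; i++)
        if (a[i] == key)
            return i;
    return -1;
}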
Asymptotic Analysis
• Refers to computing the running time of any operation in mathematical
units of computation
• It is input bound: the running time is expressed as a function of the input size
• If there is no input to the algorithm, it is taken to run in constant time
Asymptotic Notations
• Following are the commonly used asymptotic notations to calculate the
running time complexity of an algorithm
– Big Oh O()
– Big Omega Ω()
– Big Theta Θ()
Asymptotic Analysis
• T(N) = O(f(N))
– if there are positive constants c and n0 such that T(N) ≤ cf(N) when N ≥ n0
• T(N) = Ω (g(N))
– if there are positive constants c and n0 such that T(N) ≥ cg(N) when N ≥ n0
• T(N) = Θ(h(N))
– if and only if T(N) = O(h(N)) and T(N) = Ω(h(N))
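A short worked example of the Big-Oh definition (not on the slide): if T(N) = 3N + 4, then T(N) = O(N), since choosing c = 4 and n0 = 4 gives 3N + 4 ≤ 4N whenever N ≥ 4.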
Relative rates of growth
• The idea of the previous definitions is to establish a relative order among functions; this is the important measure in analysis.
• What matters is not the exact number of basic operations executed for a given n, but how that number grows as n increases
• T(N) = O(f(N)) says that the growth rate of T(N) is less than or equal to (≤)
that of f(N)
• T(N) = Ω (g(N)) says that growth rate of T(N) is greater than or equal to (≥)
that of g(N).
• T(N) = Θ(h(N)) says that the growth rate of T(N) equals (=) the growth rate
of h(N)
Big Oh
• It describes the worst-case time complexity
• The longest amount of time an algorithm can possibly take to complete
• Defines an upper bound
Omega
• It describes the best-case time complexity
• The least amount of time an algorithm can possibly take to complete
• Defines a lower bound
Theta
• It is often associated with the average-case time complexity: consider all possible inputs of size n and average the cost C(n) over them
• It expresses both a lower bound and an upper bound on an algorithm's running time
Order of growth
• O(1) describes an algorithm that will always execute in the same time (or
space) regardless of the size of the input data set – Constant
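A minimal sketch of a constant-time operation (the method name is illustrative): accessing an array element by index takes the same time regardless of the array's length.

// O(1): one array access and one return, no matter how large a is.
public static int firstElement(int[] a) {
    return a[0];
}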
• O(2^N) denotes an algorithm whose running time doubles with each addition to the input data set – Exponential
– Often arises from naive recursion
Growth rates: common functions ordered from slowest- to fastest-growing are constant, logarithmic, linear, N log N, quadratic, cubic, and exponential.
Computing a Big-Oh running time
• Algorithm analysis using Big-Oh notation allows a number of simplifications:
– All sorts of shortcuts are possible
– There are no particular units of time
– Lower-order terms can be ignored
– Constants can be thrown away
Example
Per-statement operation counts for a simple for loop:
– Assignment before the loop: 1 unit
– for loop header: 1 + (N+1) + N units (1 initialization, N+1 tests, N increments)
– Statement inside the loop: 4N units
– return statement: 1 unit
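These counts correspond to a method along the lines of the sum-of-cubes example in Weiss's book; the sketch below (the exact method body is an assumption) annotates each line with its cost:

// Computes 1^3 + 2^3 + ... + n^3.
public static int sum(int n) {
    int partialSum;
    partialSum = 0;                    // 1 unit
    for (int i = 1; i <= n; i++)       // 1 + (N+1) + N units
        partialSum += i * i * i;       // 4N units: 2 multiplications,
                                       // 1 addition, 1 assignment per pass
    return partialSum;                 // 1 unit
}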
Total: 1 + (1 + (N+1) + N) + 4N + 1 = 1 + (2 + 2N) + 4N + 1 = 6N + 4, which is O(N).
General Rules
• Sequence of statements
– the time for each statement is constant and the total time is also
constant: O(1)
• if-then-else statements
– the time for the whole if-then-else statement is the time of the test plus the larger (worst) of the times of the two branches
• Loop statements
– If the loop executes N times, the sequence of statements inside it also executes N times
• Nested loops
– If the inner loop is independent of the value of the outer loop's index, the inner statements execute N*M times
– Otherwise the inner loop also executes up to N times, giving N*N (see the sketch after this list)
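A minimal sketch of the nested-loop rule (method and variable names are illustrative):

// The innermost statement executes N * M times.
// When M equals N, the total work is N * N, i.e. O(N^2).
public static int countPairs(int n, int m) {
    int count = 0;
    for (int i = 0; i < n; i++)        // outer loop: N iterations
        for (int j = 0; j < m; j++)    // inner loop: M iterations each time
            count++;                   // executed N * M times in total
    return count;
}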
Recursion
It is a method of solving a problem where the solution depends on solutions
to smaller instances of the same problem.
Recursive Method
• A method that is defined in terms of itself is called recursive.
• A recursive method ought to be expressible in only a few lines, just like a non-recursive method.
• Recursive calls will keep on being made until a base case is reached
• Base Case: The value for which the function is directly known without
resorting to recursion
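A minimal sketch of a recursive method with an explicit base case (the example, summing the integers 1 through n, is illustrative):

// Recursively computes the sum 1 + 2 + ... + n.
public static int sumFirstN(int n) {
    if (n <= 1)                      // base case: value known directly
        return n;
    return n + sumFirstN(n - 1);     // recursive call on a smaller instance
}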
Analyzing Recursive Methods
Factorial: 5! = 5 · 4 · 3 · 2 · 1 = 120
• If the recursion is really just a thinly veiled for loop, the analysis is usually
trivial. For instance, the following method is really just a simple loop and is
O(N):
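A sketch of the recursive factorial method meant here (the exact code is assumed):

// Recursive factorial: each call does a constant amount of work and
// there are N calls in total, so the running time is O(N) --
// the same as a simple for loop.
public static long factorial(int n) {
    if (n <= 1)                       // base case
        return 1;
    return n * factorial(n - 1);      // one recursive call per level
}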
Analyzing Recursive Methods
Fibonacci Series: 1, 1, 2, 3, 5, 8, ...
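A sketch of the naive recursive Fibonacci method being analyzed (assuming the usual formulation, with fib(0) = fib(1) = 1):

// Naive recursive Fibonacci: each call spawns two further calls,
// so the number of calls -- and the running time -- grows exponentially.
public static long fib(int n) {
    if (n <= 1)                       // base cases
        return 1;
    return fib(n - 1) + fib(n - 2);   // fib(n-2) is recomputed inside fib(n-1)
}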
Recurrence Relation
• A recurrence relation is an equation that defines a function in terms of itself
• Many algorithms are recursive in nature. When we analyze them, we get a
recurrence relation for time complexity.
• When it is difficult to convert the recursion into a simple loop structure,
the analysis will involve a recurrence relation that needs to be solved.
T(n) = 2T(n/2) + cn
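For reference, this recurrence can be solved by repeated expansion (assuming a base case T(1) = c):
T(n) = 2T(n/2) + cn
     = 4T(n/4) + 2cn
     = 8T(n/8) + 3cn
     = ...
     = 2^k T(n/2^k) + k·cn
Taking n = 2^k (so k = log n) gives T(n) = cn + cn log n, which is O(n log n).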
Analyzing Recursive Methods
The running time of the recursive Fibonacci method is O(2^N).
• By induction it can be proved that the running time of this program grows
exponentially
• By keeping a simple array and using a for loop, the running time can be reduced substantially (see the sketch after this list)
• This program is slow because there is a huge amount of redundant work being performed [f(n-1) and f(n-2)]
• It is generally a bad idea to use recursion to evaluate simple mathematical
functions, such as the Fibonacci numbers
• The main problem with recursion is the hidden bookkeeping costs
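A sketch of the array-based version mentioned above (the method name and details are assumptions):

// Iterative Fibonacci: each value is computed once and stored,
// so the running time drops from exponential to O(N).
public static long fibIterative(int n) {
    if (n <= 1)
        return 1;
    long[] f = new long[n + 1];
    f[0] = 1;
    f[1] = 1;
    for (int i = 2; i <= n; i++)
        f[i] = f[i - 1] + f[i - 2];   // reuse the stored results
    return f[n];
}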
Rules of Recursion
Four fundamental rules of recursion are:
1. Base cases: You must always have some base cases, which can be solved
without recursion.
2. Making progress: For the cases that are to be solved recursively, the
recursive call must always be to a case that makes progress toward a
base case.
3. Design rule: Assume that all the recursive calls work.
4. Compound interest rule: Never duplicate work by solving the same instance of a problem in separate recursive calls.
Review Questions
Question 1:
Describe the properties of an algorithm.
Question 2:
What is a base case in recursion?