Algorithm Analysis
Introduction
An algorithm is a clearly specified set of simple instructions to be followed to
solve a problem.
An algorithm that solves a problem but requires a year is hardly of any use.
Likewise, an algorithm that requires hundreds of gigabytes of main memory is not
(currently) useful on most machines.
It is not useful to measure how fast the algorithm runs, as this depends on which particular computer, OS, programming language, compiler, and kind of inputs are used in testing.
Instead, we count the number of basic operations the algorithm performs, such as:
Assignment
Arithmetic operation
Comparison
Method call (note: the execution time of the called method may depend on the value of a parameter and may not be constant)
Memory access
Note: To simplify complexity analysis we will not consider memory access (fetch or
store) operations.
Problem Size
For every algorithm we want to analyze, we need to define the size of the
problem
For a search algorithm, the size of the problem is the size of the search pool
For a sorting algorithm, the size of the problem is the number of elements
to be sorted
The efficiency of an algorithm is always stated as a function of the problem
size
We generally use the variable n to represent the problem size
Problem/Input size matters!
Some example algorithms and their expected running times based on
the input size
Growth Functions
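The growth-rate chart for this slide did not survive extraction. As a rough illustration (the class name GrowthDemo and the sampled sizes are my own choices, not from the slide), this Java sketch tabulates how some common growth functions scale as n increases:

public class GrowthDemo {
    public static void main(String[] args) {
        // Header row: input size and common growth functions.
        System.out.printf("%8s %10s %14s %14s%n", "n", "log2(n)", "n*log2(n)", "n^2");
        for (int n = 16; n <= 4096; n *= 4) {
            double log2n = Math.log(n) / Math.log(2);  // log base 2 of n
            System.out.printf("%8d %10.1f %14.0f %14d%n",
                    n, log2n, n * log2n, (long) n * n);
        }
    }
}

Running it shows that n² grows far faster than n log n, which in turn grows faster than log n.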
Algorithm Complexity
Algorithm Complexity (cont…)
We are usually interested in the worst-case complexity: the maximum number of operations that might be performed for a given problem size.
Algorithm Complexity (cont…)
Example: Linear Search Complexity
Algorithm        Average Case   Worst Case
Selection sort   n²             n²
Insertion sort   n²             n²
Running Time Analysis
Asymptotic Complexity
Asymptotic Complexity (cont…)
We approximate f(n) by a function g(n) in a way that does not substantially
change the magnitude of f(n). The function g(n) is sufficiently close to f(n)
for large values of the input size n.
Thus the asymptotic complexity measure does not give the exact number
of operations of an algorithm, but it shows how that number grows with the
size of the input.
This gives us a measure that will work for different operating systems,
compilers and CPUs.
Notations
The following are the commonly used asymptotic notations for describing the running-time complexity of an algorithm:
O notation
Ω notation
Θ notation
Θ(expression) consists of all the functions that lie in both O(expression) and Ω(expression). When the upper and lower bounds are the same within a constant factor, we indicate this by using Θ (big-Theta) notation.
Big-Oh Notation
The most commonly used notation for specifying asymptotic complexity is the big-O
notation.
The coefficients and the lower order terms become increasingly less relevant as n
increases
So we say that the algorithm is order n², which is written O(n²)
Two algorithms in the same category are generally considered to have the same
efficiency, but that doesn't mean they have equal growth functions or behave exactly
the same for all values of n
O Notation: Definition
Big O:
T(n) = O(f(n)) if there are positive constants c and N
such that T(n) ≤ c f(n) when n ≥ N
This says that function T(n) grows at a rate no faster than f(n) ; thus f(n)
is an upper bound on T(n).
Another way:
f(n) is O(g(n)) ↔ there exist numbers c, N > 0
such that for each n ≥ N
f(n) ≤ c·g(n)
The meaning:
• f(n) exceeds c·g(n) for at most finitely many values of n;
• a constant c and a value N can be found so that for every value of n ≥ N: f(n) ≤ c·g(n);
• f(n) does not grow more than a constant factor faster than g(n).
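A quick worked instance (the numbers are my own, not from the slide): T(n) = 3n + 4 is O(n), since 3n + 4 ≤ 4n whenever n ≥ 4; take c = 4 and N = 4.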
O Notation: illustration
[Figure: the curve f(n) stays below c·g(n) for all n ≥ N, illustrating T(n) = O(f(n)).]
Big-Ω Notation: Definition
Big Omega:
T(n) = Ω(f(n)) if there are positive constants c
and N such that T(n) ≥ c f(n) when n ≥ N
This says that function T(n) grows at a rate no slower than f(n) ;
thus f(n) is a lower bound on T(n).
Another way:
f(n) is Ω(g(n)) ↔ there exist numbers c, N > 0
such that for each n ≥ N
f(n) ≥ c·g(n)
Big-Ω Notation: illustration
[Figure: the curve f(n) stays above c·g(n) for all n ≥ N, illustrating T(n) = Ω(f(n)).]
Big-Θ Notation
When the upper and lower bounds are the same within a
constant factor, we indicate this by using Θ (big-Theta) notation.
This says that function T(n) grows at the same rate as f(n)
Another way:
f(n) is Θ(g(n)) ↔ there exist numbers c1, c2, N > 0
such that for each n ≥ N
c1·g(n) ≤ f(n) ≤ c2·g(n)
[Figure: f(n) lies between c1·g(n) and c2·g(n) for all n ≥ N.]
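A quick worked instance (again my own numbers): 3n² + 2n is Θ(n²), since 3n² ≤ 3n² + 2n ≤ 4n² for all n ≥ 2; take c1 = 3, c2 = 4, and N = 2.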
Big-Oh Categories
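The table for this slide is missing; the standard category names (standard terminology, not recovered from the slide) are:
O(1) constant
O(log n) logarithmic
O(n) linear
O(n log n) "n log n" (linearithmic)
O(n²) quadratic
O(n³) cubic
O(2ⁿ) exponential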
Rules for using big-O
1. For large values of the input n, the constants and the terms with lower exponents can be ignored.
Example: f(n) = 3n² + 10n + 6 is O(n²).
2. Addition Rule: if T1(n) = O(f(n)) and T2(n) = O(g(n)), then
T1(n) + T2(n) = max(O(f(n)), O(g(n)))
Example: O(n²) + O(n) = O(n²).
3. Multiplication Rule: if T1(n) = O(f(n)) and T2(n) = O(g(n)), then
T1(n) * T2(n) = O(f(n)) * O(g(n)) = O(f(n) * g(n))
Example: an O(n) loop whose body is O(n) costs O(n) * O(n) = O(n²).
4. Polynomial Rule: if T(n) is a polynomial of degree k, then T(n) = O(nᵏ).
Example: T(n) = 5n³ + 2n + 1 is O(n³).
Comparing Growth Functions
[Figures: charts comparing the growth rates of the common complexity functions as n increases.]
How to determine complexity of code structures
Analyzing Loop Execution
Loops: for, while, and do-while:
First determine the order of the body of the loop, then multiply that by the
number of times the loop will execute
for (int count = 0; count < n; count++)
// some sequence of O(1) steps
Analyzing Loop Execution (cont…)
Loops: for, while, and do-while:
Again: the complexity is determined by the number of iterations of the loop multiplied by the complexity of the body of the loop.
Examples:
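(The slide's worked examples did not survive extraction; the sketches below, with variable names of my own choosing, illustrate the pattern. Assume int n is given.)

int sum = 0;
for (int i = 0; i < n; i++)       // n iterations, O(1) body
    sum += i;                     // => O(n)

for (int i = 0; i < n; i += 2)    // about n/2 iterations, O(1) body
    sum += i;                     // => still O(n)

for (int i = 0; i < n * n; i++)   // n² iterations, O(1) body
    sum += i;                     // => O(n²)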
Analyzing Loop Execution (cont…)
i=1;
while (i < n) { sum
= sum + i;
i = i*2 O(log n)
}
First of all, we should know the number of iterations of the loop; say it is x.
Then the loop condition is executed x + 1 times.
35
Analyzing Loop Execution (cont…)
Example:

int sum (int n)
{
    int partial_sum = 0;                      // 1 for the assignment
    int i;
    for (i = 1; i <= n; i++)                  // 1 assignment, n+1 tests, n increments
        partial_sum = partial_sum + (i * i);  // n iterations of 3 units each:
                                              // an assignment, an addition, and a multiplication
    return partial_sum;                       // 1 for the return statement
}

Time units to compute:
Total: 1 + (1 + (n + 1) + n) + 3n + 1 = 5n + 4 = O(n)
Analyzing Loop Execution (cont…)
Loops (with <):
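(The slide body is missing; a minimal sketch of the usual counting argument, with my own variable names. Assume int n and int sum are given.)

// i takes the values 0, 1, ..., n-1: the body executes n times,
// and the condition i < n is tested n + 1 times.
for (int i = 0; i < n; i++)
    sum += i;                     // O(n)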
Analyzing Loop Execution (cont…)
Loops (with <=):
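(Again the slide body is missing; a minimal sketch:)

// i takes the values 1, 2, ..., n: the body still executes n times,
// and the condition i <= n is tested n + 1 times.
for (int i = 1; i <= n; i++)
    sum += i;                     // O(n)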
Analyzing Loop Execution (cont…)
double x, y;
x = 2.5;
y = 3.0;
for (int i = 0; i < n; i++) {
    a[i] = x * y;
    x = 2.5 * x;
    y = y + a[i];
}
The loop body has three assignments, two multiplications, and an addition: 6n operations. The increment i++ contributes an addition and an assignment per iteration (2n), the condition i < n is tested n + 1 times, and there are 3 initial assignments (x, y, and i).
Thus the total number of basic operations is 6n + 2n + (n + 1) + 3 = 9n + 4 = O(n).
Analyzing Loop Execution (cont…)
Loops With Logarithmic Iterations:
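(The slide body is missing; a hedged sketch of the pattern, with my own variable names:)

// i takes the values n, n/2, n/4, ..., 1: about log₂(n) + 1 iterations.
for (int i = n; i >= 1; i /= 2)
    sum += i;                     // O(log n)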
Analyzing Nested Loops:
When loops are nested, we multiply the complexity of the outer loop
by the complexity of the inner loop
for (int count = 0; count < n; count++)
    for (int count2 = 0; count2 < n; count2++)
    {
        // some sequence of O(1) steps
    }
Since the inner loop is O(n) and it executes n times, the nested structure is O(n) * O(n) = O(n²).
Analyzing Loop Execution (cont…)
Examples:

sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)    // O(n²)
        sum += i * j;

i = 1;
while (i <= n) {
    j = 1;
    while (j <= n) {
        // statements of constant complexity
        j = j * 2;                 // inner loop: O(log n)
    }
    i = i + 1;                     // outer loop: n iterations => O(n log n)
}
Analyzing Loop Execution (cont…)
Example:

for (int i = 1; i <= n; i++)
    for (int j = 1; j <= m; j++)
        for (int k = 1; k <= p; k++)
            sum = sum + i + j + k;   // 4 operations per innermost iteration: 4pmn = O(pmn)
Analyzing Sequence of Statements

for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
        sum = sum + j;                       // O(n²)
for (k = 0; k < n; k++)
    sum = sum - l;                           // O(n)
System.out.print("sum is now " + sum);       // O(1)

Complexity is O(n²) + O(n) + O(1) = O(n²)
Analyzing Sequence of Statements (cont…)
Consecutive statements: Use Addition rule
Example:
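(The code for this example did not survive extraction; the sketch below, with loop bounds and variable names of my own choosing, matches the operation count that follows.)

int sum = 0;                         // 1
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)      // n² body executions
        sum += i * j;
for (int i = 0; i < n; i++)          // n
    sum += i;
for (int j = 0; j < n; j++)          // n
    sum -= j;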
n² + 1 + n + n = O(n² + 2n + 1) = O(n²)
Analyzing If Execution (cont…)
if (test) s1 else s2
The running time is never more than the running time of the test plus the
larger of the running times of s1 and s2
Example:
if (test == 1)                        // O(1)
    for (i = 1; i <= n; i++)          // O(n)
        sum = sum + i;
else
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)      // O(n²)
            sum = sum + i + j;

Total: O(1) + max(O(n), O(n²)) = O(n²)
Analyzing If Execution (cont…)
Example (complexity of a code fragment which includes an if statement):

for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        for (k = 1; k <= n; k++)
            sum = sum + i + j + k;    // O(n³)
if (test == 1)
    for (i = 1; i <= n; i++)
        for (j = 1; j <= n; j++)      // O(n²)
            sum = sum + i;
else
    for (i = 1; i <= n; i++)          // O(n)
        sum = sum + i + j;

The Running Time: O(n³) + O(1) + max(O(n²), O(n)) = O(n³)
51
Note: Sometimes a loop may cause the if-else rule not to be applicable.
Consider the following loop:
while (n > 0) {
    if (n % 2 == 0) {
        System.out.println(n);
        n = n / 2;
    } else {
        System.out.println(n);
        System.out.println(n);
        n = n - 1;
    }
}
The else-branch has more basic operations; therefore one may conclude that the loop is O(n). However, the if-branch dominates. For example, if n is 60, then the sequence of values of n is: 60, 30, 15, 14, 7, 6, 3, 2, 1, and 0. Hence the loop is logarithmic and its complexity is O(log n).
Switch Execution
Switch: Take the complexity of the most expensive case
char key;
int[] X = new int[n];
int[][] Y = new int[n][n];
........
switch (key) {
    case 'a':
        for (int i = 0; i < X.length; i++)      // O(n)
            sum += X[i];
        break;
    case 'b':
        for (int i = 0; i < Y.length; i++)      // O(n²)
            for (int j = 0; j < Y[0].length; j++)
                sum += Y[i][j];
        break;
} // End of switch block
Analyzing Method Calls
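(The slide content is missing. The usual rule is that a call contributes the complexity of the called method's body; a hedged sketch, where printSum and caller are my own names, not from the slide:)

// The body of printSum(n) is O(n)...
static int printSum(int n) {
    int s = 0;
    for (int i = 1; i <= n; i++)    // n iterations, O(1) body
        s += i;
    return s;
}

// ...so calling it inside an O(n) loop gives O(n) * O(n) = O(n²).
static int caller(int n) {
    int total = 0;
    for (int i = 0; i < n; i++)
        total += printSum(n);
    return total;
}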
Analyzing Method Calls (cont…)
Loop example:
Suppose n is a multiple of 2. Determine the number of basic operations performed by the method myMethod().
[The code for myMethod() is missing from this copy; the surviving fragment notes that its loop iterates log₂ n times.]
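(A sketch consistent with that remnant, assuming the doubling-loop pattern used earlier in these slides; this is not the original method:)

static int myMethod(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i = i * 2)   // i doubles: about log₂(n) + 1 iterations
        sum += i;
    return sum;
}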
Analyzing Method Calls (cont…)
Recursion:
Analyze from the inside (or deepest part) first and work outwards. If there are
function calls, these must be analyzed first.
Example:
long factorial (int n)
{
    if (n <= 1)                          // 1 for the test
        return 1;
    else
        return n * factorial (n - 1);    // 1 for the multiplication statement
}

What about the function call?
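(A hedged way to finish the count, using the recurrence idea rather than anything recovered from the slide: the call factorial(n - 1) costs T(n - 1), so under this counting T(n) = T(n - 1) + 2 with T(1) = 1, which unrolls to T(n) = 2n - 1, i.e., the recursion is O(n).)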
Examples of Algorithms and their big-O complexity

Big-O Notation    Examples of Algorithms
O(1)              Push and pop on a stack; accessing an array element
O(log n)          Binary search of a sorted array
O(n)              Linear search; finding the maximum of an array
O(n log n)        Merge sort; heap sort
O(n²)             Selection sort; insertion sort; bubble sort