Week 8 Algorithm Analysis

The document discusses algorithm efficiency analysis and big O notation. It explains how to measure algorithm efficiency, defines best case, worst case and average case, and provides examples of common time complexities like constant, logarithmic, linear, quadratic and exponential time. The document also shows how to determine algorithm complexity theoretically by calculating steps and practically by implementation and experimentation.


Algorithm Efficiency Analysis
Objectives
At the end of the class, students are expected to be able to do the following:

• Know how to measure algorithm efficiency.
• Know the meaning of big O notation.
Introduction

Algorithm analysis: the study of the efficiency of algorithms as the input size grows, based on the number of steps and the amount of computer time and space required.
Analysis of algorithms

• A major field that provides tools for evaluating the efficiency of different solutions.

What is an efficient algorithm?

• Faster is better (time)
– How do you measure time? Wall clock? Computer clock?
• Less space demanding is better (space)
– But if the processor needs to fetch data from main memory, that takes time.
Analysis of algorithms

• Algorithm analysis should be independent of:
– Specific implementations and coding tricks (programming language, control statements – Pascal, C, C++, Java)
– Specific computers (hardware chip, OS, clock speed)
– A particular set of data (string, int, float)

But the size of the data should matter.

Analysis of algorithms
• Three possible cases in algorithm analysis:
- best case
- average case
- worst case

• The worst case is always considered → the maximum bound on execution time or memory space for any input size.
• Execution time for the worst case → complexity time.
Worst Case/Best Case/Average Case

For a particular problem size, we may be interested in:

• Worst-case efficiency: longest running time for any input of size n
– A determination of the maximum amount of time that an algorithm requires to solve problems of size n
• Best-case efficiency: shortest running time for any input of size n
– A determination of the minimum amount of time that an algorithm requires to solve problems of size n
• Average-case efficiency: average running time over all inputs of size n
– A determination of the average amount of time that an algorithm requires to solve problems of size n
Examples of the 3 cases

Algorithm: sequential search of n elements

• Best-case: the target is found in the first position of the element set. C(n) = 1
• Worst-case: the target is found (or reported missing) only after comparing every element with the target value. C(n) = n
• Average-case: depends on the probability (p) that the target will be found; with the target equally likely in any position, C(n) = n/2
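Below is a minimal C++ sketch of this sequential search (the function name sequentialSearch and the sample data are illustrative, not from the slides). The comments mark where the best and worst cases arise; the average case is roughly n/2 comparisons when the target is equally likely to be in any position.

#include <iostream>
using namespace std;

// Sequential search: returns the index of target in arr[0..n-1], or -1 if absent.
int sequentialSearch(const int arr[], int n, int target) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == target)
            return i;      // best case: found at i == 0, so C(n) = 1
    }
    return -1;             // worst case: every element compared, so C(n) = n
}

int main() {
    int data[] = {7, 3, 9, 1, 5};
    int n = 5;
    cout << sequentialSearch(data, n, 7) << "\n";   // best case: 1 comparison
    cout << sequentialSearch(data, n, 5) << "\n";   // worst case: n comparisons, target last
    cout << sequentialSearch(data, n, 4) << "\n";   // worst case: n comparisons, target absent
    return 0;
}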
Big O Notation
• Complexity time can be represented by Big ‘O’ notation.
• Big ‘O’ notation is written as O(f(n)), where
O – “on the order of”
f(n) – the algorithm’s growth-rate function, which may be 1, logₓ n, n, n logₓ n, n², …
• An algorithm requires time proportional to f(n); O(f(n)) means "order of f(n)".
Big O Notation
• Notation used to show the complexity time of algorithms.

Notation      Execution time / number of steps
O(1)          Constant function, independent of the input size n.
              Example: finding the first element of a list.
O(logₓ n)     Problem complexity increases slowly as the problem size increases; squaring the problem size only doubles the time.
              Characteristic: solve a problem by splitting it into constant fractions of the problem (e.g., throw away ½ at each step).
O(n)          Problem complexity increases linearly with the size of the input, n.
              Example: counting the elements in a list.
Big O Notation

Notation      Execution time / number of steps
O(n logₓ n)   Log-linear increase – problem complexity increases a little faster than n.
              Characteristic: divide the problem into subproblems that are solved the same way.
              Example: mergesort.
O(n²)         Quadratic increase – problem complexity increases fairly fast, but is still manageable.
              Characteristic: two nested loops of size n.
O(n³)         Cubic increase – practical only for small input size n.
O(2ⁿ)         Exponential increase – increases too rapidly to be practical; generally unmanageable for any meaningful n.
              Example: find all subsets of a set of n elements.
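As an illustration of the O(2ⁿ) row above (find all subsets of a set of n elements), the following bitmask loop is one common way to enumerate all 2ⁿ subsets; it is a sketch, not code from the slides:

#include <iostream>
using namespace std;

int main() {
    int elems[] = {1, 2, 3};   // example set; any n elements would do
    int n = 3;
    // There are 2^n subsets; each bit of mask decides whether elems[i] is included.
    for (int mask = 0; mask < (1 << n); mask++) {
        cout << "{ ";
        for (int i = 0; i < n; i++) {
            if (mask & (1 << i))
                cout << elems[i] << " ";
        }
        cout << "}\n";
    }
    return 0;
}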
Big O Notation

• Example of algorithms (counting only the cout operation):

O(1) – Constant
int counter = 1;
cout << "Welcome to C++ " << counter << "\n";

O(logₓ n) – Logarithmic
int counter = 1;
int i = 0;
for (i = x; i <= n; i = i * x) {   // x must be greater than 1
    cout << "Welcome to C++ " << counter << "\n";
    counter++;
}
Order of increasing complexity
Order of growth for some common functions:
• O(1) < O(logₓ n) < O(n) < O(n log₂ n) < O(n²) < O(n³) < O(2ⁿ)

Notation       n = 8    n = 16    n = 32
O(log₂ n)      3        4         5
O(n)           8        16        32
O(n log₂ n)    24       64        160
O(n²)          64       256       1024
O(n³)          512      4096      32768
O(2ⁿ)          256      65536     4294967296
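The table above can be checked with a short C++ sketch (not part of the original slides) that evaluates each growth-rate function for n = 8, 16 and 32:

#include <iostream>
#include <iomanip>
#include <cmath>
using namespace std;

int main() {
    double sizes[] = {8, 16, 32};
    cout << fixed << setprecision(0);          // all values here are whole numbers
    for (double n : sizes) {
        cout << "n = " << setw(2) << n
             << "  log2(n) = " << log2(n)
             << "  n*log2(n) = " << n * log2(n)
             << "  n^2 = " << pow(n, 2)
             << "  n^3 = " << pow(n, 3)
             << "  2^n = " << pow(2.0, n) << "\n";
    }
    return 0;
}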
Order-of-Magnitude Analysis and Big O Notation

[Figure: growth-rate function value versus n for 1, log₂ n, n, n log₂ n, n², n³ and 2ⁿ]
Big O Notation
• Example of algorithms for common functions:

O(n) – Linear
int counter = 1;
int i = 0;
for (i = 1; i <= n; i++) {
    cout << "Welcome to C++ " << counter << "\n";
    counter++;
}

O(n logₓ n) – Linear logarithmic
int counter = 1;
int i = 0;
int j = 1;
for (i = x; i <= n; i = i * x) {   // x must be greater than 1
    while (j <= n) {
        cout << "Welcome to C++ " << counter << "\n";
        counter++;
        j++;
    }
    j = 1;
}
Big O Notation
• Example of algorithm for a common function:

O(n²) – Quadratic
int counter = 1;
int i = 0;
int j = 0;
for (i = 1; i <= n; i++) {
    for (j = 1; j <= n; j++) {
        cout << "Welcome to C++ " << counter << "\n";
        counter++;
    }
}
Big O Notation
• Example of algorithm for a common function:

O(n³) – Cubic
int counter = 1;
int i = 0;
int j = 0;
int k = 0;
for (i = 1; i <= n; i++) {
    for (j = 1; j <= n; j++) {
        for (k = 1; k <= n; k++) {
            cout << "Welcome to C++ " << counter << "\n";
            counter++;
        }
    }
}
Big O Notation
• Example of algorithm for a common function:

O(2ⁿ) – Exponential
int counter = 1;
int i = 1;
int j = 1;
while (i <= n) {   // compute j = 2^n
    j = j * 2;
    i++;
}
for (i = 1; i <= j; i++) {   // loop 2^n times
    cout << "Welcome to C++ " << counter << "\n";
    counter++;
}
Determine the complexity time of an algorithm
The complexity time can be determined:
• theoretically – by calculation
• practically – by experiment or implementation
Determine the complexity time of an algorithm – practically

– Implement the algorithm in any programming language and run the program.
– The result depends on the compiler, computer, input data and programming style.
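A minimal sketch of this practical approach, assuming the O(n) loop from the earlier slides as the code under test and using std::chrono to time it; as noted above, the measured time still depends on the compiler, machine and load:

#include <iostream>
#include <chrono>
using namespace std;

int main() {
    long n = 10000000;   // input size chosen for the experiment
    long counter = 0;

    auto start = chrono::steady_clock::now();
    for (long i = 1; i <= n; i++) {   // the O(n) loop being measured
        counter++;
    }
    auto stop = chrono::steady_clock::now();

    auto ms = chrono::duration_cast<chrono::milliseconds>(stop - start).count();
    // Printing counter keeps the loop from being optimized away entirely.
    cout << "n = " << n << ", counter = " << counter
         << ", elapsed = " << ms << " ms\n";
    return 0;
}

Running the program for several values of n and comparing the elapsed times gives an empirical picture of the growth rate.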
Determine the complexity time of an algorithm – theoretically
• The complexity time is related to the number of steps/operations.
• Complexity time can be determined by:
1. Counting the number of steps and then finding the class of complexity.
2. Finding the complexity time for each step and then summing the total.
Determine the number of steps
• The following algorithm is categorized as O(n).

int counter = 1;
int i = 0;
for (i = 1; i <= n; i++) {
    cout << "Welcome to C++ ";
    cout << counter << "\n";
    counter++;
}
Determine the number of steps

Num   Statement
1     int counter = 1;
2     int i = 0;
3     i = 1
4     i <= n
5     i++
6     cout << "Welcome to C++ " << counter << "\n";
7     counter++;
Determine the number of steps
• Statements 3, 4 & 5 are the loop control and can be treated as one statement.

Num   Statement
1     int counter = 1;
2     int i = 0;
3     i = 1; i <= n; i++
6     cout << "Welcome to C++ " << counter << "\n";
7     counter++;
Determine the number of steps – summation series
• Statements 3, 6 & 7 are in the repetition structure.
• They can be expressed by a summation series:

∑_{i=1}^{n} f(i) = f(1) + f(2) + . . . + f(n) = n

where
f(i) – the statement executed in the loop
Determine the number of steps – summation series
• Example: if n = 5 and i starts at 1,

∑_{i=1}^{5} f(i) = f(1) + f(2) + f(3) + f(4) + f(5) = 5

The statement represented by f(i) will be repeated 5 times.
Determine the number of steps – summation series
• Example: if n = 5 and i starts at 3,

∑_{i=3}^{5} f(i) = f(3) + f(4) + f(5) = 3

The statement represented by f(i) will be repeated 3 times.
Determine the number of steps – summation series
• Example: if n = 1 and i starts at 1,

∑_{i=1}^{1} f(i) = f(1) = 1

The statement represented by f(i) will be executed only once.
Determine the number of steps

Statement                                        Number of steps
int counter = 1;                                 ∑_{i=1}^{1} f(i) = 1
int i = 0;                                       ∑_{i=1}^{1} f(i) = 1
i = 1; i <= n; i++                               ∑_{i=1}^{n} f(i) = n
cout << "Welcome to C++ " << counter << "\n";    ∑_{i=1}^{n} f(i) · ∑_{i=1}^{1} f(i) = n · 1 = n
counter++;                                       ∑_{i=1}^{n} f(i) · ∑_{i=1}^{1} f(i) = n · 1 = n
Determine the number of steps

• Total steps:

1 + 1 + n + n + n = 2 + 3n

• Consider the largest factor (3n) and drop the coefficient.
• The algorithm complexity can therefore be categorized as O(n).
Determine the number of steps
- exercise
Count the number of steps and find the Big ‘O’ notation for the following algorithm:

int counter = 1;
int i = 0;
int j = 1;

for (i = 3; i <= n; i = i * 3) {
    while (j <= n) {
        cout << "Welcome to C++ " << counter << "\n";
        counter++;
        j++;
    }
    j = 1;
}
Determine the number of steps
- solution

Statement                    Number of steps
int counter = 1;             ∑_{i=1}^{1} f(i) = 1
int i = 0;                   ∑_{i=1}^{1} f(i) = 1
int j = 1;                   ∑_{i=1}^{1} f(i) = 1
i = 3; i <= n; i = i * 3     ∑_{i=3}^{n} f(i) = f(3) + f(9) + f(27) + … + f(n) = log₃ n
j <= n                       ∑_{i=3}^{n} f(i) · ∑_{j=1}^{n} f(j) = log₃ n · n
Determine the number of steps
- solution

Statement                                        Number of steps
cout << "Welcome to C++ " << counter << "\n";    ∑_{i=3}^{n} f(i) · ∑_{j=1}^{n} f(j) · ∑_{i=1}^{1} f(i) = log₃ n · n · 1
counter++;                                       ∑_{i=3}^{n} f(i) · ∑_{j=1}^{n} f(j) · ∑_{i=1}^{1} f(i) = log₃ n · n · 1
j++;                                             ∑_{i=3}^{n} f(i) · ∑_{j=1}^{n} f(j) · ∑_{i=1}^{1} f(i) = log₃ n · n · 1
Determine the number of steps
- solution
Total steps:

=> 1 + 1 + 1 + log₃ n + log₃ n · n + log₃ n · n · 1 + log₃ n · n · 1 + log₃ n · n · 1

=> 3 + log₃ n + log₃ n · n + log₃ n · n + log₃ n · n + log₃ n · n

=> 3 + log₃ n + 4n log₃ n


Determine the number of steps
- solution
3 + log₃ n + 4n log₃ n
• Consider the largest factor: 4n log₃ n
• Remove the coefficient: n log₃ n
• In asymptotic classification, the base of the log can be omitted, using the change-of-base formula:
  log_a n = log_b n / log_b a
• Thus, log₃ n = log₂ n / log₂ 3 = log₂ n / 1.58…
• Remove the coefficient 1/1.58…
• So the complexity time of the algorithm is O(n log₂ n).
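As a sanity check (not part of the slides), the sketch below counts how many times the innermost statements of the exercise actually execute and compares the count with n · log₃ n, which is consistent with the O(n log₂ n) classification:

#include <iostream>
#include <cmath>
using namespace std;

int main() {
    long sizes[] = {27, 81, 243};   // powers of 3, so log3(n) is exact
    for (long n : sizes) {
        long steps = 0;             // counts executions of the innermost statements
        long j = 1;
        for (long i = 3; i <= n; i = i * 3) {   // about log3(n) outer passes
            while (j <= n) {                    // n inner passes per outer pass
                steps++;
                j++;
            }
            j = 1;
        }
        cout << "n = " << n
             << "  measured inner steps = " << steps
             << "  n * log3(n) = " << n * (log((double)n) / log(3.0)) << "\n";
    }
    return 0;
}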
Conclusion and Summary
• Algorithm analysis studies the efficiency of algorithms as the input size grows, based on the number of steps and the amount of computer time and space.
• It can be done using Big O notation and the growth of functions.
• Order of growth for some common functions:
  O(1) < O(logₓ n) < O(n) < O(n log₂ n) < O(n²) < O(n³) < O(2ⁿ)
• The three possible cases in algorithm analysis are the best case, average case and worst case.