L1 AlgorithmAnalysis

This document discusses analyzing the time complexity of algorithms. It explains that algorithms can be compared based on their growth rates as the problem size increases. Common time complexities like constant, linear, quadratic, and cubic time are introduced. Big O notation is used to describe the asymptotic upper bound of an algorithm's growth rate.


Analysis of Algorithms

Initially prepared by Dr. İlyas Çiçekli; improved by various Bilkent CS202 instructors.

CS202 - Fundamentals of Computer Science II 1


Algorithm
• An algorithm is a set of instructions to be followed to solve a problem.
– There can be more than one solution (more than one algorithm) to solve a given problem.
– An algorithm can be implemented using different programming languages on different platforms.

• Once we have a correct algorithm for the problem, we have to determine the efficiency of that algorithm.
– How much time that algorithm requires.
– How much space that algorithm requires.

• We will focus on
– How to estimate the time required for an algorithm
– How to reduce the time required



Analysis of Algorithms
• How do we compare the time efficiency of two algorithms that solve
the same problem?

• We should employ mathematical techniques that analyze algorithms independently of specific implementations, computers, or data.

• To analyze algorithms:
– First, we count the number of significant operations in a
particular solution to assess its efficiency.
– Then, we will express the efficiency of algorithms using growth
functions.



Analysis of Algorithms
• Simple instructions (+,-,*,/,=,if,call) take 1 step

• Loops and subroutine calls are not simple operations

– They depend on size of data and the subroutine

– “sort” is not a single step operation

– Complex operations (matrix addition, array resizing) are not single-step

• We assume infinite memory

• We do not include the time required to read the input


The Execution Time of Algorithms
Consecutive statements
Times
count = count + 1; 1
sum = sum + count; 1

Total cost = 1 + 1
🡺 The time required for this algorithm is constant

Don’t forget: We assume that each simple operation takes one unit of time



The Execution Time of Algorithms
If-else statements
Times
if (n < 0){ 1
absval = -n 1
cout << absval; 1

}
else
absval = n; 1

Total Cost <= 1 + max(2,1)


The Execution Time of Algorithms
Single loop statements
Times
i = 1; 1
sum = 0; 1
while (i <= n) { n+1
i = i + 1; n
sum = sum + i; n
}

Total cost = 1 + 1 + (n + 1) + n + n
🡺 The time required for this algorithm is proportional to n
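The tally above can be double-checked by instrumenting the loop with a step counter, one unit per simple operation (a sketch; `countedLoop` is an illustrative name, not from the slides):

```cpp
#include <cassert>

// Instrumented version of the loop above: count one time unit per
// simple operation. The total should come out to 3n + 3.
long countedLoop(int n) {
    long steps = 0;
    int i = 1;   steps++;             // i = 1                 -> 1
    int sum = 0; steps++;             // sum = 0               -> 1
    while (true) {
        steps++;                      // loop test (i <= n)    -> n + 1 times
        if (!(i <= n)) break;
        i = i + 1;     steps++;       // -> n times
        sum = sum + i; steps++;       // -> n times
    }
    (void)sum;                        // sum itself is not needed for the count
    return steps;
}
```

For n = 10 this returns 33 = 3 · 10 + 3, matching the total cost above.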



The Execution Time of Algorithms
Nested loop statements
Times
i = 1; 1
sum = 0; 1
while (i <= n) { n+1
j=1; n
while (j <= n) { n * (n+1)
sum = sum + i; n*n
j = j + 1; n*n
}
i = i +1; n
}

Total cost = 1 + 1 + (n + 1) + n + n * (n + 1) + n * n + n * n + n
🡺 The time required for this algorithm is proportional to n²
Algorithm Growth Rates
• We measure the time requirement of an algorithm as a function of the
problem size.
• The most important thing is to learn how quickly the time requirement
of an algorithm grows as a function of the problem size.
• An algorithm’s proportional time requirement is known as its growth rate.
• We can compare the efficiency of
two algorithms by comparing
their growth rates.

(Figure: the time requirement as a function of the problem size n)



Order-of-Magnitude Analysis and
Big-O Notation

• If Algorithm A requires time at most proportional to f(n), it is said to be order f(n), denoted O(f(n))

• f(n) is called the algorithm’s growth-rate function

• Since the capital O is used in the notation, this notation is called the Big-O notation



Big-O Notation
Definition:

• Algorithm A is order f(n) if it requires no more than c · f(n) time units to solve a problem of size n ≥ n0
– There may exist many values of c and n0

• More informally, f(n) is an upper bound on the growth rate of A



O-notation: Asymptotic upper bound

T(n) = O(f(n)) if ∃ positive constants c, n0 such that
0 ≤ T(n) ≤ c·f(n), ∀n ≥ n0

(Figure: T(n) lies below c·f(n) for all n ≥ n0)

Asymptotic running times of algorithms are usually defined by functions whose domain is N = {0, 1, 2, …} (natural numbers)

Big-O Notation

(Figure: three panels showing O(f(n)) with T(n) ≤ c·f(n), Ω(f(n)) with T(n) ≥ c·f(n), and Θ(f(n)) with c2·f(n) ≤ T(n) ≤ c1·f(n))

• The Big-O definition implies that there is a constant n0 beyond which the bound is satisfied

• We do not care about small values of n



Example

Show that 2n² = O(n³)

We need to find two positive constants c and n0 such that:

0 ≤ 2n² ≤ cn³ for all n ≥ n0

Choose c = 2 and n0 = 1
🡺 2n² ≤ 2n³ for all n ≥ 1

Or, choose c = 1 and n0 = 2
🡺 2n² ≤ n³ for all n ≥ 2

Example

Show that 2n² + n = O(n²)

We need to find two positive constants c and n0 such that:

0 ≤ 2n² + n ≤ cn² for all n ≥ n0
⇔ 2 + (1/n) ≤ c for all n ≥ n0

Choose c = 3 and n0 = 1
🡺 2n² + n ≤ 3n² for all n ≥ 1
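The chosen constants can also be sanity-checked numerically over a range of n (a sketch; `boundHolds` is an illustrative helper, not from the slides):

```cpp
#include <cassert>

// Check that 2n^2 + n <= 3n^2 holds for every n from 1 up to nMax,
// i.e., that the constants c = 3 and n0 = 1 chosen above really work.
bool boundHolds(long nMax) {
    for (long n = 1; n <= nMax; n++) {
        if (!(2 * n * n + n <= 3 * n * n))
            return false;
    }
    return true;
}
```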

Example
• Show that f(n) = n² - 3n + 10 is O(n²)
– Show that there exist constants c and n0 that satisfy the condition

Try c = 3 and n0 = 2
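The check for these constants can be written out as follows:

```latex
n^2 - 3n + 10 \le 3n^2 \iff 10 \le 2n^2 + 3n
```

At n = 2 the right-hand side is 2·4 + 3·2 = 14 ≥ 10, and 2n² + 3n is increasing for n ≥ 0, so n² - 3n + 10 ≤ 3n² holds for all n ≥ 2.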



True or False?

10⁹n² = O(n²)   True
Choose c = 10⁹ and n0 = 1: 0 ≤ 10⁹n² ≤ 10⁹n² for n ≥ 1

100n^1.9999 = O(n²)   True
Choose c = 100 and n0 = 1: 0 ≤ 100n^1.9999 ≤ 100n² for n ≥ 1

10⁻⁹n^2.0001 = O(n²)   False
We would need 10⁻⁹n^2.0001 ≤ cn² for n ≥ n0, i.e., 10⁻⁹n^0.0001 ≤ c for all n ≥ n0, a contradiction since the left-hand side grows without bound

A Comparison of Growth-Rate Functions

(Figures comparing the growth-rate functions, not reproduced.)


A Comparison of Growth-Rate Functions
• Any algorithm with n! complexity is useless for n ≥ 20

• Algorithms with 2ⁿ running time are impractical for n ≥ 40

• Algorithms with n² running time are usable up to n = 10,000
– But not useful for n > 1,000,000

• Linear time (n) and n log n algorithms remain practical even for one billion items

• Algorithms with log n complexity are practical for any value of n
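To see concretely why n log n stays practical for a billion items while n² does not, compare the two at large n (a minimal sketch; the helper names are illustrative):

```cpp
#include <cassert>
#include <cmath>

// Evaluate two growth-rate functions at a given problem size n.
double nLogN(double n)    { return n * std::log2(n); }
double nSquared(double n) { return n * n; }
```

For n = 10⁹, n · log₂ n is about 3 × 10¹⁰ while n² is 10¹⁸, a gap of nearly eight orders of magnitude.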



Properties of Growth-Rate Functions
1. We can ignore the low-order terms
– If an algorithm is O(n³ + 4n² + 3n), it is also O(n³)
– Use only the highest-order term to determine its growth rate

2. We can ignore a multiplicative constant in the highest-order term
– If an algorithm is O(5n³), it is also O(n³)

3. O( f(n) ) + O( g(n) ) = O( f(n) + g(n) )
– If an algorithm is O(n³) + O(4n²), it is also O(n³ + 4n²) 🡺 So, it is O(n³)
– Similar rules hold for multiplication



Some Useful Mathematical Equalities
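The figure for this slide is not reproduced; the equalities usually listed at this point in algorithm-analysis notes (all standard) are:

```latex
\sum_{i=1}^{n} i = \frac{n(n+1)}{2}, \qquad
\sum_{i=1}^{n} i^2 = \frac{n(n+1)(2n+1)}{6}, \qquad
\sum_{i=0}^{n} 2^i = 2^{n+1} - 1, \qquad
\sum_{i=0}^{n} a^i = \frac{a^{n+1} - 1}{a - 1} \quad (a \neq 1)
```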



Growth-Rate Functions
Remember our previous examples
Times
i = 1; 1
sum = 0; 1
while (i <= n) { n+1
i = i + 1; n
sum = sum + i; n
}
Total cost = 1 + 1 + (n + 1) + n + n = 3 * n + 3
🡺 The time required for this algorithm is proportional to n
🡺 The growth rate of this algorithm is O(n)



Growth-Rate Functions
Times
i = 1; 1
sum = 0; 1
while (i <= n) { n+1
j=1; n
while (j <= n) { n * (n + 1)
sum = sum + i; n*n
j = j + 1; n*n
}
i = i +1; n
}
Total cost = 1 + 1 + (n + 1) + n + n * (n + 1) + n * n + n * n + n
Total cost = 3 * n² + 4 * n + 3
🡺 The time required for this algorithm is proportional to n²
🡺 The growth rate of this algorithm is O(n²)
What to Analyze
• Worst-case performance
– It is an upper bound for any input
– Its use is more common than the others

• Best-case performance
– This is useless! Why?

• Average-case performance
– It is valid if you can figure out what the “average” input is
– It is computed considering all possible inputs and their distribution
– It is usually difficult to compute



Consider the sequential search algorithm

int sequentialSearch(const int a[], int item, int n){
    for (int i = 0; i < n; i++)
        if (a[i] == item)
            return i;
    return -1;
}

Worst-case:
– If the item is in the last location of the array; or
– If it is not found in the array
Best-case:
– If the item is in the first location of the array
Average-case:
– How can we compute it? (assume each element is unique)
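One way to answer: if the item is equally likely to be at any of the n positions, searching for the element at index i costs i + 1 comparisons, so the average is (1 + 2 + … + n)/n = (n + 1)/2, which is O(n). A sketch (the helper name is illustrative, not from the slides):

```cpp
#include <cassert>

// Average number of comparisons sequentialSearch makes on a successful
// search, assuming the target is equally likely to be at each index.
double averageComparisons(int n) {
    long total = 0;
    for (int i = 0; i < n; i++)
        total += i + 1;   // item at index i costs i + 1 comparisons
    return static_cast<double>(total) / n;
}
```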
How to find the growth rate of C++ code?



Some Examples



for (int i = 0; i < n; i++)
{
    printf("%d", i);   // n iterations of a constant-time body 🡺 O(n)
}

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        printf("%d - %d\n", i, j);   // executed n * n times 🡺 O(n²)
    }
}

for (int i = 0; i < n; i++)
{
    printf("%d\n", i);
}

for (int i = 0; i < n; i++)
{
    printf("%d\n", i);
}
// O(n) + O(n) 🡺 O(n)

printf("Hello\n");               // 1 step

for (int i = 0; i < n/2; i++)    // n/2 iterations
{
    printf("%d\n", i);
}

for (int i = 0; i < 100; i++)    // 100 iterations (constant)
{
    printf("Hi\n");
}
// 1 + n/2 + 100 🡺 O(n)

for (int i = 0; i < n; i++)
{
    printf("%d\n", i);
}

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        printf("%d\n", i + j);
    }
}
// O(n) + O(n²) 🡺 O(n²)

int foo(int i, int j, int n)
{
    for (int k = 0; k < n; k++)
    {
        printf("%f\n", (double)(i + j) / n);  // cast so %f receives a double
    }
    return 0;                                 // foo is declared int, so it must return a value
}

for (int i = 0; i < n; i++)
{
    printf("%d\n", i);
}

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        printf("%d\n", foo(i, j, n));  // each call to foo is O(n)
    }
}
// O(n) + O(n²) * O(n) 🡺 O(n³)

bool arrayContainsElement(int arr[], int n, int element)
{
    for (int i = 0; i < n; i++)
    {
        if (arr[i] == element) return true;
    }
    return false;   // worst case scans all n elements 🡺 O(n)
}

What about recursive functions?



Consider the problem of Hanoi towers

void hanoi(int n, char source, char dest, char spare) {
    if (n > 0) {
        hanoi(n - 1, source, spare, dest);
        cout << "move from " << source << " to " << dest << "\n";  // move one disk
        hanoi(n - 1, spare, dest, source);
    }
}
http://www.cut-the-knot.org/recurrence/hanoi.shtml

How do we find the growth rate of the recursive hanoi function?

• First write a recurrence equation for the hanoi function
• Then solve the recurrence equation
– There are many methods to solve recurrence equations
• We will learn a simple one known as repeated substitutions
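Before solving the recurrence analytically, the growth can be sanity-checked by counting the moves directly (a sketch; `hanoiMoves` is an illustrative name, not from the slides):

```cpp
#include <cassert>

// Count the disk moves hanoi makes for n disks: each call makes one
// move plus two recursive calls on n - 1 disks.
long hanoiMoves(int n) {
    if (n <= 0) return 0;
    return hanoiMoves(n - 1) + 1 + hanoiMoves(n - 1);
}
```

The counts 1, 3, 7, 15, … suggest 2^n - 1 moves, i.e., O(2^n) time.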



Let’s first write a recurrence equation for the hanoi function. Counting one unit per disk move, with T(0) = 0:

T(n) = 2T(n - 1) + 1

We will then solve it by using repeated substitutions:

T(n) = 2T(n - 1) + 1
     = 2(2T(n - 2) + 1) + 1 = 4T(n - 2) + 3
     = 4(2T(n - 3) + 1) + 3 = 8T(n - 3) + 7
     ...
     = 2^k T(n - k) + 2^k - 1
     = 2^n T(0) + 2^n - 1 = 2^n - 1        (substituting k = n)

🡺 The growth rate of hanoi is O(2^n)


More examples
• Factorial function
• Binary search
• Merge sort – later



Factorial
int factorial(int n)
{
    // Base case: n <= 1 (also guards against infinite recursion when n = 0)
    if (n <= 1)
        return 1;

    return n * factorial(n - 1);
}
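Counting one unit of work per call, a recurrence for factorial (a standard analysis, not spelled out on the slide) is:

```latex
T(1) = 1, \qquad T(n) = T(n-1) + 1
```

Repeated substitution gives T(n) = T(n - k) + k = T(1) + (n - 1) = n, so factorial is O(n).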

Binary Search
int binarySearch( int a[], int left, int right, int x) {
    int mid;  // mid will be the index of
              // target when it’s found.
    if (right >= left){
        mid = left + (right - left) / 2;
        if (a[mid] == x)
            return mid;
        if (a[mid] > x)
            return binarySearch(a, left, mid - 1, x);
        return binarySearch(a, mid + 1, right, x);
    }

    return -1;
}
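Since each call does a constant amount of work and then recurses on half the range, a recurrence for binary search (a standard analysis) is:

```latex
T(1) = 1, \qquad T(n) = T(n/2) + 1
```

Repeated substitution gives T(n) = T(n / 2^k) + k; taking k = log₂ n yields T(n) = 1 + log₂ n, so binary search is O(log n).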

