
Asymptotic Analysis


Asymptotic Notations

Amjad Ali
Assistant Professor
Department of Computer Science
School of Systems and Technology
University of Management and Technology

Email: amjad.ali@umt.edu.pk
Asymptotic Notations
How do we compare algorithms?
(1) Compare execution times?
Not good: times are specific to a particular computer.

(2) Count the number of statements executed?
Not good: the number of statements varies with the programming
language as well as the style of the individual programmer.

(3) Ideal Solution

o Express running time as a function of the input size n (i.e., f(n)).

o Compare different functions corresponding to running times.

o Such an analysis is independent of machine speed and programming style.

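As an illustration of point (3) (a sketch not taken from the slides; the function name is hypothetical), a routine can be instrumented to report its operation count as a function of the input size n:

```python
def linear_search_ops(arr, target):
    """Linear search instrumented to count its basic operations,
    so the running time can be expressed as a function of n = len(arr)."""
    ops = 0
    for item in arr:
        ops += 1              # one comparison per element examined
        if item == target:
            return ops
    return ops

# Worst case (target absent): exactly n comparisons, i.e. f(n) = n.
```

The count depends only on n, not on the machine or language, which is exactly the independence described above.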

Asymptotic Notations

When we study algorithms, we are usually interested in
• the order of growth of the running time of an algorithm,
• not in the exact running time.

We need to develop a way to talk about the rate of growth of functions so
that we can compare algorithms.

Asymptotic notation gives us a method for classifying functions
according to their rate of growth.
Asymptotic Notation
Lower-order terms and coefficients
Suppose the time (or the number of steps) it takes to complete a problem of
size n is found to be

T(n) = 4 n2 + 50 n + 2.

1. As n grows large, the n2 term comes to dominate, so all other terms
can be neglected.
- for n = 50000, 4 n2 is 4000 times as large as 50 n.

2. The coefficients become irrelevant if we compare to any other order of
expression, such as an expression containing an n3 term.

if T(n) = 1,000,000 n2 and U(n) = n3,
U(n) will always exceed T(n) once n grows larger than 1,000,000.
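The crossover claim can be checked numerically (a quick sketch, not part of the slides):

```python
# T(n) = 1,000,000 n^2 versus U(n) = n^3: U overtakes T once n > 1,000,000.
def T(n):
    return 1_000_000 * n * n

def U(n):
    return n ** 3

assert U(1_000_000) == T(1_000_000)   # equal exactly at the crossover
assert U(1_000_001) > T(1_000_001)    # beyond it, the n^3 term dominates
```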
Asymptotic Notation
Least Terms and coefficients

T(n) = 10 n2        U(n) = n3

n          10 n2             n3
1          10                1
10         1000              1000
100        100000            1000000
1000       10000000          1000000000
10000      1000000000        1000000000000
Asymptotic Notation
Lower-order terms and coefficients

3. Additionally, the number of steps depends on the details of the machine
model on which the algorithm runs, but different types of machines
typically vary by only a constant factor in the number of steps needed to
execute an algorithm.

Developers are interested in finding a function T(n) that expresses how
long the algorithm will take to run in terms of the number of elements in
the input set.
Asymptotic Notation

Criteria
1. Capture the behavior as n grows large (n0 ≤ n → ∞).
2. Functions that differ by at most a constant multiplier are considered
equivalent.

In one machine, an algorithm may take T(n) = 15 n3 + n2 + 4


On another, an algorithm may take T(n) = 5 n3 + 4 n + 5

Both will belong to the same class of functions, namely "cubic functions of
n". Function classes can be thought of as "how many times we do something
to each input".
Big O
Definition:
f(n) ε O(g(n)) iff there are two positive constants c and n0
such that
0 ≤ f(n) ≤ c g(n) for all n ≥ n0
Big O
Definition:
f(n) ε O(g(n)) iff there are two positive constants c and n0
such that
0 ≤ f(n) ≤ c g(n) for all n ≥ n0

• We say that
f(n) is big-O of g(n)
Set of all functions whose rate of growth is the same as or lower than that of g(n).

• As n increases, f(n) grows no faster than g(n).


In other words, g(n) is an asymptotic upper bound on f(n).
Example # 1 - Big O

n2 + n = O(n3)

f(n) = n2 + n
g(n) = n3

Notice that n ≤ n3 holds for n ≥ 1
and n2 ≤ n3 holds for n ≥ 1.
(In general, na ≤ nb holds for a ≤ b when n ≥ 1.)

Therefore,
n2 + n ≤ n3 + n3
n2 + n ≤ 2 n3
n2 + n ≤ 2 n3 holds for all n ≥ n0 , with n0 = 1 and c = 2

=> n2 + n = O(n3)

Note: na ≤ nb whenever n ≥ 1 and a ≤ b. This fact is used often in these
types of situations.
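The witness pair c = 2, n0 = 1 can be spot-checked over a range of n (an illustrative check, not a proof):

```python
def f(n):
    return n * n + n     # f(n) = n^2 + n

def g(n):
    return n ** 3        # g(n) = n^3

c, n0 = 2, 1
# Definition of Big O: 0 <= f(n) <= c*g(n) for all n >= n0.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 1000))
```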


Big O
Strategies for Big-O
Sometimes the easiest way to prove that
f(n) = O(g(n))
is to take c to be the sum of the positive coefficients of f(n).
(We can usually ignore the negative
coefficients.)

f(n) = n2 + n
g(n) = n3
n2 + n ≤ c n3 (taking c = 2 as the sum of coefficients, 1 + 1 = 2)

Good Strategy:
A good strategy is to pick a value of c which you think will work, and
then determine which value of n0 is needed.
Example # 2 - Big O

n2 + 2n + 1 = O(n2)

f(n) = n2 + 2n + 1
g(n) = n2

As n ≤ n2 holds for n ≥ 1
and 1 ≤ n2 holds for n ≥ 1,
n2 + 2n + 1 ≤ n2 + 2n2 + n2
n2 + 2n + 1 ≤ 4 n2 holds for all n ≥ n0 , with n0 = 1 and c = 4

=> n2 + 2n + 1 = O(n2)

OR, as 2n ≤ n2 holds for n ≥ 2
and 1 ≤ n2 holds for n ≥ 1,
n2 + 2n + 1 ≤ n2 + n2 + n2
n2 + 2n + 1 ≤ 3 n2 holds for all n ≥ n0 , with n0 = 2 and c = 3

=> n2 + 2n + 1 = O(n2)
Big O
Observe that in the relationship
f(n) is O(n2),
n2 can be replaced by any function with larger values than n2 , for example
f(n) is O(n3)
f(n) is O(n4)
and so on.
Note: if c and n0 are one pair of witnesses, then any pair c' and n0' where
c < c' and n0 < n0' is also a pair of witnesses, because
f(n) ≤ c g(n) ≤ c' g(n) whenever n > n0' > n0

• f(n) is O(g(n)) or f(n) ε O(g(n)) is sometimes written
f(n) = O(g(n))
• Here, the equal sign does not mean equality; rather, this notation tells us that an
inequality holds relating the values of the functions f and g for sufficiently large
numbers in the domains of these functions.
• It is acceptable to write
f(n) ε O(g(n))
because O(g(n)) represents the set of functions that are O(g(n)).
Example # 3 - Big O

7n2 = O(n3)

f(n) = 7 n2
g(n) = n3

Is 7 n2 ≤ n3 ?
7 ≤ 1 ----- false for n = 1
28 ≤ 8 ----- false for n = 2
63 ≤ 27 ----- false for n = 3
----------------------
252 ≤ 216 ----- false for n = 6
343 ≤ 343 ----- true for n = 7

Therefore,
7 n2 ≤ 1 . n3 holds for all n ≥ n0 , with n0 = 7 and c = 1
=> 7 n2 = O(n3)

OR
7 n2 ≤ 7 . n3 holds for all n ≥ n0 , with n0 = 1 and c = 7
(c = 7 and n0 = 1 are also witnesses to the relationship)

=> 7 n2 = O(n3)
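Both witness pairs for this example can be spot-checked, along with the fact that c = 1 fails just below n0 = 7 (illustrative only, not a proof):

```python
def f(n):
    return 7 * n * n     # f(n) = 7n^2

def g(n):
    return n ** 3        # g(n) = n^3

for c, n0 in [(1, 7), (7, 1)]:                 # the two witness pairs above
    assert all(0 <= f(n) <= c * g(n) for n in range(n0, 1000))

assert f(6) > 1 * g(6)   # 252 > 216: c = 1 does not work below n0 = 7
```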
Example # 4 - Big O
n2 ≠ O(n)
f(n) = n2
g(n) = n

We must show that no pair of constants c and n0 exist such that n2 ≤ c n


whenever n > n0.

n2 ≤ c n
n ≤ c (dividing both sides by n)

• No matter what c and n0 are, the inequality n ≤ c cannot
hold for all n with n ≥ n0.

• Once we fix values of n0 and c, we see that when n is larger
than the maximum of n0 and c, it is not true that n ≤ c.
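The same argument can be illustrated numerically: whatever constant c is proposed, the inequality n2 ≤ c n breaks as soon as n exceeds c (a sketch, not a proof):

```python
# For any candidate constant c, pick n just past c and the inequality fails.
for c in [10, 100, 1000]:
    n = c + 1
    assert n * n > c * n   # n^2 <= c*n is violated, so n^2 is not O(n)
```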
Example # 5 - Big O
n3 ≠ O(7n2)
f(n) = n3
g(n) = 7 n2

n3 ≤ c (7 n2)
n ≤ 7c (dividing both sides by n)

• No c exists for which n ≤ 7c for all n ≥ n0 , no matter what n0 is,
because n can be made arbitrarily large.

• No witnesses (no pair of constants c and n0) exist here. Hence
n3 is not O(7n2)
Example # 6 - Big O

Estimate the sum of the first n positive integers:
1 + 2 + 3 + 4 + . . . . + n ?

Taking n = 4:
1+2+3+4 ≤ 4+4+4+4
10 ≤ 16

Taking n = 5:
1+2+3+4+5 ≤ 5+5+5+5+5
15 ≤ 25

As 1 ≤ n, 2 ≤ n, 3 ≤ n, . . . ,
1 + 2 + 3 + 4 + . . . . + n ≤ n + n + n + . . . . . + n
                            ≤ n . n
                            ≤ n2

1 + 2 + 3 + 4 + . . . . + n ≤ 1 . n2 holds for all n ≥ n0 , with n0 = 1 and c = 1

=> 1 + 2 + 3 + 4 + . . . . + n is O(n2)
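The bound can be spot-checked with the witnesses c = 1, n0 = 1 (an illustrative check, not a proof):

```python
# Illustrative check (not a proof) of 1 + 2 + ... + n <= 1 * n^2 for n >= 1.
for n in range(1, 500):
    total = sum(range(1, n + 1))   # 1 + 2 + ... + n
    assert 0 <= total <= n * n
```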
Example # 7 - Big O

Estimate the factorial function f(n) = n!
n! = 1 . 2 . 3 . 4 . 5 . . . . . . n ?

Taking n = 3:
1.2.3 ≤ 3.3.3
6 ≤ 27

Taking n = 4:
1.2.3.4 ≤ 4.4.4.4
24 ≤ 256

Therefore, as each factor is at most n (and n . n = n2 , n . n . n = n3 , . . . ,
n . n . n . . . = nn),
1 . 2 . 3 . 4 . . . . . n ≤ n . n . n . . . . . . n
                          ≤ nn

1 . 2 . 3 . 4 . . . . . . n ≤ 1 . nn holds for all n ≥ n0 , with n0 = 1 and c = 1

=> 1 . 2 . 3 . 4 . . . . . . n = O(nn)
Example # 8 - Big O
Estimate the logarithm of factorial function f(n) = log n !

As we know from the previous example,
1 . 2 . 3 . 4 . . . . . . n ≤ nn
or
n ! ≤ nn
log n ! ≤ log nn (taking log on both sides)
log n ! ≤ n log n
log n ! ≤ 1 . n log n holds for all n ≥ n0 , with n0 = 1 and c = 1

=> log n ! = O(n log n)
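This chain of bounds can be spot-checked with natural logarithms (the base does not affect the Big-O class; an illustrative check, not a proof):

```python
import math

# Illustrative check of log(n!) <= 1 * n*log(n) with witnesses c = 1, n0 = 1.
for n in range(1, 300):
    log_fact = sum(math.log(k) for k in range(1, n + 1))   # log(n!)
    assert log_fact <= n * math.log(n) + 1e-9              # float tolerance
```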


Ω - Omega
Big Ω
Definition:
f(n) ε Ω(g(n)) iff there are two positive constants c and n0
such that
0 ≤ c g(n) ≤ f(n) for all n ≥ n0
f(n) ε  (g(n) iff there are two
Example # 1 - Big  positive constants c and n0 such that
n3 + 4 n2 =  (n2) 0  c g(n)  f(n) for all n ≥ n0
f(n) = n3 + 4 n2
g(n) = n2
As n2 ≤ n 3
and n3 ≤ n3 + 4n2
Then n2 ≤ n3 ≤ n3 + 4n2
Therefore,
n2 ≤ n3 ≤ n3 + 4n2
n2 ≤ n3 + 4n2 or
n3 + 4n2 ≥ 1 . n2 holds for all n ≥ n0 , with n0 = 1 and c = 1
Therefore,
n3 + 4 n2 =  (n2) with n0 = 1 and c =1
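The Omega witnesses c = 1, n0 = 1 can be spot-checked against the definition 0 ≤ c g(n) ≤ f(n) (illustrative only):

```python
def f(n):
    return n ** 3 + 4 * n * n   # f(n) = n^3 + 4n^2

def g(n):
    return n * n                # g(n) = n^2

c, n0 = 1, 1
# Definition of Big Omega: 0 <= c*g(n) <= f(n) for all n >= n0.
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 1000))
```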
Big Ω
Definition:
f(n) ε Ω(g(n)) iff there are two positive constants c and n0
such that
0 ≤ c g(n) ≤ f(n) for all n ≥ n0

• We say that
f(n) is Omega of g(n)
Set of all functions whose rate of growth is the same as or higher than that of g(n).

• As n increases, f(n) grows no slower than g(n).

In other words, g(n) is an asymptotic lower bound on f(n).
Example # 2 - Big Ω

1 + 2 + 3 + 4 + . . . . + n ?

Taking n = 6:
1+2+3+4+5+6 ≥ 3+4+5+6
            ≥ 3+3+3+3
            ≥ (1+1+1+1) . 3
            ≥ (n - n/2 + 1) . (n/2)       (here n = 6, so n/2 = 3)

(For odd n, take the floor: e.g. for n = 5, 1+2+3+4+5 ≥ 3+4+5.)

Therefore,
1 + 2 + 3 + 4 + . . . . + n ≥ n/2 + (n/2 + 1) + (n/2 + 2) + . . . + n
                            ≥ n/2 + n/2 + n/2 + . . . + n/2
                            ≥ ( n - n/2 + 1 ) ( n/2 )
                            ≥ ( n/2 + 1 ) ( n/2 )
                            ≥ ( n/2 ) ( n/2 )
                            ≥ n2/4
                            ≥ 1/4 n2

1 + 2 + 3 + 4 + . . . . + n ≥ 1/4 n2 holds for all n ≥ n0 , with n0 = 1 and c = 1/4

Therefore, 1 + 2 + 3 + 4 + . . . . + n = Ω(n2) with n0 = 1 and c = 1/4


Big Ω

Strategies for Ω
Quite often, we have to pick c < 1.

– A good strategy is to pick a value of c which you think will work, and
  determine which value of n0 is needed.
– Being able to do a little algebra helps.
– We can sometimes simplify by ignoring terms of f(n) with positive
  coefficients. Why?
Θ - Big Theta
Big Θ
Definition:
f(n) ε Θ(g(n)) iff there are three positive constants c1, c2, and n0
such that
0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0
Example # 1 - Big Θ

n2 + 5 n + 7 = Θ(n2)

f(n) = n2 + 5 n + 7
g(n) = n2

Notice that n ≤ n2 holds for n ≥ 1
and 1 ≤ n2 holds for n ≥ 1.
n2 + 5 n + 7 ≤ n2 + 5 n2 + 7 n2
n2 + 5 n + 7 ≤ 13 n2 , holds when n ≥ n0 with n0 = 1 and c2 = 13
f(n) = O(g(n)) i.e., n2 + 5 n + 7 = O(n2)
and
n2 + 5 n + 7 ≥ n2 , holds when n ≥ n0 with n0 = 0 and c1 = 1
f(n) = Ω(g(n)) i.e., n2 + 5 n + 7 = Ω(n2)

=> n2 ≤ f(n) ≤ 13 n2 holds for n ≥ n0 , with n0 = 1 , c1 = 1 and c2 = 13

Hence n2 + 5 n + 7 = Θ(n2)

Note: f(n) = Θ( g(n) ) if and only if
f(n) = O(g(n)) and f(n) = Ω(g(n)).
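The Theta sandwich c1 g(n) ≤ f(n) ≤ c2 g(n) with c1 = 1, c2 = 13, n0 = 1 can be spot-checked (illustrative only):

```python
def f(n):
    return n * n + 5 * n + 7    # f(n) = n^2 + 5n + 7

def g(n):
    return n * n                # g(n) = n^2

c1, c2, n0 = 1, 13, 1
# Definition of Big Theta: c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
```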
Big Θ
Definition:
f(n) ε Θ(g(n)) iff there are three positive constants c1, c2, and n0
such that
0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0

• We say that
f(n) is Theta of g(n)
Set of all functions that have the same rate of growth as g(n).

• As n increases, f(n) grows at the same rate as g(n).

In other words, g(n) is an asymptotically tight bound on f(n).
Example # 2 - Big Θ

½ n2 + 3 n = Θ(n2)

f(n) = ½ n2 + 3 n
g(n) = n2

½ n2 + 3 n ≤ ½ n2 + 3 n2
½ n2 + 3 n ≤ 7/2 n2 , holds when n ≥ n0 , with n0 = 1 and c2 = 7/2
f(n) = O(g(n)) i.e., ½ n2 + 3 n = O(n2)
and
½ n2 ≤ ½ n2 + 3 n ,
½ n2 + 3 n ≥ ½ n2 , holds when n ≥ n0 , with n0 = 0 and c1 = ½
f(n) = Ω(g(n)) i.e., ½ n2 + 3 n = Ω(n2)

=> ½ n2 ≤ ½ n2 + 3 n ≤ 7/2 n2 ,
holds when n ≥ n0 , with n0 = 1 , c1 = ½ and c2 = 7/2

Thus ½ n2 + 3 n = Θ(n2)
Theorem
Let f(x) = an xn + an-1 xn-1 + an-2 xn-2 + . . . . . . . + a1 x1 + a0 , where the
coefficients a0 , a1 , . . . , an are positive real numbers. Then:

an xn + an-1 xn-1 + . . . . . + a1 x1 + a0 ≤ an xn + an-1 xn + . . . . . + a1 xn + a0 xn
                                          ≤ (an + an-1 + . . . . . + a1 + a0 ) xn
holds for x ≥ n0 , with n0 = 1 and c2 = (an + an-1 + . . . . . + a1 + a0 )
=>
f(x) = O( xn ) or an xn + an-1 xn-1 + . . . . . + a1 x1 + a0 = O( xn )
and
an xn + an-1 xn-1 + . . . . . + a1 x1 + a0 ≥ an xn
holds for x ≥ n0 , with n0 = 1 and c1 = an
=>
f(x) = Ω( xn ) or an xn + an-1 xn-1 + . . . . . + a1 x1 + a0 = Ω( xn )

As f(x) = O( xn ) and f(x) = Ω( xn ),
thus an xn + an-1 xn-1 + . . . . . + a1 x1 + a0 = Θ( xn )
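The theorem can be spot-checked on one concrete polynomial with positive coefficients (a hypothetical example, not from the slides): f(x) = 2x^3 + 5x^2 + x + 7, with c1 the leading coefficient and c2 the sum of all coefficients:

```python
def f(x):
    return 2 * x**3 + 5 * x**2 + x + 7   # all coefficients positive

c1 = 2                  # leading coefficient a_n
c2 = 2 + 5 + 1 + 7      # sum of the coefficients = 15
# Theta sandwich from the theorem: c1*x^n <= f(x) <= c2*x^n for x >= 1.
assert all(c1 * x**3 <= f(x) <= c2 * x**3 for x in range(1, 1000))
```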


BIG-O ESTIMATES INVOLVING LOGARITHMS

Give a big-O estimate for
f(n) = 3n log(n!) + (n2 + 3) log n, where n is a positive integer.

Theorem 2: Let f1(n) = O(g1(n)) and f2(n) = O(g2(n)). Then
(f1 + f2)(n) = O( max( |g1(n)|, |g2(n)| ) )
Theorem 3: Let f1(n) = O(g1(n)) and f2(n) = O(g2(n)). Then
(f1 . f2)(n) = O( g1(n) . g2(n) )

Solution:
Part 1: 3n log(n!)
3n = O(n) and
log(n!) = O(n log n)
Theorem 3 gives: 3n log(n!) = O(n . n log n) = O(n2 log n)

Part 2: (n2 + 3) log n
n2 + 3 < 2 n2 is true with c = 2 and n0 = 3 and for all n ≥ n0
n2 + 3 = O(n2)
Theorem 3 gives:
(n2 + 3) log n = O(n2 log n)

Theorem 2 gives:
3n log(n!) + (n2 + 3) log n = O(max(n2 log n, n2 log n)) = O(n2 log n)
BIG-O ESTIMATES INVOLVING LOGARITHMS

Give a big-O estimate for
f(n) = (n + 1) log(n2 + 1) + 3n2, where n is a positive integer.

Solution:
Part 1: (n + 1) log(n2 + 1)
n + 1 = O(n) and
n2 + 1 ≤ 2 n2 is true with c = 2 and n0 = 1 and for all n ≥ n0
log(n2 + 1) ≤ log(2 n2)
log(n2 + 1) ≤ log 2 + log n2
log(n2 + 1) ≤ 3 log n is true with c = 3 and n0 = 3 and for all n ≥ n0
log(n2 + 1) = O(log n)
Theorem 3 gives: (n + 1) log(n2 + 1) = O(n log n)

Part 2: 3n2
3n2 = O(n2)

Theorem 2 gives:
(n + 1) log(n2 + 1) + 3n2 = O(max(n log n, n2)) = O(n2)
(as n log n ≤ 1 . n2 is true with c = 1 and n0 = 2)

=> (n + 1) log(n2 + 1) + 3n2 = O(n2)
BIG-O ESTIMATES INVOLVING LOGARITHMS

Show that 3n2 + 8n log n is Θ(n2).

Solution:
Since 8 n log n ≤ 8 n2 ,
it follows that
3n2 + 8n log n ≤ 11n2 for n > 1.

=> 3n2 + 8n log n = O(n2)

Clearly,
n2 = O(3n2 + 8n log n)

Consequently,
3n2 + 8n log n = Θ(n2).
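The full sandwich for this last example can be spot-checked (using log base 2, so that 8n log n ≤ 8n2 holds for n ≥ 2; any base gives the same Theta class):

```python
import math

# Illustrative check of n^2 <= 3n^2 + 8n*log2(n) <= 11n^2 for n >= 2.
def f(n):
    return 3 * n * n + 8 * n * math.log2(n)

assert all(n * n <= f(n) <= 11 * n * n for n in range(2, 1000))
```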
