Time Complexity of Algorithms

(Asymptotic Notations)
What is Complexity?
• The level of difficulty in solving mathematically
posed problems, as measured by
– the time required
(time complexity)
– the number of steps or arithmetic operations
(computational complexity)
– the memory space required
(space complexity)
Major Factors in Algorithm Design
1. Correctness
An algorithm is said to be correct if
• for every input, it halts with the correct output.
• An incorrect algorithm might not halt at all, OR
• it might halt with an answer other than the desired one.
• A correct algorithm solves the given computational problem.
2. Algorithm Efficiency
To measure the efficiency of an algorithm,
• analyze it, i.e. determine its growth rate.
• Compare the efficiencies of different algorithms for the
same problem.
Algorithm Growth Rates
• Growth rate measures algorithm efficiency
What does "efficient" mean?
▪ An algorithm is efficient if its running time is bounded by a
polynomial in the input size
Notations for Asymptotic Performance
• How running time increases with input size
• O, Ω, Θ, etc. denote asymptotic running time
• These notations are defined in terms of functions whose
domains are the natural numbers
• Convenient for describing worst-case running time
• Asymptotically efficient algorithms are usually the best choice
Complexity Analysis
• Algorithm analysis means predicting the resources required, such as
– computational time
– memory
– computer hardware, etc.
• Worst case analysis
– Provides an upper bound on running time
– An absolute guarantee
• Average case analysis
– Provides the expected running time
– Very useful, but treat with care: what is “average”?
• Random (equally likely) inputs
• Real-life inputs
Worst-case Analysis
Let us suppose that
• Dn = the set of inputs of size n for the problem
• I = an element of Dn
• t(I) = the number of basic operations performed on I
• Define a function W by
W(n) = max{ t(I) | I ∈ Dn }
called the worst-case complexity of the algorithm.
• W(n) is the maximum number of basic operations
performed by the algorithm on any input of size n.
• Note that the input I for which an algorithm
behaves worst depends on the particular algorithm.
Average Complexity
• Let Pr(I) be the probability that input I occurs.
• Then the average behavior of the algorithm is defined as
A(n) = Σ Pr(I)·t(I), summed over all I ∈ Dn
• We determine t(I) by analyzing the algorithm, but Pr(I)
cannot be determined analytically.
• Average cost =
A(n) = Pr(succ)·Asucc(n) + Pr(fail)·Afail(n)
• An element I in Dn may be thought of as a set or
equivalence class of inputs that affect the behavior of the
algorithm in the same way.
Computing Average Cost by Brute Force
• Take all possible inputs, compute their cost, and take the average.
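To make W(n) and A(n) concrete, here is a small illustrative sketch (not part of the original slides) for linear search: an input class I is "key at position i" or "key absent", each assumed equally likely; t(I) is the number of comparisons; W(n) is the maximum and A(n) the average. The function names are my own.

def linear_search_cost(n, key_pos):
    # t(I): comparisons made when the key is at key_pos (0-based),
    # or key_pos == n meaning "key not present".
    return key_pos + 1 if key_pos < n else n

def worst_case(n):
    # W(n) = max{ t(I) | I in Dn }
    return max(linear_search_cost(n, p) for p in range(n + 1))

def average_case(n):
    # A(n) = sum over I of Pr(I)*t(I), with Pr(I) = 1/(n+1) (uniform assumption)
    return sum(linear_search_cost(n, p) for p in range(n + 1)) / (n + 1)

for n in (5, 10, 100):
    print(n, worst_case(n), round(average_case(n), 2))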
Asymptotic Notations Properties
• Categorize algorithms based on asymptotic growth
rate e.g. linear, quadratic, polynomial, exponential
• Ignore small constants and small inputs
• Estimate upper and lower bounds on the growth
rate of the time complexity function
• Describe running time of the algorithm as n grows to ∞.
• Describe the behavior of the function in the limit.
Limitations
• not always useful for analysis on fixed-size inputs.
• All results are for sufficiently large inputs.
Asymptotic Notations
Asymptotic Notations: Θ, O, Ω, o, ω
▪ We use Θ to mean "order exactly",
▪ O to mean "order at most",
▪ Ω to mean "order at least",
▪ o to mean an upper bound that is not tight,
▪ ω to mean a lower bound that is not tight.

Each notation defines a set of functions, which in practice is used to
compare the sizes of two functions.
Big-Oh Notation (O)
If f, g: N → R+, then we can define Big-Oh as follows.

For a given function g(n) ≥ 0, O(g(n)) denotes the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that
            0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

f(n) = O(g(n)) means that g(n) is an asymptotic upper bound for f(n).

We may write f(n) = O(g(n)) OR f(n) ∈ O(g(n))

Intuitively:
Set of all functions whose rate of growth is the same as or lower
than that of g(n).
Big-Oh Notation

f(n) ∈ O(g(n))
⇔ ∃ c > 0, ∃ n0 ≥ 0 such that ∀ n ≥ n0, 0 ≤ f(n) ≤ c·g(n)

g(n) is an asymptotic upper bound for f(n).
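As a sanity check (not a proof), a claimed witness pair (c, n0) can be tested numerically over a finite range, since the definition only demands f(n) ≤ c·g(n) for every n ≥ n0. A minimal sketch, assuming Python; the helper name holds_big_oh is hypothetical:

def holds_big_oh(f, g, c, n0, n_max=10_000):
    # Check 0 <= f(n) <= c*g(n) for all n0 <= n <= n_max.
    # Numerical evidence only; it cannot replace the proof.
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Example 1 below: f(n) = 2n^2, g(n) = n^3 with c = 1, n0 = 2
print(holds_big_oh(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2))   # True
# A wrong witness is rejected: c = 1, n0 = 1 fails at n = 1 (2 > 1)
print(holds_big_oh(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=1))   # False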
Examples
Example 1: Prove that 2n² ∈ O(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) ∈ O(g(n))?
Now we have to find the existence of c and n0 such that
f(n) ≤ c·g(n)  ⇒  2n² ≤ c·n³  ⇒  2 ≤ c·n
If we take c = 1 and n0 = 2, OR
c = 2 and n0 = 1, then
2n² ≤ c·n³ for all n ≥ n0
Hence f(n) ∈ O(g(n)), with c = 1 and n0 = 2
Examples
Example 2: Prove that n² ∈ O(n²)
Proof:
Assume that f(n) = n², and g(n) = n²
Now we have to show that f(n) ∈ O(g(n))
Since
f(n) ≤ c·g(n)  ⇒  n² ≤ c·n²  ⇒  1 ≤ c, take c = 1 and n0 = 1
Then
n² ≤ c·n² for c = 1 and all n ≥ 1
Hence, n² ∈ O(n²), where c = 1 and n0 = 1
Examples
Example 3: Prove that 1000·n² + 1000·n ∈ O(n²)
Proof:
Assume that f(n) = 1000·n² + 1000·n, and g(n) = n²
We have to find the existence of c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
Try c = 1001: 1000·n² + 1000·n ≤ 1001·n²
⇔ 1000·n ≤ n²  ⇔  n² − 1000·n ≥ 0  ⇔  n·(n − 1000) ≥ 0,
which is true for n ≥ 1000
So f(n) ≤ c·g(n) for all n ≥ n0, with c = 1001 and n0 = 1000
Hence f(n) ∈ O(g(n)) for c = 1001 and n0 = 1000
Examples
Example 4: Prove that n³ ∉ O(n²)
Proof:
On the contrary, assume that there exist some
positive constants c and n0 such that
0 ≤ n³ ≤ c·n² for all n ≥ n0
Then n³ ≤ c·n²  ⇒  n ≤ c
Since c is a fixed constant while n grows without bound,
n ≤ c cannot hold for all n ≥ n0.
Hence our supposition is wrong: n³ ≤ c·n² for all n ≥ n0
is not true for any combination of c and n0.
And hence, n³ ∉ O(n²)
More Examples
1. n² + n³ = O(n⁴)
2. n² / log(n) = O(n · log n)
3. 5n + log(n) = O(n)
4. n log n = O(n^100)
5. 3^n = O(2^n · n^100)
6. n! = O(3^n)
7. n + 1 = O(n)
8. 2^(n+1) = O(2^n)
9. (n+1)! = O(n!)
10. 1 + c + c² + … + c^n = O(c^n) for c > 1
11. 1 + c + c² + … + c^n = O(1) for c < 1
Big-Omega Notation (Ω)
If f, g: N → R+, then we can define Big-Omega as follows.
For a given function g(n), Ω(g(n)) denotes the set of functions

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
            0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

f(n) = Ω(g(n)) means that g(n) is an asymptotic lower bound for f(n).

We may write f(n) = Ω(g(n)) OR f(n) ∈ Ω(g(n))

Intuitively:
Set of all functions whose rate of growth is the same as or higher
than that of g(n).
Big-Omega Notation

f(n) ∈ Ω(g(n))
⇔ ∃ c > 0, ∃ n0 ≥ 0 such that ∀ n ≥ n0, f(n) ≥ c·g(n)

g(n) is an asymptotic lower bound for f(n).
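The same kind of finite-range spot-check works for a lower-bound witness, and a single failing n ≥ n0 is enough to reject a claimed (c, n0). An illustrative sketch (the helper name is my own), applied to Example 1 and Example 3 below:

def holds_big_omega(f, g, c, n0, n_max=10_000):
    # Check 0 <= c*g(n) <= f(n) for all n0 <= n <= n_max.
    # Evidence over a finite range only, never a substitute for the proof.
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))

# Example 1 below: 5n^2 >= 5n for all n >= 1, so (c, n0) = (5, 1) works
print(holds_big_omega(lambda n: 5 * n**2, lambda n: n, c=5, n0=1))        # True
# Example 3 below: 100n + 5 is NOT Omega(n^2); any fixed c fails eventually
print(holds_big_omega(lambda n: 100 * n + 5, lambda n: n**2, c=1, n0=1))  # False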
Examples
Example 1: Prove that 5·n² ∈ Ω(n)
Proof:
Assume that f(n) = 5·n², and g(n) = n
f(n) ∈ Ω(g(n))?
We have to find the existence of c and n0 s.t.
c·g(n) ≤ f(n) for all n ≥ n0
c·n ≤ 5·n²  ⇐  c ≤ 5·n
If we take c = 5 and n0 = 1, then
c·n ≤ 5·n² for all n ≥ n0
And hence f(n) ∈ Ω(g(n)), for c = 5 and n0 = 1
Examples
Example 2: Prove that 5·n + 10 ∈ Ω(n)
Proof:
Assume that f(n) = 5·n + 10, and g(n) = n
f(n) ∈ Ω(g(n))?
We have to find the existence of c and n0 s.t.
c·g(n) ≤ f(n) for all n ≥ n0
c·n ≤ 5·n + 10: this holds for c = 5, since 5·n ≤ 5·n + 10
If we take c = 5 and n0 = 1, then
c·n ≤ 5·n + 10 for all n ≥ n0
And hence f(n) ∈ Ω(g(n)), for c = 5 and n0 = 1
Examples
Example 3: Prove that 100·n + 5 ∉ Ω(n²)
Proof:
Let f(n) = 100·n + 5, and g(n) = n²
Assume, on the contrary, that f(n) ∈ Ω(g(n)).
Then there exist c and n0 s.t.
c·g(n) ≤ f(n) for all n ≥ n0  ⇒
c·n² ≤ 100·n + 5  ⇒  c·n ≤ 100 + 5/n ≤ 105 (for n ≥ 1)  ⇒
n ≤ 105/c, which cannot hold for arbitrarily large n

And hence f(n) ∉ Ω(g(n))
Theta Notation (Θ)

If f, g: N → R+, then we can define Big-Theta as follows.

For a given function g(n), Θ(g(n)) denotes the set of functions

Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that
            0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

f(n) = Θ(g(n)) means that f(n) is equal to g(n) to within a constant
factor, and g(n) is an asymptotically tight bound for f(n).

We may write f(n) = Θ(g(n)) OR f(n) ∈ Θ(g(n))

Intuitively: Set of all functions that have the same rate of growth as g(n).
Theta Notation

f(n) ∈ Θ(g(n))
⇔ ∃ c1 > 0, ∃ c2 > 0, ∃ n0 ≥ 0 such that ∀ n ≥ n0, c1·g(n) ≤ f(n) ≤ c2·g(n)

We say that g(n) is an asymptotically tight bound for f(n).
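A Θ-claim needs two constants at once, so a rough numerical check of the sandwich c1·g(n) ≤ f(n) ≤ c2·g(n) can catch a badly chosen constant before the proof is attempted. A minimal sketch, assuming Python (the helper name is hypothetical), using the constants of Example 1 below:

def holds_theta(f, g, c1, c2, n0, n_max=10_000):
    # Check c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= n_max.
    # Numerical evidence only; the proof still has to be done by hand.
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

f = lambda n: 0.5 * n**2 - 0.5 * n   # f(n) = (1/2)n^2 - (1/2)n
g = lambda n: n**2

# Example 1 below: c1 = 1/4, c2 = 1/2, n0 = 2
print(holds_theta(f, g, c1=0.25, c2=0.5, n0=2))   # True
# Too greedy a lower constant fails, e.g. c1 = 1/2 already at n = 2
print(holds_theta(f, g, c1=0.5, c2=0.5, n0=2))    # False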
Theta Notation
Example 1: Prove that ½·n² − ½·n = Θ(n²)
Proof:
Assume that f(n) = ½·n² − ½·n, and g(n) = n²
f(n) ∈ Θ(g(n))?
We have to find the existence of c1, c2 and n0 s.t.
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
Upper bound: ½·n² − ½·n ≤ ½·n² for all n ≥ 0, so take c2 = ½
Lower bound: for n ≥ 2 we have ½·n ≤ ¼·n², so
½·n² − ½·n ≥ ½·n² − ¼·n² = ¼·n², so take c1 = ¼
Hence ¼·n² ≤ ½·n² − ½·n ≤ ½·n²
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ 2, with c1 = ¼ and c2 = ½
Hence f(n) ∈ Θ(g(n))  ⇒  ½·n² − ½·n = Θ(n²)
Theta Notation
Example 2: Prove that a·n² + b·n + c = Θ(n²), where a, b, c
are constants and a > 0
Proof:
If we take c1 = ¼·a, c2 = 7/4·a and
n0 = 2·max(|b|/a, √(|c|/a)),
then it can be verified that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0, with c1 = ¼·a and c2 = 7/4·a
Hence f(n) ∈ Θ(g(n))  ⇒  a·n² + b·n + c = Θ(n²)
Hence any polynomial of degree 2 (with a > 0) is of order Θ(n²)
Theta Notation
Example 3: Prove that 2·n² + 3·n + 6 ∉ Θ(n³)
Proof: Let f(n) = 2·n² + 3·n + 6, and g(n) = n³
We have to show that f(n) ∉ Θ(g(n)).
On the contrary, assume that f(n) ∈ Θ(g(n)), i.e.
there exist some positive constants c1, c2 and n0
such that: c1·g(n) ≤ f(n) ≤ c2·g(n)
c1·g(n) ≤ f(n) ≤ c2·g(n)  ⇒  c1·n³ ≤ 2·n² + 3·n + 6 ≤ c2·n³  ⇒
c1·n ≤ 2 + 3/n + 6/n² ≤ c2·n  ⇒
c1·n ≤ 11 for all n ≥ 1, i.e. n ≤ 11/c1, which cannot hold for arbitrarily large n
Hence f(n) ∉ Θ(g(n))  ⇒  2·n² + 3·n + 6 ∉ Θ(n³)
Little-Oh Notation (o)
o-notation is used to denote an upper bound that is not
asymptotically tight.
For a given function g(n) ≥ 0, o(g(n)) denotes the set of functions

o(g(n)) = { f(n) : for any positive constant c, there exists a constant n0
            such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }

f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim (n→∞) f(n)/g(n) = 0

e.g., 2n = o(n²) but 2n² ≠ o(n²)

g(n) is an upper bound for f(n) that is not asymptotically tight.
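The limit characterization suggests a quick empirical check: print f(n)/g(n) for increasing n and see whether the ratio trends toward 0 or settles at a non-zero constant. A small illustrative sketch (numerical evidence only, not a proof), using the two cases from the examples below:

def ratio_trend(f, g, ns=(10, 100, 1_000, 10_000, 100_000)):
    # Sample the ratio f(n)/g(n) at increasing n.
    return [f(n) / g(n) for n in ns]

# 2n^2 vs n^3: ratio 2/n -> 0, consistent with 2n^2 in o(n^3) (Example 1 below)
print(ratio_trend(lambda n: 2 * n**2, lambda n: n**3))
# n^2 vs n^2: ratio stays at 1, consistent with n^2 NOT in o(n^2) (Example 2 below)
print(ratio_trend(lambda n: n**2, lambda n: n**2))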
Examples
Example 1: Prove that 2n² ∈ o(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) ∈ o(g(n))?
Now we have to find, for any c > 0, an n0 such that
f(n) < c·g(n) for all n ≥ n0
2n² < c·n³  ⇔  2 < c·n  ⇔  n > 2/c
So for any arbitrary c we can choose n0 > 2/c, and the
above inequality holds for all n ≥ n0.
Hence f(n) ∈ o(g(n))
Examples
Example 2: Prove that n² ∉ o(n²)
Proof:
Assume that f(n) = n², and g(n) = n²
Now we have to show that f(n) ∉ o(g(n))
Since
f(n) < c·g(n)  ⇒  n² < c·n²  ⇒  1 < c,
the definition of little-o requires the inequality to hold for
every positive c, but here it fails whenever c ≤ 1.
Hence, n² ∉ o(n²)
Examples
Example 3: Prove that 1000·n² + 1000·n ∉ o(n²)
Proof:
Assume that f(n) = 1000·n² + 1000·n, and g(n) = n²
We have to show that f(n) ∉ o(g(n)), i.e.
it is not the case that for every c there exists an n0 such that
0 ≤ f(n) < c·g(n) for all n ≥ n0
Take c = 1000: 1000·n² + 1000·n < 1000·n²
would require 1000·n < 0, which is false for every n ≥ 1
Hence no n0 works for c = 1000, and so f(n) ∉ o(g(n))
Little-Omega Notation (ω)

Little-ω notation is used to denote a lower bound
that is not asymptotically tight.
For a given function g(n), ω(g(n)) denotes the set of functions

ω(g(n)) = { f(n) : for any positive constant c, there exists a constant n0 such that
            0 ≤ c·g(n) < f(n) for all n ≥ n0 }

f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:

lim (n→∞) f(n)/g(n) = ∞

e.g., n²/2 = ω(n) but n²/2 ≠ ω(n²)
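For ω the roles are reversed: for every c there has to be an n0, and that n0 generally grows with c. The sketch below (my own construction, anticipating Example 1 that follows) computes a working n0 = ⌊c/5⌋ + 1 for f(n) = 5·n² and g(n) = n and verifies the strict inequality over a finite range:

def n0_for(c):
    # For 5n^2 vs n, c*n < 5*n^2 holds exactly when n > c/5.
    return c // 5 + 1

def check(c, extra=1_000):
    # Verify c*g(n) < f(n) for n0 <= n < n0 + extra (finite evidence only).
    n0 = n0_for(c)
    return all(c * n < 5 * n**2 for n in range(n0, n0 + extra))

for c in (1, 100, 1_000_000):
    print(c, n0_for(c), check(c))   # a working n0 exists for every c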
Examples
Example 1: Prove that 5·n² ∈ ω(n)
Proof:
Assume that f(n) = 5·n², and g(n) = n
f(n) ∈ ω(g(n))?
We have to prove that for any c there exists an n0 s.t.
c·g(n) < f(n) for all n ≥ n0
c·n < 5·n²  ⇔  c < 5·n  ⇔  n > c/5
This is true for any c, because for any arbitrary c,
e.g. c = 1,000,000, we can choose any n0 > c/5
(say n0 = 200,001) and the above inequality holds.
And hence f(n) ∈ ω(g(n))
Examples
Example 2: Prove that 5·n + 10 ∉ ω(n)
Proof:
Assume that f(n) = 5·n + 10, and g(n) = n
f(n) ∈ ω(g(n))?
We would have to find, for any c, an n0 such that
c·g(n) < f(n) for all n ≥ n0
c·n < 5·n + 10: if we take c = 16, then
16·n < 5·n + 10  ⇒  11·n < 10, which is not true for any
positive integer n.
Hence no n0 exists for c = 16, and so f(n) ∉ ω(g(n))
Examples
Example 3: Prove that 100·n ∉ ω(n²)
Proof:
Let f(n) = 100·n, and g(n) = n²
Assume, on the contrary, that f(n) ∈ ω(g(n)).
Then for any c there would exist an n0 s.t.
c·g(n) < f(n) for all n ≥ n0, i.e.
c·n² < 100·n  ⇒  c·n < 100  ⇒  n < 100/c
If we take c = 100, this requires n < 1, which cannot hold for all n ≥ n0.
Hence f(n) ∉ ω(g(n)), i.e. 100·n ∉ ω(n²)
Usefulness of Notations
• It is not always possible to determine behaviour of
an algorithm using Θ-notation.
• For example, given a problem with n inputs, we may
have an algorithm that solves it in a·n² time when n is
even and c·n time when n is odd. OR
• We may prove that an algorithm never uses more
than e·n² time and never less than f·n time.
• In either case we can claim neither Θ(n) nor Θ(n²) to
be the order of the time usage of the algorithm.
• Big-O and Ω notation will allow us to give at least
partial information.