
Lecture 2


Advanced Analysis of Algorithms

Asymptotic notation
Readings: Cormen (Section 3.1); Levitin (Sections 2.1, 2.2)
What is Complexity?
• The level of difficulty in solving mathematically posed problems, as measured by:
  – the time required (time complexity)
  – the number of steps or arithmetic operations (computational complexity)
  – the memory space required (space complexity)
Major Factors in Algorithm Design
1. Correctness
An algorithm is said to be correct if, for every input, it halts with the correct output.
• An incorrect algorithm might not halt at all, OR
• it might halt with an answer other than the desired one.
• A correct algorithm solves a computational problem.
2. Algorithm Efficiency
To measure the efficiency of an algorithm:
• analyze it, i.e., determine its growth rate;
• compare the efficiencies of different algorithms for the same problem.
Algorithm Growth Rates
• Growth rate measures algorithm efficiency.
What do we mean by efficient?
• An algorithm is considered efficient if its running time is bounded by a polynomial in the input size.
Notations for Asymptotic Performance
• Describe how running time increases with input size.
• O, Omega, Theta, etc. denote asymptotic running time.
• These notations are defined in terms of functions whose domains are the natural numbers.
• They are convenient for describing worst-case running time.
• Asymptotically efficient algorithms are usually the best choice.
Complexity Analysis
• Algorithm analysis means predicting resources such as
  – computational time
  – memory
  – computer hardware, etc.
• Worst-case analysis
  – Provides an upper bound on running time
  – An absolute guarantee
• Average-case analysis
  – Provides the expected running time
  – Very useful, but treat with care: what is "average"?
    o Random (equally likely) inputs
    o Real-life inputs
Worst-case Analysis
Let us suppose that
• Dn = the set of inputs of size n for the problem
• I = an element of Dn
• t(I) = the number of basic operations performed on input I
• Define a function W by
  W(n) = max{ t(I) | I ∈ Dn }
called the worst-case complexity of the algorithm.
• W(n) is the maximum number of basic operations performed by the algorithm on any input of size n.
• Note that the input I on which an algorithm behaves worst depends on the particular algorithm.
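To make the definition concrete, here is a minimal sketch (an illustration, not part of the lecture) that computes W(n) by brute force for linear search, assuming the basic operation is an element comparison and taking Dn to be all 0/1 arrays of size n with key 1:

from itertools import product

def linear_search_cost(arr, key):
    # t(I): number of comparisons linear search performs on input arr
    cost = 0
    for x in arr:
        cost += 1
        if x == key:
            break
    return cost

def W(n):
    # W(n) = max{ t(I) | I in Dn }, with Dn = all 0/1 arrays of size n (assumed)
    return max(linear_search_cost(I, 1) for I in product((0, 1), repeat=n))

print([W(n) for n in range(1, 6)])  # [1, 2, 3, 4, 5], i.e. W(n) = n

The worst input here is the all-zero array, which forces a full scan; as noted above, which input is worst depends on the particular algorithm.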
Average Complexity
• Let Pr(I) be the probability that input I occurs.
• Then the average behavior of the algorithm is defined as
  A(n) = Σ Pr(I)·t(I), summed over all I ∈ Dn.
• We determine t(I) by analyzing the algorithm, but Pr(I) cannot be computed analytically; it must be supplied or assumed.
• Splitting inputs into successful and failing cases, the average cost is
  A(n) = Pr(succ)·Asucc(n) + Pr(fail)·Afail(n)
• An element I in Dn may be thought of as a set, or equivalence class, of inputs that affect the behavior of the algorithm in the same way.
Computing Average Cost
• Take all possible inputs, compute the cost of each, and take the average.
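In the same toy setting as before (a sketch assuming equally likely 0/1 inputs and counting linear-search comparisons), A(n) = Σ Pr(I)·t(I) can be computed exhaustively:

from itertools import product

def cost(arr, key=1):
    # t(I): comparisons made by linear search for key in arr
    c = 0
    for x in arr:
        c += 1
        if x == key:
            break
    return c

def A(n):
    # A(n) = sum over I in Dn of Pr(I) * t(I), with Pr uniform over Dn
    inputs = list(product((0, 1), repeat=n))
    return sum(cost(I) for I in inputs) / len(inputs)

print([round(A(n), 3) for n in range(1, 6)])  # 1.0, 1.5, 1.75, ... approaches 2

While W(n) grows linearly here, A(n) stays below 2, because a random 0/1 array usually contains the key near the front.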
Asymptotic Notations Properties
• Categorize algorithms by asymptotic growth rate, e.g. linear, quadratic, polynomial, exponential.
• Ignore small constants and small inputs.
• Estimate upper and lower bounds on the growth rate of the time-complexity function.
• Describe the running time of an algorithm as n grows to ∞.
• Describe the behavior of the function in the limit.
Limitations
• Not always useful for analysis of fixed-size inputs.
• All results hold only for sufficiently large inputs.
Asymptotic Notations
Asymptotic notations: Θ, O, Ω, o, ω
• We use Θ to mean "order exactly",
• O to mean "order at most",
• Ω to mean "order at least",
• o to mean "order strictly less than" (an upper bound that is not tight),
• ω to mean "order strictly greater than" (a lower bound that is not tight).
Each notation defines a set of functions, which in practice is used to compare the growth of two functions.
Big-Oh Notation (O)
If f, g: N → R⁺, then we can define Big-Oh as follows.
For a given function g(n) ≥ 0, O(g(n)) denotes the set of functions
  O(g(n)) = { f(n) : there exist positive constants c and n0 such that
              0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
f(n) = O(g(n)) means that g(n) is an asymptotic upper bound for f(n).
We may write f(n) = O(g(n)) OR f(n) ∈ O(g(n)).
Intuitively:
O(g(n)) is the set of all functions whose rate of growth is the same as or lower than that of g(n).
Big-Oh Notation
f(n) ∈ O(g(n))
  ⇔ ∃ c > 0 and n0 ≥ 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
g(n) is an asymptotic upper bound for f(n).
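The definition translates directly into a finite check. The sketch below (an illustration, not from the slides) tests a candidate pair (c, n0) over a range of n; passing the test is supporting evidence, not a proof, since only finitely many n can be tried:

def holds_big_oh(f, g, c, n0, upto=10_000):
    # check 0 <= f(n) <= c*g(n) for n0 <= n <= upto
    return all(0 <= f(n) <= c * g(n) for n in range(n0, upto + 1))

# Example 1 below: 2n^2 in O(n^3) with the witnesses c = 1, n0 = 2
print(holds_big_oh(lambda n: 2 * n * n, lambda n: n ** 3, c=1, n0=2))  # True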
Examples
Example 1: Prove that 2n² ∈ O(n³)
Proof:
Assume that f(n) = 2n² and g(n) = n³.
Is f(n) ∈ O(g(n))? We have to find constants c and n0 such that
  f(n) ≤ c·g(n)  ⇔  2n² ≤ c·n³  ⇔  2 ≤ c·n
If we take c = 1 and n0 = 2, OR c = 2 and n0 = 1, then
  2n² ≤ c·n³ for all n ≥ n0
Hence f(n) ∈ O(g(n)), e.g. with c = 1 and n0 = 2.
Examples
Example 2: Prove that n² ∈ O(n²)
Proof:
Assume that f(n) = n² and g(n) = n².
We have to show that f(n) ∈ O(g(n)).
Since
  f(n) ≤ c·g(n)  ⇔  n² ≤ c·n²  ⇔  1 ≤ c,
take c = 1 and n0 = 1. Then
  n² ≤ c·n² for c = 1 and all n ≥ 1.
Hence n² ∈ O(n²), where c = 1 and n0 = 1.
Examples
Example 3: Prove that 1000n² + 1000n ∈ O(n²)
Proof:
Assume that f(n) = 1000n² + 1000n and g(n) = n².
We have to find constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n0.
Try c = 1001:
  1000n² + 1000n ≤ 1001n²
  ⇔ 1000n ≤ n²  ⇔  n² − 1000n ≥ 0  ⇔  n(n − 1000) ≥ 0,
which is true for n ≥ 1000. So
  f(n) ≤ c·g(n)  ∀ n ≥ n0 with c = 1001 and n0 = 1000.
Hence f(n) ∈ O(g(n)) for c = 1001 and n0 = 1000.
Examples
Example 4: Prove that n³ ∉ O(n²)
Proof:
On the contrary, assume that there exist positive constants c and n0 such that
  0 ≤ n³ ≤ c·n²  ∀ n ≥ n0
  0 ≤ n³ ≤ c·n²  ⇔  n ≤ c
Since c is a fixed constant while n grows without bound, n ≤ c cannot hold for all n ≥ n0.
Hence our supposition is wrong: n³ ≤ c·n² ∀ n ≥ n0 is not true for any combination of c and n0, and therefore n³ ∉ O(n²).
Some More Examples
1. n² + n³ = O(n⁴)
2. n²/log n = O(n²)
3. 5n + log n = O(n)
4. n·log n = O(n¹⁰⁰)
5. 2ⁿ·n¹⁰⁰ = O(3ⁿ)
6. 3ⁿ = O(n!)
7. n + 1 = O(n)
8. 2ⁿ⁺¹ = O(2ⁿ)
9. n! = O((n+1)!)
10. 1 + c + c² + … + cⁿ = O(cⁿ) for c > 1
11. 1 + c + c² + … + cⁿ = O(1) for 0 < c < 1
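One informal way to test such claims (a heuristic sketch, not a proof): if f(n) = O(g(n)), the ratio f(n)/g(n) must stay bounded as n grows, so a steadily growing ratio refutes the claim.

import math

claims = [
    ("5n + log n  vs  n",       lambda n: 5 * n + math.log(n), lambda n: n),
    ("2^(n+1)     vs  2^n",     lambda n: 2 ** (n + 1),        lambda n: 2 ** n),
    ("n^2/log n   vs  n^2",     lambda n: n * n / math.log(n), lambda n: n * n),
    ("n^2/log n   vs  n log n", lambda n: n * n / math.log(n), lambda n: n * math.log(n)),
]
for name, f, g in claims:
    # bounded ratios support f = O(g); growing ratios refute it
    print(name, [round(f(n) / g(n), 3) for n in (10, 100, 1000)])

The last line prints a growing ratio: n²/log n is O(n²) but not O(n·log n).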
Big-Omega Notation (Ω)
If f, g: N → R⁺, then we can define Big-Omega as follows.
For a given function g(n), Ω(g(n)) denotes the set of functions
  Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
              0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
f(n) = Ω(g(n)) means that g(n) is an asymptotic lower bound for f(n).
We may write f(n) = Ω(g(n)) OR f(n) ∈ Ω(g(n)).
Intuitively:
Ω(g(n)) is the set of all functions whose rate of growth is the same as or higher than that of g(n).
Big-Omega Notation
f(n) ∈ Ω(g(n))
  ⇔ ∃ c > 0 and n0 ≥ 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0
g(n) is an asymptotic lower bound for f(n).
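As with Big-Oh, an Ω claim can be sanity-checked numerically for candidate witnesses (an illustrative sketch; finitely many n only):

def holds_big_omega(f, g, c, n0, upto=10_000):
    # check 0 <= c*g(n) <= f(n) for n0 <= n <= upto
    return all(0 <= c * g(n) <= f(n) for n in range(n0, upto + 1))

# Example 1 below: 5n^2 in Omega(n) with the witnesses c = 5, n0 = 1
print(holds_big_omega(lambda n: 5 * n * n, lambda n: n, c=5, n0=1))  # True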
Examples
Example 1: Prove that 5n² ∈ Ω(n)
Proof:
Assume that f(n) = 5n² and g(n) = n.
Is f(n) ∈ Ω(g(n))? We have to find constants c and n0 such that
  c·g(n) ≤ f(n)  ∀ n ≥ n0
  c·n ≤ 5n²  ⇔  c ≤ 5n
If we take c = 5 and n0 = 1, then c·n ≤ 5n² ∀ n ≥ n0.
Hence f(n) ∈ Ω(g(n)) for c = 5 and n0 = 1.
Examples
Example 2: Prove that 5n + 10 ∈ Ω(n)
Proof:
Assume that f(n) = 5n + 10 and g(n) = n.
Is f(n) ∈ Ω(g(n))? We have to find constants c and n0 such that
  c·g(n) ≤ f(n)  ∀ n ≥ n0,  i.e.  c·n ≤ 5n + 10.
Since 5n ≤ 5n + 10 for all n ≥ 1, we may take c = 5 and n0 = 1; then
  c·n ≤ f(n)  ∀ n ≥ n0.
Hence f(n) ∈ Ω(g(n)) for c = 5 and n0 = 1.
Examples
Example 3: Prove that 100n + 5 ∉ Ω(n²)
Proof:
Let f(n) = 100n + 5 and g(n) = n².
Assume on the contrary that f(n) ∈ Ω(g(n)). Then there exist c and n0 such that
  c·g(n) ≤ f(n)  ∀ n ≥ n0
  ⇔ c·n² ≤ 100n + 5
  ⇔ c·n ≤ 100 + 5/n
  ⇒ n ≤ (100 + 5/n)/c ≤ 105/c for n ≥ 1,
so n stays bounded, which is impossible since the inequality must hold for arbitrarily large n.
Hence f(n) ∉ Ω(g(n)).
Theta Notation (Θ)
If f, g: N → R⁺, then we can define Big-Theta as follows.
For a given function g(n), Θ(g(n)) denotes the set of functions
  Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that
              0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
f(n) = Θ(g(n)) means that f(n) is equal to g(n) to within a constant factor, and g(n) is an asymptotically tight bound for f(n).
We may write f(n) = Θ(g(n)) OR f(n) ∈ Θ(g(n)).
Intuitively: Θ(g(n)) is the set of all functions that have the same rate of growth as g(n).
Theta Notation
f(n) ∈ Θ(g(n))
  ⇔ ∃ c1 > 0, c2 > 0 and n0 ≥ 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
We say that g(n) is an asymptotically tight bound for f(n).
Theta Notation
Example 1: Prove that ½n² − ½n = Θ(n²)
Proof:
Assume that f(n) = ½n² − ½n and g(n) = n².
Is f(n) ∈ Θ(g(n))? We have to find constants c1, c2 and n0 such that
  c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n0
Upper bound: ½n² − ½n ≤ ½n² for all n ≥ 0, so c2 = ½ works.
Lower bound: ½n² − ½n ≥ ½n² − ½n·½n = ¼n² for all n ≥ 2, so c1 = ¼ works.
Hence ¼n² ≤ ½n² − ½n ≤ ½n², i.e.
  c1·g(n) ≤ f(n) ≤ c2·g(n) for n ≥ 2, with c1 = ¼ and c2 = ½.
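The witnesses found above, c1 = ¼, c2 = ½ and n0 = 2, can be checked numerically (a finite sanity check of the algebra, not a substitute for it):

f = lambda n: 0.5 * n * n - 0.5 * n
g = lambda n: n * n
# check c1*g(n) <= f(n) <= c2*g(n) for n >= n0 = 2
print(all(0.25 * g(n) <= f(n) <= 0.5 * g(n) for n in range(2, 10_000)))  # True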
Theta Notation
Example 2: Prove that 2n² + 3n + 6 ∉ Θ(n³)
Proof: Let f(n) = 2n² + 3n + 6 and g(n) = n³.
We have to show that f(n) ∉ Θ(g(n)).
On the contrary, assume that f(n) ∈ Θ(g(n)), i.e. there exist positive constants c1, c2 and n0 such that
  c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n0
  ⇔ c1·n³ ≤ 2n² + 3n + 6 ≤ c2·n³
  ⇔ c1·n ≤ 2 + 3/n + 6/n² ≤ c2·n
For large n the middle expression approaches 2, so the left inequality gives n ≤ (2 + 3/n + 6/n²)/c1, which bounds n; this cannot hold for arbitrarily large n.
Hence f(n) ∉ Θ(g(n)), i.e. 2n² + 3n + 6 ∉ Θ(n³).
Little-Oh Notation (o)
o-notation is used to denote an upper bound that is not asymptotically tight.
For a given function g(n) ≥ 0, o(g(n)) denotes the set of functions
  o(g(n)) = { f(n) : for every positive constant c, there exists a constant n0 > 0
              such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }
f(n) becomes insignificant relative to g(n) as n approaches infinity:
  lim (n→∞) f(n)/g(n) = 0
e.g., 2n ∈ o(n²), but 2n² ∉ o(n²).
g(n) is an upper bound for f(n) that is not asymptotically tight.
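The limit characterization invites a quick numeric illustration (a sketch, not a proof): when f ∈ o(g), the ratio f(n)/g(n) should tend to 0.

for n in (10, 100, 1000, 10_000):
    # ratios 2n / n^2 and 2n^2 / n^2
    print(n, 2 * n / n**2, (2 * n * n) / n**2)
# first ratio tends to 0:  2n is in o(n^2)
# second ratio stays at 2: 2n^2 is NOT in o(n^2)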
Examples
Example 1: Prove that 2n² ∈ o(n³)
Proof:
Assume that f(n) = 2n² and g(n) = n³.
Is f(n) ∈ o(g(n))? We have to show that for ANY positive constant c there exists an n0 such that
  f(n) < c·g(n)  ∀ n ≥ n0
  2n² < c·n³  ⇔  2 < c·n  ⇔  n > 2/c
This holds for any c: for an arbitrary c we can choose n0 > 2/c, and the inequality then holds for all n ≥ n0.
Hence f(n) ∈ o(g(n)).
Examples
Example 2: Prove that n² ∉ o(n²)
Proof:
Assume that f(n) = n² and g(n) = n².
We have to show that f(n) ∉ o(g(n)).
Since
  f(n) < c·g(n)  ⇔  n² < c·n²  ⇔  1 < c,
the inequality holds only for c > 1. But the definition of little-o requires it to hold for EVERY positive c, so this constraint on c violates the definition.
Hence n² ∉ o(n²).
Examples
Example 3: Prove that 1000n² + 1000n ∉ o(n²)
Proof:
Assume that f(n) = 1000n² + 1000n and g(n) = n².
We have to show that f(n) ∉ o(g(n)), i.e. that it is NOT the case that for every c > 0 there exists an n0 with
  0 ≤ f(n) < c·g(n)  ∀ n ≥ n0
Take c = 1000. Then
  1000n² + 1000n < 1000n²  ⇔  1000n < 0,
which is false for every n ≥ 1. So no n0 works for this c, and hence f(n) ∉ o(n²).
Little-Omega Notation (ω)
Little-ω notation is used to denote a lower bound that is not asymptotically tight.
For a given function g(n), ω(g(n)) denotes the set of functions
  ω(g(n)) = { f(n) : for every positive constant c, there exists a constant n0 > 0
              such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
  lim (n→∞) f(n)/g(n) = ∞
e.g., n²/2 ∈ ω(n), but n²/2 ∉ ω(n²).
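Dually (again an illustrative sketch, not a proof): when f ∈ ω(g), the ratio f(n)/g(n) should grow without bound.

f = lambda n: n * n / 2
for n in (10, 100, 1000, 10_000):
    print(n, f(n) / n, f(n) / (n * n))
# f(n)/n grows without bound: n^2/2 is in omega(n)
# f(n)/n^2 stays at 1/2:      n^2/2 is NOT in omega(n^2)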
Examples
Example 1: Prove that 5n² ∈ ω(n)
Proof:
Assume that f(n) = 5n² and g(n) = n.
Is f(n) ∈ ω(g(n))? We have to show that for ANY c > 0 there exists an n0 such that
  c·g(n) < f(n)  ∀ n ≥ n0
  c·n < 5n²  ⇔  c < 5n  ⇔  n > c/5
This holds for any c: for an arbitrary c, e.g. c = 1,000,000, choose any n0 > c/5 = 200,000 and the inequality holds for all n ≥ n0.
Hence f(n) ∈ ω(g(n)).
Examples
Example 2: Prove that 5n + 10 ∉ ω(n)
Proof:
Assume that f(n) = 5n + 10 and g(n) = n.
Is f(n) ∈ ω(g(n))? For ANY c there would have to exist an n0 such that
  c·g(n) < f(n)  ∀ n ≥ n0,  i.e.  c·n < 5n + 10.
If we take c = 16, then 16n < 5n + 10 ⇔ 11n < 10, which is not true for any positive integer n.
Hence f(n) ∉ ω(g(n)).
Examples
Example 3: Prove that 100n ∉ ω(n²)
Proof:
Let f(n) = 100n and g(n) = n².
Assume on the contrary that f(n) ∈ ω(g(n)). Then for ANY c there would exist an n0 such that
  c·g(n) < f(n)  ∀ n ≥ n0
  ⇔ c·n² < 100n  ⇔  c·n < 100
If we take c = 100, this gives n < 1, which is impossible for n ≥ 1.
Hence f(n) ∉ ω(g(n)), i.e. 100n ∉ ω(n²).
Usefulness of Notations
• It is not always possible to determine the behaviour of an algorithm using Θ-notation alone.
• For example, given a problem with n inputs, we may have an algorithm that takes a·n² time when n is even and c·n time when n is odd. OR
• We may prove that an algorithm never uses more than e·n² time and never less than f·n time.
• In either case we can claim neither Θ(n) nor Θ(n²) to be the order of the time usage of the algorithm.
• Big O and Ω notation allow us to give at least partial information.
Relations Over Asymptotic Notations
Reflexive Relation
Definition:
• Let X be a non-empty set and R a relation over X. Then R is said to be reflexive if
  (a, a) ∈ R, ∀ a ∈ X.
Example 1:
• Let G be a graph, and define a relation R over its nodes: if node x is connected to node y, then (x, y) ∈ R. Reflexivity is satisfied over G if every node has a self-loop.
Example 2:
• Let P be the set of all persons, and S a relation over P such that (x, y) ∈ S if x has the same birthday as y.
• Of course this relation is reflexive because
  (x, x) ∈ S, ∀ x ∈ P.
Reflexivity Relations over Θ, Ω and O
Example 1
Since 0 ≤ f(n) ≤ c·f(n) ∀ n ≥ n0 = 1, if c = 1,
hence f(n) = O(f(n)).
Example 2
Since 0 ≤ c·f(n) ≤ f(n) ∀ n ≥ n0 = 1, if c = 1,
hence f(n) = Ω(f(n)).
Example 3
Since 0 ≤ c1·f(n) ≤ f(n) ≤ c2·f(n) ∀ n ≥ n0 = 1, if c1 = c2 = 1,
hence f(n) = Θ(f(n)).
Little o and ω are not Reflexive Relations
Example
Reflexivity would require f(n) < c·f(n) for EVERY c > 0, but this fails for c = 1, since f(n) < f(n) is impossible.
Therefore
1. f(n) ∉ o(f(n)) and
2. f(n) ∉ ω(f(n))
Note:
Hence little o and little ω are not reflexive relations.
Symmetry
Definition:
• Let X be a non-empty set and R a relation over X. Then R is said to be symmetric if
  ∀ a, b ∈ X, (a, b) ∈ R ⇒ (b, a) ∈ R
Example 1:
• Let P be a set of persons, and S a relation over P such that (x, y) ∈ S if x has the same sign as y.
• This relation is symmetric because
  (x, y) ∈ S ⇒ (y, x) ∈ S
Example 2:
• Let P be the set of all persons, and B a relation over P such that (x, y) ∈ B if x is a brother of y.
• This relation is not symmetric: x may be a brother of y while y is a sister of x, so (y, x) ∉ B.
Symmetry over Θ
Property: prove that
  f(n) = Θ(g(n))  ⇔  g(n) = Θ(f(n))
Proof
• Since f(n) = Θ(g(n)), i.e. f(n) ∈ Θ(g(n)), there exist constants c1, c2 > 0 and n0 ∈ N such that
  0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n0   (1)
(1) ⇒ 0 ≤ f(n) ≤ c2·g(n)  ⇒  0 ≤ (1/c2)·f(n) ≤ g(n)   (2)
(1) ⇒ 0 ≤ c1·g(n) ≤ f(n)  ⇒  0 ≤ g(n) ≤ (1/c1)·f(n)   (3)
Symmetry over Θ
From (2) and (3): 0 ≤ (1/c2)·f(n) ≤ g(n) ≤ (1/c1)·f(n)
Let c3 = 1/c2 and c4 = 1/c1. The above implies
  0 ≤ c3·f(n) ≤ g(n) ≤ c4·f(n)  ∀ n ≥ n0
  ⇒ g(n) = Θ(f(n))
Hence f(n) = Θ(g(n)) ⇒ g(n) = Θ(f(n)); the reverse direction follows by the same argument with the roles of f and g exchanged.
Exercise:
Prove that big O, big Ω, little o, and little ω do not satisfy the symmetry property.
Transitivity
Definition:
• Let X be a non-empty set and R a relation over X. Then R is said to be transitive if
  ∀ a, b, c ∈ X, (a, b) ∈ R ∧ (b, c) ∈ R ⇒ (a, c) ∈ R
Example 1:
• Let P be the set of all persons, and B a relation over P such that (x, y) ∈ B if x is a brother of y.
• This relation is transitive because
  (x, y) ∈ B ∧ (y, z) ∈ B ⇒ (x, z) ∈ B
Example 2:
• Let P be the set of all persons, and F a relation over P such that (x, y) ∈ F if x is the father of y.
• Of course this relation is not transitive: the father of a father is a grandfather, not a father.
Transitivity Relations over Θ, Ω, O, o and ω
Prove the following:
1. f(n) = Θ(g(n)) & g(n) = Θ(h(n))  ⇒  f(n) = Θ(h(n))
2. f(n) = O(g(n)) & g(n) = O(h(n))  ⇒  f(n) = O(h(n))
3. f(n) = Ω(g(n)) & g(n) = Ω(h(n))  ⇒  f(n) = Ω(h(n))
4. f(n) = o(g(n)) & g(n) = o(h(n))  ⇒  f(n) = o(h(n))
5. f(n) = ω(g(n)) & g(n) = ω(h(n))  ⇒  f(n) = ω(h(n))
Transitivity Relation over Θ
Property 1
f(n) = Θ(g(n)) & g(n) = Θ(h(n))  ⇒  f(n) = Θ(h(n))
Proof
1. Since f(n) = Θ(g(n)), i.e. f(n) ∈ Θ(g(n)), there exist constants c1, c2 > 0 and n01 ∈ N such that
   0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)  ∀ n ≥ n01   (1)
2. Since g(n) = Θ(h(n)), i.e. g(n) ∈ Θ(h(n)), there exist constants c3, c4 > 0 and n02 ∈ N such that
   0 ≤ c3·h(n) ≤ g(n) ≤ c4·h(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02).
Transitivity Relation over Θ
4. We have to show that f(n) = Θ(h(n)), i.e. we have to find constants c5, c6 > 0 and n0 ∈ N such that
   0 ≤ c5·h(n) ≤ f(n) ≤ c6·h(n).
(2) ⇒ 0 ≤ c3·h(n) ≤ g(n)   (3)
(1) ⇒ 0 ≤ c1·g(n) ≤ f(n)  ⇒  0 ≤ g(n) ≤ (1/c1)·f(n)   (4)
From (3) and (4): 0 ≤ c3·h(n) ≤ g(n) ≤ (1/c1)·f(n)
   ⇒ 0 ≤ c1·c3·h(n) ≤ f(n)   (5)
Transitivity Relation over Θ
(1) ⇒ 0 ≤ f(n) ≤ c2·g(n)  ⇒  0 ≤ (1/c2)·f(n) ≤ g(n)   (6)
(2) ⇒ 0 ≤ g(n) ≤ c4·h(n)   (7)
From (6) and (7): 0 ≤ (1/c2)·f(n) ≤ g(n) ≤ c4·h(n)
   ⇒ 0 ≤ (1/c2)·f(n) ≤ c4·h(n)
   ⇒ 0 ≤ f(n) ≤ c2·c4·h(n)   (8)
From (5) and (8): 0 ≤ c1·c3·h(n) ≤ f(n) ≤ c2·c4·h(n)  ∀ n ≥ n0,
i.e. 0 ≤ c5·h(n) ≤ f(n) ≤ c6·h(n) with c5 = c1·c3 and c6 = c2·c4. Hence f(n) = Θ(h(n)).
Transitivity Relation over Big O
Property 2
f(n) = O(g(n)) & g(n) = O(h(n))  ⇒  f(n) = O(h(n))
Proof
1. Since f(n) = O(g(n)), i.e. f(n) ∈ O(g(n)), there exist a constant c1 > 0 and n01 ∈ N such that
   0 ≤ f(n) ≤ c1·g(n)  ∀ n ≥ n01   (1)
2. Since g(n) = O(h(n)), i.e. g(n) ∈ O(h(n)), there exist a constant c2 > 0 and n02 ∈ N such that
   0 ≤ g(n) ≤ c2·h(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02).
Transitivity Relation over Big O
Now we have the two inequalities
   0 ≤ f(n) ≤ c1·g(n)  ∀ n ≥ n01   (1)
   0 ≤ g(n) ≤ c2·h(n)  ∀ n ≥ n02   (2)
(2) ⇒ 0 ≤ c1·g(n) ≤ c1·c2·h(n)  ∀ n ≥ n02   (3)
From (1) and (3):
   0 ≤ f(n) ≤ c1·g(n) ≤ c1·c2·h(n)
Now let c3 = c1·c2:
   0 ≤ f(n) ≤ c3·h(n)  ∀ n ≥ n0
And hence f(n) = O(h(n)).
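A tiny concrete instance of the composed witness c3 = c1·c2 (my own example, not from the slides), with f(n) = 3n, g(n) = n² and h(n) = n³:

f = lambda n: 3 * n
g = lambda n: n * n
h = lambda n: n ** 3
c1, c2 = 3, 1        # f(n) <= c1*g(n) and g(n) <= c2*h(n) for n >= 1
c3 = c1 * c2         # composed witness for f = O(h)
print(all(f(n) <= c1 * g(n) and g(n) <= c2 * h(n) and f(n) <= c3 * h(n)
          for n in range(1, 1000)))  # True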
Transitivity Relation over Big Ω
Property 3
f(n) = Ω(g(n)) & g(n) = Ω(h(n))  ⇒  f(n) = Ω(h(n))
Proof
1. Since f(n) = Ω(g(n)), there exist a constant c1 > 0 and n01 ∈ N such that
   0 ≤ c1·g(n) ≤ f(n)  ∀ n ≥ n01   (1)
2. Since g(n) = Ω(h(n)), there exist a constant c2 > 0 and n02 ∈ N such that
   0 ≤ c2·h(n) ≤ g(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02).
Transitivity Relation over Big Ω
4. We have to show that f(n) = Ω(h(n)), i.e. we have to find a constant c3 > 0 and n0 ∈ N such that
   0 ≤ c3·h(n) ≤ f(n)  ∀ n ≥ n0.
(2) ⇒ 0 ≤ c2·h(n) ≤ g(n)
(1) ⇒ 0 ≤ c1·g(n) ≤ f(n)  ⇒  0 ≤ g(n) ≤ (1/c1)·f(n)   (3)
From (2) and (3): 0 ≤ c2·h(n) ≤ g(n) ≤ (1/c1)·f(n)
   ⇒ 0 ≤ c1·c2·h(n) ≤ f(n), hence f(n) = Ω(h(n)) with c3 = c1·c2, ∀ n ≥ n0.
Transitivity Relation over little o
Property 4
f(n) = o(g(n)) & g(n) = o(h(n))  ⇒  f(n) = o(h(n))
Proof
1. Since f(n) = o(g(n)), i.e. f(n) ∈ o(g(n)), for every constant c1 > 0 there exists an n01 ∈ N such that
   0 ≤ f(n) < c1·g(n)  ∀ n ≥ n01   (1)
2. Since g(n) = o(h(n)), i.e. g(n) ∈ o(h(n)), for every constant c2 > 0 there exists an n02 ∈ N such that
   0 ≤ g(n) < c2·h(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02).
Transitivity Relation over little o
Now we have the two inequalities
   0 ≤ f(n) < c1·g(n)  ∀ n ≥ n01   (1)
   0 ≤ g(n) < c2·h(n)  ∀ n ≥ n02   (2)
(2) ⇒ 0 ≤ c1·g(n) < c1·c2·h(n)  ∀ n ≥ n02   (3)
From (1) and (3):
   0 ≤ f(n) < c1·g(n) < c1·c2·h(n)  ∀ n ≥ n0
Given any target constant c > 0, choose c1 and c2 with c1·c2 = c (e.g. c1 = c2 = √c); then 0 ≤ f(n) < c·h(n) ∀ n ≥ n0.
And hence f(n) = o(h(n)).
Transitivity Relation over little ω
Property 5
f(n) = ω(g(n)) & g(n) = ω(h(n))  ⇒  f(n) = ω(h(n))
Proof
1. Since f(n) = ω(g(n)), for every constant c1 > 0 there exists an n01 ∈ N such that
   0 ≤ c1·g(n) < f(n)  ∀ n ≥ n01   (1)
2. Since g(n) = ω(h(n)), for every constant c2 > 0 there exists an n02 ∈ N such that
   0 ≤ c2·h(n) < g(n)  ∀ n ≥ n02   (2)
3. Let n0 = max(n01, n02).
Transitivity Relation over little ω
4. We have to show that f(n) = ω(h(n)), i.e. for every constant c > 0 there must exist an n0 ∈ N such that
   0 ≤ c·h(n) < f(n)  ∀ n ≥ n0.
(2) ⇒ 0 ≤ c2·h(n) < g(n)
(1) ⇒ 0 ≤ c1·g(n) < f(n)  ⇒  0 ≤ g(n) < (1/c1)·f(n)   (3)
From (2) and (3): 0 ≤ c2·h(n) < g(n) < (1/c1)·f(n)
   ⇒ 0 ≤ c1·c2·h(n) < f(n). Choosing c1·c2 = c gives f(n) = ω(h(n)), ∀ n ≥ n0.
Transpose Symmetry
Property 1
Prove that f(n) = O(g(n))  ⇔  g(n) = Ω(f(n))
Proof
Since f(n) = O(g(n)), there exist constants c > 0 and n0 ∈ N such that
  0 ≤ f(n) ≤ c·g(n)  ∀ n ≥ n0
Dividing both sides by c:
  0 ≤ (1/c)·f(n) ≤ g(n)  ∀ n ≥ n0
Put 1/c = c′:
  0 ≤ c′·f(n) ≤ g(n)  ∀ n ≥ n0
Hence g(n) = Ω(f(n)); the reverse direction is analogous.
Transpose Symmetry
Property 2
Prove that f(n) = o(g(n))  ⇔  g(n) = ω(f(n))
Proof
Since f(n) = o(g(n)), for every constant c > 0 there exists an n0 ∈ N such that
  0 ≤ f(n) < c·g(n)  ∀ n ≥ n0
Dividing both sides by c:
  0 ≤ (1/c)·f(n) < g(n)  ∀ n ≥ n0
Put 1/c = c′; as c ranges over all positive constants, so does c′:
  0 ≤ c′·f(n) < g(n)  ∀ n ≥ n0
Hence g(n) = ω(f(n)).
Relations between Θ, Ω and O
Trichotomy property over real numbers
• For any two real numbers a and b, exactly one of the following must hold: a < b, a = b, or a > b.
The asymptotic comparison of two functions f and g is analogous to the comparison of two real numbers a and b:
1. f(n) = O(g(n))  ≈  a ≤ b
2. f(n) = Ω(g(n))  ≈  a ≥ b
3. f(n) = Θ(g(n))  ≈  a = b
4. f(n) = o(g(n))  ≈  a < b
5. f(n) = ω(g(n))  ≈  a > b
Unlike real numbers, however, not every pair of functions is asymptotically comparable, so trichotomy does not carry over to asymptotic notation.
Some Other Standard Notations
Monotonicity
A function f is:
• monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n)
• monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n)
• strictly increasing if m < n ⇒ f(m) < f(n)
• strictly decreasing if m < n ⇒ f(m) > f(n)
Polynomials
• Given a nonnegative integer d, a polynomial in n of degree d is a function of the form
    p(n) = Σ aᵢ·nⁱ, summed over i = 0, …, d,
  where the constants aᵢ are the coefficients of the polynomial and a_d ≠ 0.
Standard Logarithm Notations
Some definitions:
Exponent
• x = log_b a is the exponent such that a = bˣ.
Natural log
• ln a = log_e a
Binary log
• lg a = log₂ a
Square of log
• lg²a = (lg a)²
Log of log
• lg lg a = lg(lg a)
Standard Logarithms Notations
a b logb a
log c (ab) log c a  log c b
log b a n nlog b a
log c a
log b a 
log c b
log b (1/a)  log b a
1
log b a 
log a b
a logb c c logb a
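These identities can be spot-checked numerically with Python's math.log (illustrative values; any valid positive a, b, c work):

import math

a, b, c = 8.0, 2.0, 10.0
print(math.isclose(b ** math.log(a, b), a))                               # a = b^(log_b a)
print(math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c)))  # log_c(ab)
print(math.isclose(math.log(a ** 3, b), 3 * math.log(a, b)))              # log_b(a^n), n = 3
print(math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c)))      # change of base
print(math.isclose(math.log(1 / a, b), -math.log(a, b)))                  # log_b(1/a)
print(math.isclose(math.log(a, b), 1 / math.log(b, a)))                   # reciprocal rule
print(math.isclose(a ** math.log(c, b), c ** math.log(a, b)))             # a^(log_b c) = c^(log_b a)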