AAA-5
(Asymptotic Notations)
What is Complexity?
• The level of difficulty in solving mathematically
posed problems, as measured by
– the time required
(time complexity)
– the number of steps or arithmetic operations
(computational complexity)
– the memory space required
(space complexity)
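To make these measures concrete, here is a minimal sketch (entirely our own illustration; the operation counter is not part of any library) that counts the basic operations of a simple summation loop, the quantity that time/computational complexity tracks:

def sum_list(values):
    # Count basic operations alongside the actual work; 'ops' is our
    # own bookkeeping device for illustrating computational complexity.
    ops = 0
    total = 0
    for v in values:
        total += v   # one addition per element
        ops += 1
    return total, ops

print(sum_list([3, 1, 4, 1, 5]))  # -> (14, 5): n elements cost n additions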
Major Factors in Algorithm Design
1. Correctness
An algorithm is said to be correct if
• for every input, it halts with the correct output.
• An incorrect algorithm might not halt at all, or
• it might halt with an answer other than the desired one.
• A correct algorithm solves the given computational problem.
2. Algorithm Efficiency
To measure the efficiency of an algorithm,
• analyze it, i.e., determine its growth rate.
• compare the efficiencies of different algorithms for the
same problem.
Algorithm Growth Rates
• Growth rate measures algorithm efficiency
What does “efficient” mean?
▪ An algorithm is efficient if its running time is bounded by a
polynomial in the input size
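A quick sketch (our own illustration) of why “bounded by a polynomial” is the usual cut-off for efficiency, contrasting a quadratic step count with an exponential one:

# Compare a polynomially bounded step count (n^2) with an
# exponential one (2^n) as the input size n grows.
for n in [10, 20, 30, 40]:
    print(n, n**2, 2**n)
# At n = 40: n^2 = 1600 steps, while 2^n ≈ 1.1e12 steps --
# only the polynomially bounded algorithm remains practical.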
Notations for Asymptotic Performance
• Describe how running time increases with input size
• O, Omega, Theta, etc. denote asymptotic running time
• These notations are defined in terms of functions whose
domains are the natural numbers
• Convenient for describing worst-case running time
• Asymptotically efficient algorithms are usually the best choice
Complexity Analysis
• Algorithm analysis means predicting the resources an algorithm requires, such as
– computational time
– memory
– computer hardware, etc.
• Worst case analysis
– Provides an upper bound on running time
– An absolute guarantee
• Average case analysis
– Provides the expected running time
– Very useful, but treat with care: what is “average”?
• Random (equally likely) inputs
• Real-life inputs
Worst-case Analysis
Let us suppose that
• Dn = set of inputs of size n for the problem
• I = an element of Dn.
• t(I) = number of basic operations performed on I
• Define a function W by
W(n) = max{ t(I) | I ∈ Dn }
called the worst-case complexity of the algorithm
• W(n) is the maximum number of basic operations
performed by the algorithm on any input of size n.
• Please note that the input, I, for which an algorithm
behaves worst depends on the particular algorithm.
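A minimal sketch of this definition, assuming linear search as the algorithm, comparisons as the basic operation, and permutations of 1..n (searched for the key 1) as a small stand-in for Dn; all names here are our own:

from itertools import permutations

def t(I):
    # t(I): number of comparisons linear search performs on input I
    seq, key = I
    count = 0
    for x in seq:
        count += 1
        if x == key:
            break
    return count

def W(n):
    # W(n) = max{ t(I) | I in Dn }
    Dn = [(p, 1) for p in permutations(range(1, n + 1))]
    return max(t(I) for I in Dn)

print(W(4))  # -> 4: the worst input puts the key in the last position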
Average Complexity
• Let Pr(I) be the probability that input I occurs.
• Then the average behavior of the algorithm is defined as
A(n) = Σ Pr(I)·t(I), where the sum ranges over all I ∈ Dn
• We determine t(I) by analyzing the algorithm, but Pr(I)
cannot be computed analytically; it must be assumed as part of the input model.
• Average cost =
A(n) = Pr(succ)Asucc(n) + Pr(fail)Afail(n)
• An element I in Dn may be thought of as a set, or
equivalence class, of inputs that affect the behavior of the algorithm in the same way
Computing Average Cost
• Take all possible inputs, compute the cost of each, and take the (probability-weighted) average
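The sketch below works both formulas out for sequential search (a standard example; the uniform key-position distribution and the success probability are our illustrative assumptions):

def A(n, pr_succ=1.0):
    # A(n) = sum over I in Dn of Pr(I) * t(I).
    # Assume a present key is equally likely at each of the n positions,
    # so a search ending at position i costs i comparisons.
    A_succ = sum((1.0 / n) * i for i in range(1, n + 1))  # = (n + 1) / 2
    A_fail = n            # an unsuccessful search scans all n items
    # A(n) = Pr(succ) * Asucc(n) + Pr(fail) * Afail(n)
    return pr_succ * A_succ + (1.0 - pr_succ) * A_fail

print(A(10))        # key always present: (10 + 1) / 2 = 5.5
print(A(10, 0.5))   # present half the time: 0.5*5.5 + 0.5*10 = 7.75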
Asymptotic Notations Properties
• Categorize algorithms based on asymptotic growth
rate, e.g. linear, quadratic, polynomial, exponential
• Ignore constant factors and small inputs
• Estimate an upper bound and a lower bound on the growth
rate of the time-complexity function
• Describe the running time of an algorithm as n grows to ∞.
• Describe the behavior of a function in the limit.
Limitations
• Not always useful for analysis of fixed-size inputs.
• All results are for sufficiently large inputs.
Asymptotic Notations
Asymptotic Notations: Θ, O, Ω, o, ω
▪ We use Θ to mean “order exactly”,
▪ O to mean “order at most”,
▪ Ω to mean “order at least”,
▪ o to mean an upper bound that is not asymptotically tight,
▪ ω to mean a lower bound that is not asymptotically tight.
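For reference, the formal definitions behind these phrases (standard textbook definitions, stated here for completeness):
▪ O(g(n)) = { f(n) : ∃ c, n0 > 0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
▪ Ω(g(n)) = { f(n) : ∃ c, n0 > 0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
▪ Θ(g(n)) = { f(n) : ∃ c1, c2, n0 > 0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
▪ o(g(n)) = { f(n) : for every c > 0 there exists n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }
▪ ω(g(n)) = { f(n) : for every c > 0 there exists n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }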
f(n) ∈ O(g(n))
Example: Prove that 2n² ∈ O(n²)
Proof:
Assume that f(n) = 2n², and g(n) = n²
We have to find c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.
Since
f(n) ≤ c·g(n) → 2n² ≤ c·n² → 2 ≤ c, take c = 2, n0 = 1
Then
2n² ≤ 2·n² for c = 2 and n ≥ 1
Hence, 2n² ∈ O(n²), where c = 2 and n0 = 1
Examples
Example 3: Prove that 1000·n² + 1000·n ∈ O(n²)
Proof:
Assume that f(n) = 1000·n² + 1000·n, and g(n) = n²
We have to find c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
Try c = 1001: 1000·n² + 1000·n ≤ 1001·n²
⇔ 1000·n ≤ n² ⇔ n² − 1000·n ≥ 0
⇔ n·(n − 1000) ≥ 0, which is true for n ≥ 1000
Hence f(n) ≤ c·g(n) for all n ≥ n0, with c = 1001 and n0 = 1000
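As a quick sanity check (our own verification script, not part of the proofs), the witnesses found in the two Big-O examples above can be tested numerically:

def check_big_o(f, g, c, n0, n_max=10_000):
    # Verify f(n) <= c*g(n) for all n in [n0, n_max).
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

# 2n^2 in O(n^2) with c = 2, n0 = 1
print(check_big_o(lambda n: 2 * n**2, lambda n: n**2, c=2, n0=1))    # True
# 1000n^2 + 1000n in O(n^2) with c = 1001, n0 = 1000
print(check_big_o(lambda n: 1000 * n**2 + 1000 * n, lambda n: n**2,
                  c=1001, n0=1000))                                  # True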
Big-Omega Notation
f(n) ∈ Ω(g(n))
Intuitively: the set of all functions whose rate of growth is the same as or higher
than that of g(n).
Theta Notation
f(n) ∈ Θ(g(n))
Intuitively: the set of all functions that have the same rate of growth as g(n).
Example: Prove that ½·n² − ½·n ∈ Θ(n²)
Proof:
Assume that f(n) = ½·n² − ½·n, and g(n) = n²
We have to find c1, c2 and n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
Upper bound: ½·n² − ½·n ≤ ½·n² for all n ≥ 1, so take c2 = ½.
Lower bound: for n ≥ 2 we have ½·n ≤ ¼·n², so ½·n² − ½·n ≥ ½·n² − ¼·n² = ¼·n²; take c1 = ¼.
Hence ¼·n² ≤ ½·n² − ½·n ≤ ½·n² for all n ≥ 2, i.e., f(n) ∈ Θ(n²) with c1 = ¼, c2 = ½, and n0 = 2.
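A numeric spot-check of these bounds (our own verification, not part of the proof):

# Verify c1*g(n) <= f(n) <= c2*g(n) with c1 = 1/4, c2 = 1/2, n0 = 2.
f = lambda n: 0.5 * n**2 - 0.5 * n
g = lambda n: n**2
print(all(0.25 * g(n) <= f(n) <= 0.5 * g(n) for n in range(2, 10_000)))  # True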
Little-o Notation
f(n) ∈ o(g(n)) means that g(n) is an upper bound for f(n) that is not asymptotically tight.
Formally: lim (n→∞) f(n)/g(n) = 0
e.g., 2n ∈ o(n²), but 2n² ∉ o(n²).
Examples
Example 1: Prove that 2n² ∈ o(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) ∈ o(g(n))?
We have to show that for every c > 0 there exists n0 such that f(n) < c·g(n) for all n ≥ n0.
Since
2n² < c·n³ ⇔ 2 < c·n ⇔ n > 2/c, we can take n0 = ⌊2/c⌋ + 1 for any given c.
Hence 2n² ∈ o(n³).
(Equivalently, lim (n→∞) 2n²/n³ = lim (n→∞) 2/n = 0.)
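The limit definition can also be checked numerically; the sketch below (our own) shows the ratio 2n²/n³ shrinking toward 0 and picks an n0 for a given c:

import math

# lim 2n^2 / n^3 = lim 2/n = 0, so 2n^2 is in o(n^3).
for n in [10, 100, 1000, 10_000]:
    print(n, 2 * n**2 / n**3)   # 0.2, 0.02, 0.002, 0.0002

def n0_for(c):
    # 2n^2 < c*n^3  <=>  n > 2/c, so any n0 beyond 2/c works.
    return math.floor(2 / c) + 1

print(n0_for(0.01))  # 201: 2n^2 < 0.01*n^3 holds for all n >= 201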
Little-omega Notation
f(n) ∈ ω(g(n)) means that g(n) is a lower bound for f(n) that is not asymptotically tight.
Formally: lim (n→∞) f(n)/g(n) = ∞
e.g., n²/2 ∈ ω(n), but n²/2 ∉ ω(n²).
Examples
Example 1: Prove that 5·n² ∈ ω(n)
Proof:
Assume that f(n) = 5·n², and g(n) = n
f(n) ∈ ω(g(n))?
We have to prove that for any c there exists n0 s.t.
c·g(n) < f(n) for all n ≥ n0
c·n < 5·n² ⇔ c < 5·n ⇔ n > c/5
This is true for any arbitrary c, e.g. c = 1000000: we can
choose n0 = 1000000/5 + 1 = 200001, and the inequality
holds for all n ≥ n0.
Hence f(n) ∈ ω(g(n)).
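The witness can be spot-checked numerically (our own sketch): for any c, n0 = ⌊c/5⌋ + 1 makes c·n < 5n² hold from n0 onward.

# For f(n) = 5n^2 and g(n) = n: c*n < 5*n^2  <=>  n > c/5.
def check_little_omega(c, how_many=10_000):
    n0 = c // 5 + 1   # first integer n with n > c/5
    return all(c * n < 5 * n**2 for n in range(n0, n0 + how_many))

print(check_little_omega(1_000_000))  # True, with n0 = 200001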