05 - Data Structures - Asymptotic Analysis
Asymptotic Notations
Following are the asymptotic notations commonly used to express the running time
complexity of an algorithm.
Ο − Big Oh Notation
Ω − Big Omega Notation
Θ − Theta Notation
o − Little Oh Notation
ω − Little Omega Notation
Big Oh Notation, Ο
The notation Ο(n) is the formal way to express the upper bound of an algorithm's
running time. It measures the worst case time complexity, i.e. the longest amount of
time an algorithm can possibly take to complete.
Ο(f(n)) = { g(n) : there exist constants c > 0 and n0 such that g(n) ≤ c·f(n) for all n > n0 }
Example
Considering g(n) = n³
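A minimal worked instance (the specific function f(n) below is an assumed illustration, not
taken from the original text): if f(n) = 3n³ + 2n² and g(n) = n³, then f(n) ≤ 5·g(n) for all
n ≥ 1, so f(n) can be represented as Ο(g(n)), i.e. Ο(n³), with c = 5 and n0 = 0.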
Omega Notation, Ω
The notation Ω(n) is the formal way to express the lower bound of an algorithm's
running time. It measures the best case time complexity, i.e. the shortest amount of time
an algorithm can possibly take to complete.
Ω(f(n)) = { g(n) : there exist constants c > 0 and n0 such that g(n) ≥ c·f(n) for all n > n0 }
Example
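A minimal worked instance (same assumed illustration as above): for f(n) = 3n³ + 2n² and
g(n) = n³, f(n) ≥ 1·g(n) for all n ≥ 1, so f(n) can be represented as Ω(g(n)), i.e. Ω(n³),
with c = 1 and n0 = 0.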
Theta Notation, θ
The notation θ(n) is the formal way to express both the lower bound and the upper
bound of an algorithm's running time. Theta notation is sometimes mistaken for the
average case time complexity; while it can often describe the average case closely, what
it actually states is that the bound is asymptotically tight, and other notations can be
used for the average case as well. It is represented as follows −
θ(f(n)) = { g(n) : g(n) = Ο(f(n)) and g(n) = Ω(f(n)) }
Example
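Continuing the same assumed illustration: f(n) = 3n³ + 2n² satisfies both f(n) = Ο(n³) and
f(n) = Ω(n³), as shown above, so by the definition f(n) = θ(n³); in other words, n³ is an
asymptotically tight bound for f(n).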
The Little Oh and Little Omega notations also describe upper and lower bounds,
respectively, but in contrast to the Big Oh and Big Omega notations these bounds are not
asymptotically tight. Therefore, the most commonly used notations to represent time
complexities are the Big Oh and Big Omega notations.
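For reference, the non-tight bounds are commonly defined as follows (a standard
formulation added here as a sketch, not quoted from the original text):
o(f(n)) = { g(n) : for every c > 0 there exists n0 such that g(n) < c·f(n) for all n > n0 }
ω(f(n)) = { g(n) : for every c > 0 there exists n0 such that g(n) > c·f(n) for all n > n0 }
For example, n² = o(n³) and n³ = ω(n²), but n³ is not o(n³), since the strict inequality
fails for c = 1.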
Common Asymptotic Notations
Following is a list of some common asymptotic notations −
constant − O(1)
logarithmic − O(log n)
linear − O(n)
quadratic − O(n²)
cubic − O(n³)
polynomial − n^O(1)
exponential − 2^O(n)
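As a rough illustration of a few of these growth rates, the sketch below shows small
Python functions whose running times fall into some of the classes above (the function
names and inputs are assumptions added for demonstration, not part of the original text):

def first_element(items):
    # constant, O(1): a single operation regardless of input size
    return items[0]

def binary_search(sorted_items, target):
    # logarithmic, O(log n): the search range is halved on every step
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def total(items):
    # linear, O(n): every element is visited exactly once
    result = 0
    for x in items:
        result += x
    return result

def all_pairs(items):
    # quadratic, O(n^2): a nested loop over all pairs of elements
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs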