Learn X in Y Minutes - Scenic Programming Language Tours
Asymptotic Notations
What are they?
Asymptotic Notations are languages that allow us to analyze an algorithm’s
running time by identifying its behavior as the input size for the algorithm
increases. This is also known as an algorithm’s growth rate. Does the algorithm
suddenly become incredibly slow when the input size grows? Does it mostly
maintain its quick run time as the input size increases? Asymptotic Notation gives
us the ability to answer these questions.
In the first section of this doc, we described how an Asymptotic Notation identifies
the behavior of an algorithm as the input size changes. Let us imagine an
algorithm as a function f, n as the input size, and f(n) being the running time. So
for a given algorithm f, with input size n you get some resultant run time f(n). This
results in a graph where the Y-axis is the runtime, the X-axis is the input size, and
plot points are the resultants of the amount of time for a given input size.
Logarithmic - log n
Linear - n
Quadratic - n^2
Polynomial - n^z, where z is some constant
Exponential - a^n, where a is some constant
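These growth rates can be compared numerically. A small sketch (the function names below are illustrative, not from the text):

```python
import math

# Illustrative representatives of a few common growth-rate classes.
def logarithmic(n): return math.log2(n)
def linear(n): return n
def quadratic(n): return n ** 2
def exponential(n): return 2 ** n

# Even at modest input sizes the classes separate dramatically.
for n in (8, 16, 32):
    print(n, logarithmic(n), linear(n), quadratic(n), exponential(n))
```

Tabulating a few values like this makes the differences concrete: by n = 32 the exponential function has already outpaced the quadratic one by a factor of millions.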
Big-O
Example 1
Is f(n) O(g(n))? Is 3 log n + 100 O(log n)? Recall the definition of Big-O: f(n) is O(g(n)), if for some real constants c (c > 0) and n0 (n0 > 0), f(n) <= c g(n) for every input size n (n > n0). Here f(n) = 3 log n + 100 and g(n) = log n. Is there some pair of constants c, n0 that satisfies 3 log n + 100 <= c log n for all n > n0? Yes: c = 150 and n0 = 2 work. The definition of Big-O has been met, therefore f(n) is O(g(n)).
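This bound can be checked numerically. A minimal sketch, assuming f(n) = 3 log n + 100 and g(n) = log n as in the example, with c = 150, n0 = 2 as one valid choice of witness constants:

```python
import math

def f(n): return 3 * math.log2(n) + 100
def g(n): return math.log2(n)

def big_o_holds(c, n0, upto=10_000):
    """Check f(n) <= c * g(n) for all integer n in (n0, upto]."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, upto + 1))

print(big_o_holds(c=150, n0=2))
```

A finite scan like this is not a proof, but it is a quick sanity check that the chosen constants behave as claimed; here the ratio f(n)/g(n) = 3 + 100/log n is largest at small n and shrinks toward 3.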
Example 2
f(n) = 3*n^2
g(n) = n
3 * n^2 <= c * n
Is there some pair of constants c, n0 that satisfies this for all n > n0? No, there isn't: dividing both sides by n gives 3n <= c, and no fixed constant c stays above 3n as n grows.
f(n) is NOT O(g(n)).
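The failure can be seen numerically: the ratio f(n)/g(n) = 3n is unbounded, so any candidate constant is eventually exceeded. A small sketch (the helper name is illustrative):

```python
def f(n): return 3 * n ** 2
def g(n): return n

def bound_fails(c):
    """Find the first n at which f(n) > c * g(n).

    Since f(n) / g(n) = 3n grows without bound, this loop
    terminates for every fixed c.
    """
    n = 1
    while f(n) <= c * g(n):
        n += 1
    return n

print(bound_fails(100))  # the first n where c = 100 stops working
```

Whatever c you pick, the function returns some finite n, which is exactly why no pair (c, n0) can satisfy the definition.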
Big-Omega
f(n) is Ω(g(n)), if for some real constants c (c > 0) and n0 (n0 > 0), f(n) is >= c
g(n) for every input size n (n > n0).
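For instance, the f(n) = 3n^2 from Example 2, while not O(n), is Ω(n). A lower-bound check in the same spirit as before, with c = 1 and n0 = 1 as one valid choice of witness constants:

```python
def f(n): return 3 * n ** 2
def g(n): return n

def big_omega_holds(c, n0, upto=10_000):
    """Check f(n) >= c * g(n) for all integer n in (n0, upto]."""
    return all(f(n) >= c * g(n) for n in range(n0 + 1, upto + 1))

print(big_omega_holds(c=1, n0=1))
```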
Note
The asymptotic growth rates provided by big-O and big-omega notation may or
may not be asymptotically tight. Thus we use small-o and small-omega notation to
denote bounds that are not asymptotically tight.
Small-o
f(n) is o(g(n)), if for all real constants c (c > 0) there is some n0 (n0 > 0) such that f(n) < c g(n) for every input size n (n > n0).
The definitions of O-notation and o-notation are similar. The main difference is that in f(n) = O(g(n)), the bound f(n) <= c g(n) holds for some constant c > 0, but in f(n) = o(g(n)), the bound f(n) < c g(n) holds for all constants c > 0.
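Equivalently (when the limit exists), f(n) is o(g(n)) exactly when f(n)/g(n) tends to 0. For example, log n is o(n), which we can probe numerically (the specific functions here are illustrative):

```python
import math

def f(n): return math.log2(n)
def g(n): return n

# The ratio f(n)/g(n) shrinks toward 0, so for ANY c > 0 the bound
# f(n) < c * g(n) eventually holds -- the hallmark of little-o.
ratios = [f(n) / g(n) for n in (10, 100, 1000, 10_000)]
print(ratios)
```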
Small-omega
f(n) is ω(g(n)), if for all real constants c (c > 0) there is some n0 (n0 > 0) such that f(n) > c g(n) for every input size n (n > n0).
The definitions of Ω-notation and ω-notation are similar. The main difference is that in f(n) = Ω(g(n)), the bound f(n) >= c g(n) holds for some constant c > 0, but in f(n) = ω(g(n)), the bound f(n) > c g(n) holds for all constants c > 0.
Theta
f(n) is Θ(g(n)), if for some real constants c1, c2 and n0 (c1 > 0, c2 > 0, n0 > 0), c1 g(n) < f(n) < c2 g(n) for every input size n (n > n0).
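A two-sided check in the same spirit as the earlier examples. The functions and witness constants below (f(n) = 3n^2 + 2n, g(n) = n^2, c1 = 3, c2 = 4, n0 = 2) are illustrative choices, not from the text:

```python
def f(n): return 3 * n ** 2 + 2 * n
def g(n): return n ** 2

def theta_holds(c1, c2, n0, upto=10_000):
    """Check c1*g(n) < f(n) < c2*g(n) for all integer n in (n0, upto]."""
    return all(c1 * g(n) < f(n) < c2 * g(n) for n in range(n0 + 1, upto + 1))

print(theta_holds(c1=3, c2=4, n0=2))
```

Sandwiching f between constant multiples of g like this is what it means for 3n^2 + 2n to be Θ(n^2): the lower-order term 2n only matters below n0.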
Endnotes
It’s hard to keep this kind of topic short, and you should go through the books and
online resources listed. They go into much greater depth with definitions and
examples. More where X = 'Algorithms & Data Structures' is on its way; we'll have a
doc up on analyzing actual code examples soon.
Books
Algorithms (http://www.amazon.com/Algorithms-4th-Robert-
Sedgewick/dp/032157351X)
Algorithm Design (http://www.amazon.com/Algorithm-Design-Foundations-
Analysis-Internet/dp/0471383651)
Online Resources
MIT (http://web.mit.edu/16.070/www/lecture/big_o.pdf)
KhanAcademy (https://www.khanacademy.org/computing/computer-
science/algorithms/asymptotic-notation/a/asymptotic-notation)
Big-O Cheatsheet (http://bigocheatsheet.com/) - common structures,
operations, and algorithms, ranked by complexity.