InterviewBit - Time Complexity
The rate of growth of the time taken for program execution is what we are interested in.
In algorithmic time complexity, we are mostly interested in how the running time changes with
the size of the input data, and not in the specific execution time of the program.
We assume a model computing machine to fix the hardware specification, and will check
different input sizes to understand how the time complexity of the problem changes.
● Single Processor
● 32 bit
● Sequential Execution (Parallel Processing is another alternative)
● 1 unit time for arithmetical and logical operations
● 1 unit time for assignment and return
For a function that adds two numbers and returns the result, the time complexity is 2 units
(one for the addition and one for returning the value).
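The function the 2-unit count refers to is not shown in these notes; a minimal sketch of such a two-input sum (the name `total` is assumed) would be:

```python
def total(a, b):
    # 1 unit for the addition + 1 unit for the return = 2 units in this model
    return a + b
```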
www.startupanalytics.co.in
Time Complexity of different input-type problems
How the time taken grows for different input sizes depends on the time complexity of the
program.
Asymptotic Notations
We are interested in coming up with a generic way to quantify the time taken by an algorithm
without binding it to a specific machine or hardware. We are broadly interested in the rate of
growth of the execution time, not in the specifics of machine capabilities.
Big-O Notation
The definition: f(n) = O(g(n)) if there exist constants C and N such that
f(n) <= C*g(n) for all n >= N.
Here, as we can see, for some constant C = 6 or 7 or 8 and so on, beyond a specific value
n = N, the function f(n) never grows faster than C*g(n). With f(n) = 5*(n^2) + 3*n + 2 and
g(n) = n^2, the constants C = 8 and N = 2 make the Big-O definition hold: f(n) <= 8*(n^2)
for all n >= 2. Thereby g(n) is an upper bound on the rate of growth of the time complexity,
and f(n) is written here as O(n^2).
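The stated constants can be sanity-checked numerically; this small sketch just verifies f(n) <= C*g(n) over a range of n:

```python
def f(n):
    return 5 * n**2 + 3 * n + 2

C, N = 8, 2  # constants from the text
# f(n) <= C * n^2 should hold for every n >= N (checked here up to 10000)
assert all(f(n) <= C * n**2 for n in range(N, 10001))
```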
Omega Notation
The definition: f(n) = Omega(g(n)) if there exist constants C and N such that
C*g(n) <= f(n) for all n >= N.
For the above case, by taking C = 5, from N = 0 onwards C*g(n) remains 5*(n^2), while f(n) is
5*(n^2) + 3*n + 2, so C*g(n) <= f(n) always holds. Thereby g(n) is a lower bound on the rate
of growth of f(n).
Theta Notation
The definition: f(n) = Theta(g(n)) if there exist constants C1, C2 and N such that f(n) is
bounded below and above as C1*g(n) <= f(n) <= C2*g(n) for all n >= N. This is the best way
to understand the time complexity, since we have a tight range.
When analysing complexity, we focus on:
● Large input sizes
● The worst-case scenario
For a rate of growth given by a polynomial function such as n^3 + 3*(n^2) + 4*n + 2, the
lower-order terms, n^2 and n in this case, won't matter when the input size is very large.
Thus the Big-O complexity in this case is O(n^3).
This holds for non-polynomial terms as well, such as in n^2 + log(n). In this case, the Big-O
complexity is O(n^2), because log(n) grows very slowly.
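One way to see the lower-order terms vanish is to watch the ratio f(n)/n^3 as n grows; it approaches 1:

```python
def f(n):
    return n**3 + 3 * n**2 + 4 * n + 2

for n in (10, 1000, 10**6):
    print(n, f(n) / n**3)  # the ratio tends to 1 as n grows
```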
Time complexity is the cumulative sum of the execution times of each of the fragments in the
piece of code.
O(N^2) complexity
If all these operations were part of the same piece of code, the combined time complexity
would be O(N^2) + O(N) + O(1). The overall Big-O value is O(N^2), since the largest term
dominates.
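A sketch of such a piece of code, counting the unit operations in each fragment (the function name `count_ops` is made up for illustration):

```python
def count_ops(N):
    ops = 0
    for i in range(N):        # O(N^2) fragment: nested loops
        for j in range(N):
            ops += 1
    for i in range(N):        # O(N) fragment: single loop
        ops += 1
    ops += 1                  # O(1) fragment: one operation
    return ops

# count_ops(N) equals N*N + N + 1; the N^2 term dominates for large N
```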
Complexity Analysis : Recursive Programs
Factorial(N):
    if N == 0:
        return 1
    else:
        return N * Factorial(N - 1)
The time complexity in this case is given as T(N) = T(N-1) + 3 (the 3 accounts for the
comparison, the multiplication, and the return). This T(N) needs to be reduced to a closed
form. Substituting T(N-1) = T(N-2) + 3 gives T(N) = T(N-2) + 6, and this can be continued.
T(N) = T(N-1) + 3
= T(N-2) + 6
= T(N-3) + 9
Continuing down to T(0) = 1, this can be written as T(N) = 3*N + 1 (the 1 accounts for the
N == 0 base case). The time complexity is therefore O(N).
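The closed form can be sanity-checked by coding the recurrence directly, a small sketch:

```python
def T(N):
    # T(0) = 1; T(N) = T(N-1) + 3
    return 1 if N == 0 else T(N - 1) + 3

# The recurrence matches the closed form 3*N + 1
assert all(T(n) == 3 * n + 1 for n in range(200))
```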
Space Complexity
The space complexity can be understood in terms of stack memory, as the computer needs to
store the different states of the computation in a recursive program. For Factorial(5), the
stack looks something like below (most recent call on top).
F(0)
F(1)
F(2)
F(3)
F(4)
F(5)
The depth of the recursion is the number of pending calls, F(5) down to F(0), which is 6
frames (N + 1 frames for input N). The space complexity is therefore directly proportional
to N, which in Big-O form is O(N).
Suppose two programmers, A and B, each write a program to check whether a number N is prime.
A's program:
i = 2
while i < N:
    if N is divisible by i:
        N is not prime
    add 1 to i
B's program:
i = 2
while i <= square root of N:
    if N is divisible by i:
        N is not prime
    add 1 to i
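The two loops above can be sketched as runnable Python, with a division counter to make the difference concrete (the function names are made up for illustration; B's condition is written as i*i <= N, the usual safe form of "up to the square root of N"):

```python
def is_prime_a(N):
    """A's check: trial-divide by every i in [2, N)."""
    divisions = 0
    for i in range(2, N):
        divisions += 1
        if N % i == 0:
            return False, divisions
    return True, divisions

def is_prime_b(N):
    """B's check: trial-divide only while i*i <= N."""
    divisions = 0
    i = 2
    while i * i <= N:
        divisions += 1
        if N % i == 0:
            return False, divisions
        i += 1
    return True, divisions

# For the prime 97: A performs 95 divisions, B performs only 8.
```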
Now, B claims that his code is much better as it takes much less time to check if N is prime.
Example 1 :
N = 1000033 (a prime number)
Time taken by A's program = 1 ms * number of divisions
= 1 ms * 1000033
= approximately 1000 seconds, or 16.7 minutes
Time taken by B's program = 1 ms * square root of 1000033 divisions
= approximately 1 second
Example 2 :
N = 1500450271 (a prime number)
Time taken by A's program = 1 ms * number of divisions
= 1 ms * 1500450271
= approximately 1500000 seconds, or about 17.4 days
Time taken by B's program = 1 ms * square root of 1500450271 divisions
= approximately 39 seconds
As you can see, B's program is significantly faster even though both methods of solving the
problem are correct.
This is where the time complexity of programs comes in: it is a measure of how efficient
(or quick) a program is for large inputs.
In the first case, the time taken is directly proportional to N, whereas in the second case
it is directly proportional to the square root of N. In later slides, we will look at how to
formalize this into time complexity notations.