How To Find Time Complexity of An Algorithm - Stack Overflow
The Question
What do I know?

int i=0; This will be executed only once. The time is actually charged to the assignment i = 0, not to the declaration. For a simple loop over N items, say for (i = 0; i < N; i++) { Console.Write('Hello World !'); }, the initialization runs once, the loop condition is checked N + 1 times, and the increment runs N times, so the total count of basic steps is:

{1 + (N+1) + N} = 2N + 2

Ok, so these small basic calculations I think I know, but in most cases I have seen the time complexity given as O(N), O(n^2), O(log n), O(n!), and so on.
Can anyone help me understand how one calculates the time complexity of an algorithm? I am sure there are plenty of newbies like me wanting to know this.
148 Bonus for those interested: The Big O Cheat Sheet bigocheatsheet.com – msanford Jun 9 '13
at 22:12
4 Check this blog out: mohalgorithmsorbit.blogspot.com. It covers both recursive and (especially)
iterative algorithms. – Mohamed Ennahdi El Idrissi Mar 5 '15 at 0:10
1 why is Console.Write('Hello World !'); not a machine instruction? – Chetan Jul 10 '17 at 11:02
1 @Chetan If you mean that you should consider Console.Write when calculating the
complexity, that's true, but also somewhat irrelevant in this case, as that only changes a
constant factor, which big-O ignores (see the answers), so the end result is still a complexity of
O(N). – Bernhard Barker Dec 24 '17 at 17:39
1 Related: What is a plain English explanation of "Big O" notation? and Is there a system behind
the magic of algorithm analysis? – Bernhard Barker Oct 31 '18 at 11:00
411
You add up how many machine instructions it will execute as a function of the size of its input, and then simplify the expression to the largest term (the one that dominates when N is very large), dropping any simplifying constant factor.
For example, let's see how we simplify 2N + 2 machine instructions to describe this as
just O(N) .
What is the relative influence of these two terms as N becomes large? Suppose N is a
million.
Then the first term is 2 million and the second term is only 2.
For this reason, we drop all but the largest terms for large N.
This means that we don't really care if there is some constant multiple of difference in
performance when N is large. The unit of 2N is not well-defined in the first place anyway.
So we can multiply or divide by a constant factor to get to the simplest expression.
So 2N becomes just N .
57 hey thanks for letting me know "why O(2N+2) to O(N)", very nicely explained, but this was only a part of the bigger question. I wanted someone to point me to some link to a hidden resource, or in general I wanted to know how you end up with time complexities like O(N), O(n^2), O(log n), O(n!), etc. I know I may be asking a lot, but still I can try :{) – Yasser Shaikh Jun 14 '12 at 11:33
3 Well the complexity in the brackets is just how long the algorithm takes, simplified using the
method I have explained. We work out how long the algorithm takes by simply adding up the
number of machine instructions it will execute. We can simplify by only looking at the busiest
loops and dividing by constant factors as I have explained. – Andrew Tomazos Jun 14 '12 at
11:36
4 Giving an in-answer example would have helped a lot, for future readers. Just handing over a link
for which I have to signup, really doesn't help me when I just want to go through some nicely
explained text. – bad_keypoints Jan 2 '16 at 4:48
2 I would suggest watching Dr. Naveen Garg's (IIT Delhi Prof.) videos if you want to get good knowledge of DS and time complexity. Check the link: nptel.ac.in/courses/106102064 –
Rohit Bandil Oct 1 '16 at 16:45
2 (cont.) This hierarchy would have a height on the order of log N. As for O(N!) my analogies won't
likely cut it, but permutations are on that order - it's prohibitively steep, more so than any
polynomial or exponential. There are exactly 10! seconds in six weeks but the universe is less
than 20! seconds old. – John P Feb 25 '18 at 2:59
The most common metric for calculating time complexity is Big O notation. This removes
all constant factors so that the running time can be estimated in relation to N as N
approaches infinity. In general you can think of it like this:
statement;
Is constant. The running time of the statement will not change in relation to N.
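A single counted loop has this shape (an illustrative sketch in the same style, with N the input size):

for ( i = 0; i < N; i++ )
    statement;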
Is linear. The running time of the loop is directly proportional to N. When N doubles, so
does the running time.
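Two nested counted loops over the input look like this (again an illustrative sketch):

for ( i = 0; i < N; i++ )
    for ( j = 0; j < N; j++ )
        statement;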
Is quadratic. The running time of the two loops is proportional to the square of N. When N doubles, the running time increases by a factor of four.
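A loop that halves its working area each time around, for example a binary search over a sorted array, looks roughly like this (low, high, target and list are assumed names):

while ( low <= high )
{
    mid = ( low + high ) / 2;
    if ( target < list[mid] )
        high = mid - 1;        // discard the upper half
    else if ( target > list[mid] )
        low = mid + 1;         // discard the lower half
    else
        break;                 // found the target
}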
Is logarithmic. The running time of the algorithm is proportional to the number of times N
can be divided by 2. This is because the algorithm divides the working area in half with
each iteration.
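A recursive sort such as quicksort fits this pattern; here is a minimal sketch (the partition helper is written out as a simple Lomuto partition so the example is self-contained):

int partition (int list[], int left, int right)
{
    int pivot = list[right];                 // use the last element as the pivot value
    int i = left;
    for (int j = left; j < right; j++)
    {
        if (list[j] < pivot)
        {
            int tmp = list[i]; list[i] = list[j]; list[j] = tmp;
            i++;
        }
    }
    int tmp = list[i]; list[i] = list[right]; list[right] = tmp;
    return i;                                // final position of the pivot
}

void quicksort (int list[], int left, int right)
{
    if (left >= right)                       // zero or one element: already sorted
        return;
    int pivot = partition (list, left, right);
    quicksort (list, left, pivot - 1);       // sort the part before the pivot
    quicksort (list, pivot + 1, right);      // sort the part after the pivot
}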
Is N * log ( N ). The running time consists of N loops (iterative or recursive) that are
logarithmic, thus the algorithm is a combination of linear and logarithmic.
In general, doing something with every item in one dimension is linear, doing something
with every item in two dimensions is quadratic, and dividing the working area in half is
logarithmic. There are other Big O measures such as cubic, exponential, and square
root, but they're not nearly as common. Big O notation is described as O ( <type> )
where <type> is the measure. The quicksort algorithm would be described as O ( N *
log ( N ) ) .
Note that none of this has taken into account best, average, and worst case measures.
Each would have its own Big O notation. Also note that this is a VERY simplistic
explanation. Big O is the most common, but it's also more complex than I've shown.
There are also other notations such as big omega, little o, and big theta. You probably
won't encounter them outside of an algorithm analysis course. ;)
10 The quicksort algorithm in the worst case has a running time of N^2, though this behaviour is
rare. – nbro Mar 4 '15 at 8:23
2 IIRC, little o and big omega are used for best and average case complexity (with big O being
worst case), so "best, average, and worst case measures. Each would have its own Big O
notation." would be incorrect. There are even more symbols with more specific meanings, and
CS isn't always using the most appropriate symbol. I came to learn all of these by the name
Landau symbols btw. +1 anyways b/c best answer. – hiergiltdiestfu May 8 '15 at 7:43
@hiergiltdiestfu Big-O, Big-Omega, etc. can be applied to any of the best, average or worst case
running times of an algorithm. How do O and Ω relate to worst and best case? – Bernhard Barker
Dec 17 '17 at 9:34
Also, if anyone is looking for how to calculate big O for any method:
stackoverflow.com/a/60354355/4260691 – OhadM Feb 22 at 17:13
179
1. Introduction
In computer science, the time complexity of an algorithm quantifies the amount of time
taken by an algorithm to run as a function of the length of the string representing the
input.
2. Big O notation
The time complexity of an algorithm is commonly expressed using big O notation, which
excludes coefficients and lower order terms. When expressed this way, the time
complexity is said to be described asymptotically, i.e., as the input size goes to infinity.
For example, if the time required by an algorithm on all inputs of size n is at most 5n^3 + 3n, the asymptotic time complexity is O(n^3). More on that later.
1 = O(n)
n = O(n^2)
log(n) = O(n)
2n + 1 = O(n)
Examples:
Consider the following example: below, I am linearly searching for an element, and this has a time complexity of O(n).
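A minimal sketch of such a linear search (the values and names are illustrative, not necessarily the original snippet):

int find = 66;                                   // the element we are searching for
int numbers[] = { 33, 435, 36, 37, 43, 45, 66, 656, 2232 };
int count = sizeof(numbers) / sizeof(numbers[0]);
for (int i = 0; i < count; i++)                  // worst case: every element is inspected, O(n)
{
    if (numbers[i] == find)
        break;                                   // found it, stop scanning
}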
More Examples:
Recall the "twenty questions" game - the task is to guess the value of a hidden number in
an interval. Each time you make a guess, you are told whether your guess is too high or
too low. The twenty questions game implies a strategy that uses each guess to halve the interval size. This is an example of the general problem-solving method known as binary search.
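A sketch of that guessing strategy (the hidden number and the interval bounds are illustrative):

int secret = 735;                        // the hidden number we are trying to guess
int low = 1, high = 1000000;             // the interval known to contain it
while (low <= high)
{
    int guess = low + (high - low) / 2;  // always guess the middle of the interval
    if (guess > secret)
        high = guess - 1;                // "too high": keep only the lower half
    else if (guess < secret)
        low = guess + 1;                 // "too low": keep only the upper half
    else
        break;                           // guessed it
}

Each guess halves the interval, so the loop runs O(log n) times.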
Examples of O(n^2) sorting algorithms:
Bubble Sort
Selection Sort
Insertion Sort
4 O(n2) should be written O(n^2) to avoid confusion. – Rizki Hadiaturrasyid Jan 4 '19 at 7:25
104

Although there are some good answers for this question, I would like to give another answer here with several examples of loops.
O(n): The time complexity of a loop is considered O(n) if the loop variable is incremented/decremented by a constant amount. For example, loops like the following have O(n) time complexity.
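For instance, loops of the following shape are O(n), where c is some positive constant (illustrative sketches, not the original answer's exact code):

// incrementing by a constant
for (int i = 1; i <= n; i += c)
{
    // some O(1) task
}

// decrementing by a constant
for (int i = n; i > 0; i -= c)
{
    // some O(1) task
}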
O(n^c): The time complexity of nested loops is equal to the number of times the innermost statement is executed. For example, the following sample loops have O(n^2) time complexity.
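A sketch of such a nested loop, where c is a positive constant and the innermost statement runs about (n/c)^2 times:

for (int i = 1; i <= n; i += c)
{
    for (int j = 1; j <= n; j += c)
    {
        // some O(1) task
    }
}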
For example, Selection Sort and Insertion Sort have O(n^2) time complexity.
O(log n): The time complexity of a loop is considered O(log n) if the loop variable is divided/multiplied by a constant amount.
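For example, a loop of the following shape is O(log n), because the loop variable takes the values 1, c, c^2, c^3, … (c is some constant greater than 1; an illustrative sketch):

for (int i = 1; i <= n; i *= c)
{
    // some O(1) task
}

Loop bounds can also depend on each other, in which case the complexity is less obvious. Consider the following function, which is analyzed below: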
int fun(int n)
{
for (int i = 1; i <= n; i++)
{
for (int j = 1; j < n; j += i)
{
// Some O(1) task
}
}
}
Analysis:
For i = 1, the inner loop is executed n times.
For i = 2, the inner loop is executed approximately n/2 times.
For i = 3, the inner loop is executed approximately n/3 times.
…
For i = n, the inner loop is executed approximately n/n times.
So the total time complexity of the above algorithm is (n + n/2 + n/3 + … + n/n), which becomes n * (1/1 + 1/2 + 1/3 + … + 1/n).
The important thing about the series (1/1 + 1/2 + 1/3 + … + 1/n) is that it is O(log n) (it is the harmonic series, which grows like the logarithm of n). So the time complexity of the above code is O(n log n).
Ref: 1 2 3
1 @Simon, Could you please figure out which part is incorrect? – zangw Sep 19 '19 at 12:39
thanks for asking. I misread the code. I deleted my comment. Sorry! – Simon Sep 19 '19 at 18:32
1 - Basic operations (arithmetic, comparisons, accessing array elements, assignment): the running time is always constant, O(1).
Example:
read(x) // O(1)
a = 10; // O(1)
a = 1.000.000.000.000.000.000 // O(1)
2 - If-then-else statement: take only the maximum running time from the two or more possible branches.
Example:
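For instance, a pseudocode fragment along these lines (an illustrative sketch whose per-line counts match the totals computed below):

age = read(x);                            // (1+1) = 2
if age < 17 then begin                    // 1
    status = "Not allowed!";              // 1
end else begin
    status = "Welcome! Please come in";   // 1
    visitors = visitors + 1;              // 1+1 = 2
end;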
So, the complexity of the above pseudocode is T(n) = 2 + 1 + max(1, 1+2) = 6: 2 for the read and assignment, 1 for the comparison, and the maximum of the two branches. Thus its big O is still constant: T(n) = O(1).
3 - Looping (for, while, repeat): the running time for this statement is the number of loop iterations multiplied by the number of operations inside the loop.
Example:
total = 0; // 1
for i = 1 to n do begin // (1+1)*n = 2n
total = total + i; // (1+1)*n = 2n
end;
writeln(total); // 1

So the total is T(n) = 1 + 2n + 2n + 1 = 4n + 2, and the time complexity is O(n).
4 - Nested loop (a loop inside a loop): since there is at least one loop inside the main loop, the running time of this statement is O(n^2) or O(n^3).
Example:
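For instance (an illustrative sketch in the same pseudocode style):

for i = 1 to n do begin          // outer loop: n iterations
    for j = 1 to n do begin      // inner loop: n iterations for every i
        x = x + 1;               // constant work, executed n * n times
    end;
end;

So the running time grows as n * n, i.e. O(n^2).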
1. O(1) – Constant Time: Constant time means the running time is constant; it is not affected by the input size.
2. O(n) – Linear Time: When an algorithm accepts input of size n, it performs on the order of n operations as well.
3. O(log n) – Logarithmic Time: An algorithm with running time O(log n) is slightly faster than O(n). Commonly, such an algorithm divides the problem into subproblems of the same size. Example: binary search algorithm, binary conversion algorithm.
4. O(n log n) – Linearithmic Time: This running time is often found in "divide & conquer algorithms", which divide the problem into subproblems recursively and then merge the results in linear time. Example: Merge Sort algorithm.
5. O(n^2) – Quadratic Time: Look at the Bubble Sort algorithm!
7. O(2^n) – Exponential Time: It gets very slow as the input grows larger; if n = 1.000.000, T(n) would be 2^1.000.000. Brute force algorithms often have this running time.
8. O(n!) – Factorial Time: THE SLOWEST!!! Example: the Travelling Salesman Problem (TSP).
Taken from this article. Very well explained, worth a read.
In your 2nd example, you wrote visitors = visitors + 1 is 1 + 1 = 2 . Could you please
explain to me why you did that? – Sajib Acharya Dec 31 '15 at 9:09
3 @Sajib Acharya Look it from right to left. First step: calculate visitors + 1 Second step: assign
value from first step to visitors So, above expression is formed of two statements; first step +
second step => 1+1=2 – Bozidar Sikanjic Jan 12 '16 at 9:46
@BozidarSikanjic Why is it 1+1 in age = read(x) // (1+1) = 2 ? – Humty Mar 13 '17 at 19:34
41

When you're analyzing code, you have to analyse it line by line, counting every operation and recognizing the time complexity; in the end, you have to sum it all up to get the whole picture.
For example, you can have one simple loop with linear complexity, but later in that same
program you can have a triple loop that has cubic complexity, so your program will have
cubic complexity. Function order of growth comes into play right here.
Let's look at the possibilities for the time complexity of an algorithm; you can see the orders of growth I mentioned above:

Linear, order of growth N , a single loop that does constant work per iteration:
int p = 0;
for (int i = 1; i < N; i++)
p = p + 2;
Cubic, order of growth N^3 , a classic example is a triple loop where you check all triplets:
int x = 0;
for (int i = 0; i < N; i++)
    for (int j = 0; j < N; j++)
        for (int k = 0; k < N; k++)
            x = x + 2;
Exponential, order of growth 2^N , usually occurs when you do an exhaustive search, for example checking all subsets of some set.
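A sketch of such an exhaustive search over all subsets (N is assumed small enough to fit an int bitmask; counting the inner scan this is about N * 2^N work, which is still exponential):

for (int mask = 0; mask < (1 << N); mask++)   // one bitmask per subset: 2^N of them
{
    for (int i = 0; i < N; i++)
    {
        if (mask & (1 << i))
        {
            // element i belongs to the current subset
        }
    }
}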
If this was the case, what would be the complexity? for (int i = 0; i < N; i++) for (int j = i+1; j < N;
j++) for (int k = j+1; k < N; k++) x = x + 2 – user3156040 Jan 24 at 0:35
O(N)
When you arrive at the party, you have to shake everyone's hand (do an operation on
every item). As the number of attendees N increases, the time/work it will take you to
shake everyone's hand increases as O(N) .
There's variation in the amount of time it takes to shake hands with people. You could
average this out and capture it in a constant c . But the fundamental operation here ---
shaking hands with everyone --- would always be proportional to O(N) , no matter what c
was. When debating whether we should go to a cocktail party, we're often more
interested in the fact that we'll have to meet everyone than in the minute details of what
those meetings look like.
O(N^2)
The host of the cocktail party wants you to play a silly game where everyone meets
everyone else. Therefore, you must meet N-1 other people and, because the next
person has already met you, they must meet N-2 people, and so on. The sum of this series is N^2/2 - N/2. As the number of attendees grows, the N^2 term gets big fast, so we just drop everything else.
O(N^3)
You have to meet everyone else and, during each meeting, you must talk about everyone
else in the room.
O(1)
The host wants to announce something. They ding a wineglass and speak loudly.
Everyone hears them. It turns out it doesn't matter how many attendees there are, this
operation always takes the same amount of time.
O(log N)
The host has laid everyone out at the table in alphabetical order. Where is Dan? You
reason that he must be somewhere between Adam and Mandy (certainly not between
Mandy and Zach!). Given that, is he between George and Mandy? No. He must be
between Adam and Fred, and between Cindy and Fred. And so on... we can efficiently
locate Dan by looking at half the set and then half of that set. Ultimately, we look at
O(log_2 N) individuals.
O(N log N)
You could find where to sit down at the table using the algorithm above. If a large number
of people came to the table, one at a time, and all did this, that would take O(N log N)
time. This turns out to be how long it takes to sort any collection of items when they must
be compared.
Best/Worst Case
You arrive at the party and need to find Inigo - how long will it take? It depends on when
you arrive. If everyone is milling around you've hit the worst-case: it will take O(N) time.
However, if everyone is sitting down at the table, it will take only O(log N) time. Or
maybe you can leverage the host's wineglass-shouting power and it will take only O(1)
time.
Assuming the host is unavailable, we can say that the Inigo-finding algorithm has a
lower-bound of O(log N) and an upper-bound of O(N) , depending on the state of the
party when you arrive.
The same ideas can be applied to understanding how algorithms use space or
communication.
Knuth has written a nice paper about the former entitled "The Complexity of Songs".
PROOF: (due to Casey and the Sunshine Band). Consider the songs Sk defined
by (15), but with
for all k.
You nailed it. Now whenever I go to a cocktail party I will subconsciously try to find the time complexity of any fun events. Thanks for such a humorous example. – Sabunkar Tejas Sahailesh Apr 4 at 1:54
5

I know this question goes way back and there are some excellent answers here; nonetheless, I wanted to share another bit for the mathematically-minded people who will stumble upon this post. The Master theorem is another useful thing to know when studying complexity. I didn't see it mentioned in the other answers.
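As a rough reminder, in its most common textbook form the Master theorem handles recurrences of the shape

T(n) = a * T(n/b) + f(n),  with a >= 1 and b > 1,

by comparing f(n) with n^(log_b a):

if f(n) grows polynomially slower than n^(log_b a), then T(n) = Θ(n^(log_b a));
if f(n) = Θ(n^(log_b a) * (log n)^k) for some k >= 0, then T(n) = Θ(n^(log_b a) * (log n)^(k+1));
if f(n) grows polynomially faster than n^(log_b a) (and a regularity condition holds), then T(n) = Θ(f(n)).

For example, merge sort's recurrence T(n) = 2T(n/2) + Θ(n) falls into the middle case with k = 0, giving Θ(n log n).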
2

O(n) is big O notation used for writing the time complexity of an algorithm. When you add up the number of executions in an algorithm you'll get an expression such as 2N+2; in this expression, N is the dominating term (the term having the largest effect on the expression if its value increases or decreases). Now O(N) is the time complexity, while N is the dominating term.

Example
For i = 1 to n
    j = 0;
    while (j <= i)
        j = j + 1;
Here, on pass i of the outer loop, the statement j = j + 1 in the inner loop executes i + 1 times, so the total number of executions for the whole algorithm is (1+1) + (2+1) + … + (n+1) = n(n+1)/2 + n = (n^2 + 3n)/2. Here n^2 is the dominating term, so the time complexity of this algorithm is O(n^2).