
O Notation (Big Oh)

We want to give an upper bound on the amount of time it takes to solve a problem.

defn: v(n) = O(f(n)) ↔ ∃ constants c and n_0 such that |v(n)| ≤ c|f(n)| whenever n > n_0.
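
For example, 3n^2 + 5n = O(n^2): taking c = 4 and n_0 = 5 gives 3n^2 + 5n ≤ 3n^2 + n·n = 4n^2 whenever n > 5.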

Termed complexity: this has nothing to do with the difficulty of coding or understanding an algorithm, just the time it takes to execute.

An important tool for analyzing and describing the behavior of algorithms.

Is an n^2 algorithm always better than an n^3 algorithm? No, it depends on the specific constants, but if n is large enough, an O(n^3) algorithm is always slower.

Complexity Classes: O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n)

For small n, the constant c may dominate.

Intractable: all known algorithms are of exponential complexity.

Measuring Complexity
1. Additive Property: If two statements follow one another, the complexity of the two of them is
the larger complexity.

   O(m) + O(n) = O(max(m, n))

2. If/Else: For the fragment

   if cond then S1
   else S2

   the complexity is the running time of the cond plus the larger of the complexities of S1 and S2.

3. Multiplicative Property: For loops: the complexity of a for loop is at most the complexity of the
statements inside the for loop times the number of iterations. However, if each iteration is
different in length, it is more accurate to sum the individual lengths rather than just multiply.
(All three rules are combined in the sketch below.)
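
A minimal Python sketch (our own; the function and its inputs are made up for illustration) combining the three rules:

def process(a, b, flag):              # len(a) = m, len(b) = n
    total = 0
    for x in a:                       # first loop: O(m)
        total += x
    for y in b:                       # second loop: O(n); the sequence is O(max(m, n))
        total += y
    if flag:                          # if/else: cost of the test plus the larger branch
        for x in a:
            for y in b:               # nested loops: O(mn), which dominates
                total += x * y
    else:
        total += 1                    # O(1) branch
    return total                      # overall: O(mn)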

Nested For Loops


Analyze nested for loops inside out. The total complexity of a statement inside a group of nested for
loops is the complexity of the statement multiplied by the product of the size of each for loop.

Example 1

for i = 1, m
    for j = 1, n
        x++

The complexity is O(mn).

Consider a pictorial view of the work. Let each row represent the work done in an iteration. The area
of the figure will then represent the total work.

Example 2

for (beg = 1; beg < m; beg++)
    for (j = beg; j < m; j++)
        x++;

In this example, our complexity picture is triangular: iteration beg does m - beg units of work, so the total is (m-1) + (m-2) + ... + 1 = m(m-1)/2 = O(m^2).
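
A quick Python spot-check of the triangular count (m chosen arbitrarily):

m = 10
count = 0
for beg in range(1, m):
    for j in range(beg, m):
        count += 1                  # the work: x++
assert count == m * (m - 1) // 2    # triangular total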

Recursive Algorithms
Finding the complexity of recursive algorithms requires additional techniques. There are three
general techniques: the pictorial approach, the formulaic approach, and the mathematical
approach. You should be able to use any two of them.

Example 3
procedure B(N)
    for i = 1, N
        x = x + i
    Call B(N/2)
    Call B(N/2)
end procedure

If we let T(n) represent the time to solve this problem, the time is represented recursively as
T(n) = 2T(n/2) + n. This is called a recurrence relation.
In our pictorial view, we let each row represent a layer of recursion (the first call is the first row, the
two calls at the second level comprise the second row, the four third-level calls make up the third
row). Each row represents the work of the calls themselves, ignoring the costs incurred by the
recursive calls they make. In other words, to determine the size of the first row, measure how much
work is done in that call, not counting the recursive calls it makes.
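
Summing the rows for T(n) = 2T(n/2) + n: the first row does n work, the second does 2(n/2) = n, the third does 4(n/4) = n, and so on. There are about log_2 n rows of n work each, so the total is O(n log n).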

Example 4
procedure B(N)
    for i = 1, N
        x = x + i
    Call B(N/2)
end procedure

If we let T(n) represent the time to solve this problem, the time is represented recursively as
T(n) = T(n/2) + n.
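
Summing the rows here gives n + n/2 + n/4 + ... + 1 ≤ 2n, so the complexity is O(n).
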
Example 5
procedure B(N)
    x = x + 1
    Call B(N/2)
end procedure

If we let T(n) represent the time to solve this problem, the time is represented recursively as
T(n) = T(n/2) + 1.
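
Each row now does constant work, and there are about log_2 n rows, so the complexity is O(log n).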

Example 6
procedure B(N)
    x = x + 1
    Call B(N/2)
    Call B(N/2)
end procedure

If we let T(n) represent the time to solve this problem, the time is represented recursively as
T(n) = 2T(n/2) + 1.
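
The rows now do 1, 2, 4, ... units of work; summing this geometric series over the roughly log_2 n rows gives about 2n, so the complexity is O(n).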

A Formula Approach
Mathematicians have developed a formula approach to determining complexity.

Theorem: If

T(n) = aT(n/b) + O(n^k)     (a ≥ 1, b > 1)

then:

if a > b^k, the complexity is O(n^(log_b a))

if a = b^k, the complexity is O(n^k log n)

if a < b^k, the complexity is O(n^k)

In this theorem:

• a represents the number of recursive calls you make

• b represents the number of pieces you divide the problem into

• k represents the power on n: n^k is the work you do in breaking the problem into pieces or
putting them back together again. In other words, n^k represents the work you do
independent of the recursive calls you make. Note this corresponds to one row in our pictorial
representation.
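
As a sanity check, here is a minimal Python sketch (our own; the helper name master_case is hypothetical) that mechanizes the three cases:

import math

def master_case(a: int, b: int, k: int) -> str:
    """Classify T(n) = a*T(n/b) + O(n^k) by the theorem above."""
    if a > b ** k:
        return f"O(n^{math.log(a, b):g})"      # O(n^(log_b a))
    if a == b ** k:
        return "O(log n)" if k == 0 else f"O(n^{k} log n)"
    return f"O(n^{k})"

print(master_case(2, 2, 1))   # Example 3: O(n^1 log n), i.e., O(n log n)
print(master_case(1, 2, 1))   # Example 4: O(n^1) = O(n)
print(master_case(1, 2, 0))   # Example 5: O(log n)
print(master_case(2, 2, 0))   # Example 6: O(n^1) = O(n)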

Let's use the theorem to revisit the same problems we solved pictorially.

Example 3
procedure B(N)
    for i = 1, N
        x = x + i
    Call B(N/2)
    Call B(N/2)
end procedure

a = 2, b = 2, k = 1 (as the work is n)

Since a = b^k, we are in the "equals" case:

the complexity is O(n^k log n) = O(n log n), which is exactly what our pictures told us.

Example 4
procedure B(N)
    for i = 1, N
        x = x + i
    Call B(N/2)
end procedure

a = 1, b = 2, k = 1

Since a < b^k, we are in the "less than" case:

the complexity is O(n^k) = O(n), which is exactly what our pictures told us.

Example 5
procedure B(N)
    x = x + 1
    Call B(N/2)
end procedure
a = 1, b = 2, k = 0 (as the work is 1, i.e., n^0)

Since a = b^k (1 = 2^0), we are in the "equals" case:

the complexity is O(n^k log n) = O(n^0 log n) = O(log n), which is exactly what our pictures told us.

Example 6
procedure B(N)
    x = x + 1
    Call B(N/2)
    Call B(N/2)
end procedure

a = 2, b = 2, k = 0

Since a > b^k, we are in the "greater than" case:

the complexity is O(n^(log_b a)) = O(n^(log_2 2)) = O(n^1) = O(n), which is exactly what our pictures told us.

Recurrence Relations
In analyzing algorithms, we often encounter progressions. Most calculus books list the formulas below,
but you should be able to derive them.

Arithmetic Progression
Example:

S = 1 + 2 + 3 + ... + n

Writing the same sum backwards:

S = n + (n-1) + (n-2) + ... + 1

If we add the two S's together, each of the n columns sums to n + 1:

2S = n(n + 1), so S = n(n + 1)/2

Geometric Progression
Example:

S = 1 + 2 + 4 + ... + 2^n

Multiplying both sides by the base:

2S = 2 + 4 + 8 + ... + 2^(n+1)

Subtracting S from 2S we get

S = 2^(n+1) - 1
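
Both formulas are easy to spot-check in Python:

n = 20
assert sum(range(1, n + 1)) == n * (n + 1) // 2               # arithmetic
assert sum(2 ** i for i in range(n + 1)) == 2 ** (n + 1) - 1  # geometric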


Mathematically Solving Recurrence Relations
While the math types may like this approach, it usually confuses the non-math types more than it helps.

Example 7

Suppose our recurrence relation is:

T(n) = 2T(n/2) + 2

and

T(2) = 1

We use the telescoping technique (the "plug and chug" technique). (You may have learned something you
like better in your discrete math classes.) We set up a series of equations. Since we would never stop
for infinite n, let n = 32 to see the pattern, and generalize for any n.

T(32) = 2T(16) + 2
T(16) = 2T(8) + 2
T(8) = 2T(4) + 2
T(4) = 2T(2) + 2
T(2) = 1

Now by substituting each equation into the previous equation we have

T(32) = 2T(16) + 2
      = 4T(8) + 2^2 + 2
      = 8T(4) + 2^3 + 2^2 + 2
      = 16T(2) + 2^4 + 2^3 + 2^2 + 2

Notice the upper limit on the summation is related to the size of n. Since 2^4 = n/2 in this case, we
conjecture the upper bound (call it b) will always be determined by 2^b = n/2, i.e. b = log_2 n - 1. We
would actually have to consider a few more cases to convince ourselves this is the actual relationship.
Notice, however, that we expect to perform the recursion a logarithmic number of times as we halved
the problem size each time.

Using our geometric sum technique,

2 + 2^2 + ... + 2^b = 2^(b+1) - 2 = 2^(log_2 n) - 2 = n - 2

Thus,

T(n) = (n/2)T(2) + n - 2 = n/2 + n - 2 = 3n/2 - 2

It would be nice to be able to check our work. For the case of n = 32, we got

T(32) = 16T(2) + 30 = 16 + 30 = 46

Since 3(32)/2 - 2 = 46, we are reasonably certain we did the computations correctly.

Telescoping, again: If that was confusing, you could also have done the same thing as follows:

T(n) = 2T(n/2) + 2
     = 4T(n/4) + 2^2 + 2
     = 8T(n/8) + 2^3 + 2^2 + 2
     = 2^j T(n/2^j) + 2^(j+1) - 2

But since n = 32, the recursion bottoms out when n/2^j = 2, that is, when 2^j = n/2 = 16:

T(32) = 16T(2) + 2^5 - 2 = 16 + 30 = 46

Thus, with 2^j = n/2 in general,

T(n) = (n/2)T(2) + n - 2 = 3n/2 - 2
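
A short Python check of the closed form (a sketch; T mirrors the recurrence above):

def T(n):
    return 1 if n == 2 else 2 * T(n // 2) + 2

for n in [2, 4, 8, 16, 32]:       # powers of two
    assert T(n) == 3 * n // 2 - 2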

Example 8

Suppose the relation was

T(n) = 2T(n/2) + n

and

T(1) = 1

This is a common form. You break a problem of size n into two equal-sized pieces (n/2) and have to
solve each piece (as long as n/2 is bigger than our base case). Breaking the problem apart and
accumulating the results takes time proportional to n.
This technique won't always work, but for the type of recurrence relations we see in algorithm analysis,
it is often useful.

Divide each side of the recurrence relation by n. The choice will become clearer as we proceed.

T(n)/n = T(n/2)/(n/2) + 1

This equation is valid for any power of two, so:

T(n/2)/(n/2) = T(n/4)/(n/4) + 1
T(n/4)/(n/4) = T(n/8)/(n/8) + 1
...

and

T(2)/2 = T(1)/1 + 1

Now add up all of the equations. This means we add up all the terms on the left hand side and set the
result to the sum of all the terms on the right hand side. Observe, virtually all terms exist on both sides
of the equation and may be cancelled. After everything is cancelled, we get

T(n)/n = T(1)/1 + log_2 n

since there are log_2 n equations, each contributing a 1 on the right. Multiplying through by n,

T(n) = n log_2 n + n = O(n log n)
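
Again, a quick Python spot-check (powers of two only; base case T(1) = 1 as above):

import math

def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

for n in [2, 4, 8, 16, 32, 64]:
    assert T(n) == n * math.log2(n) + n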

Example 9

Consider a sorting algorithm in which you repeatedly search the unsorted portion of an array for the
smallest element. The smallest element is swapped with the first element in the unsorted portion, and
the lower limit of the unsorted portion is incremented. If we have n elements, it takes n-1 compares to
find the smallest, 1 swap, and then sort the remaining n-1 elements.

If there are only two elements, there is a compare and at most 1 swap, so T(2) = 2. In general the
recurrence is T(n) = T(n-1) + n (the n-1 compares plus one swap).

To see the pattern, let's assume n = 6:

T(6) = T(5) + 6
T(5) = T(4) + 5
T(4) = T(3) + 4
T(3) = T(2) + 3
T(2) = 2

Backwards substitution yields:

T(6) = 6 + 5 + 4 + 3 + 2 = 20

In general,

T(n) = n + (n-1) + ... + 3 + 2 = n(n+1)/2 - 1 = O(n^2)
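
A minimal selection-sort sketch in Python (our own illustration, not the notes' code) that counts compares and swaps and matches the closed form:

def selection_sort(a):
    ops = 0
    for i in range(len(a) - 1):
        smallest = i
        for j in range(i + 1, len(a)):
            ops += 1                          # one compare
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
        ops += 1                              # one swap
    return ops

n = 6
assert selection_sort(list(range(n, 0, -1))) == n * (n + 1) // 2 - 1   # 20 when n = 6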
