Big O
CS 260 1
Sums - review
• Some quick identities (just like integrals):
  ∑_i c·f(i) = c · ∑_i f(i), where c is constant
  ∑_i ( f(i) + g(i) ) = ∑_i f(i) + ∑_i g(i)
• Not like the integral (reversing the limits does not negate the sum):
  ∑_{i=b}^{a} f(i) = ∑_{i=a}^{b} f(i)
Closed form for simple sums
• A couple easy ones you really should stick in your head:
  ∑_{i=a}^{b} c = c·(b − a + 1)
  ∑_{i=1}^{n} i = n(n+1)/2
Example (motivation), sect 1.2
Eiffel Tower, alg. 1
• 1st attempt:
– Jack takes the paper and pencil, walks stairs
– Makes mark for each step
– Returns, shows Jill
• Runtime:
– # steps Jack took: 2n
– # marks: n
– So,
T1(n) = 2n + n = 3n
Eiffel Tower, alg. 2
• 2nd plan:
– Jill doesn’t trust Jack, keeps paper and pencil
– Jack descends first step
– Marks step w/hat
– Returns to Jill, tells her to make a mark
– Finds hat, moves it down a step
– etc.
Eiffel Tower, alg. 2 (cont.)
• Analysis:
– # marks = n
– # steps = 2·( 1 + 2 + 3 + … + n )
          = 2 · n(n+1)/2
          = n² + n
– So, adding in the n marks,
  T2(n) = n² + 2n
Eiffel Tower, alg. 3
• 3rd try:
– Jack & Jill see their friend Steve on the
ground
– Steve points to a sign
– Jill copies the number down
• Runtime:
– # steps Jack took: 0
– # marks: ≈ log10( n ) (one per digit of the number on the sign)
– So,
T3(n) = log10( n )
Comparison of algorithms
• [Plot] Cost of each algorithm, T(n), vs. the number of inputs, n (the number of stairs).
Wrap-up
• T1(n) is a line
– So, if we double the # of stairs, the runtime
doubles
• T2(n) is a parabola
– So, if we double the # of stairs, the runtime
quadruples (roughly)
• T3(n) is a logarithm
– We’d have to multiply the number of stairs by
a factor of 10 to increase T by 1 (roughly)
– Very nice function
Some “what ifs”
• Suppose 3 stairs (or 3 marks) can be made per 1
second
• (There are really 2689 steps)
[Table: runtimes of the three algorithms at n = 2689, at 3 steps or marks per second]
More observations
• While the relative times for a given n are a
little sobering, what is of larger importance
(when evaluating algorithms) is how each
function grows
– T3 apparently doesn’t grow, or grows
very slowly
– T1 grows linearly
– T2 grows quadratically
Asymptotic behavior
• When evaluating algorithms (from a
design point of view) we don’t want to
concern ourselves with lower-level details:
– processor speed
– the presence of a floating-point coprocessor
– the phase of the moon
• We are concerned simply with how a
function grows as n grows arbitrarily large
• I.e., we are interested in its asymptotic
behavior
Asymptotic behavior (cont.)
• As n gets large, the function is dominated
more and more by its highest-order term
(so we don’t really need to consider lower-
order terms)
• The coefficient of the leading term is not
really of interest either. A line w/a steep
slope will eventually be overtaken by even
the laziest of parabolas (concave up).
That coefficient is just a matter of scale.
Irrelevant as n→∞
Big-O, Θ
• A formal way to present the ideas of the
previous slide:
T(n) = O( f(n) ) iff there exist positive constants
k, n0 such that:
k·f(n) > T(n)
for all n > n0
• In other words, T(n) is bounded above by f(n). I.e., k·f(n) gets on top of T(n) at some point, and stays there as n→∞
• So, T(n) grows no faster than f(n)
Θ
• Further, if f(n) = O( T(n) ), then T(n) grows no slower than f(n)
• We can then say that T(n) = Θ( f(n) )
• I.e., T(n) can be bounded both above and below with f(n)
Setup
• First we have to decide what it is we’re
counting, what might vary over the
algorithm, and what actions are constant
• Consider:
for( i=0; i<5; ++i )
++cnt;
– i=0 happens exactly once
– ++i happens 5 times
– i<5 happens 6 times
– ++cnt happens 5 times
• If i and cnt are ints, then assignment, addition, and comparison are constant (exactly 32 bits are examined)
• So, i=0, i<5, and ++i each take some constant time (though probably different constants)
• We may, for purposes of asymptotic
analysis, ignore the overhead:
–the single i=0
–the extra i<5
and consider the cost of executing the loop a
single time
Setup (cont.)
• So, the total cost of executing that loop
can be given by:
T(n) = ∑_{i=0}^{4} c = 5c
• Constant time
• Makes sense. Loop runs a set number of
times, and each loop has a constant cost
Eg 2
• Consider this loop:
for( i=0; i<n; ++i )
++cnt;
• Almost just like last one:
T(n) = ∑_{i=0}^{n−1} c = cn
Eg 2 (cont.)
• Now, T(n) is linear in n
• T(n) = O(n)
• This means we can multiply n by some constant k to get bigger than T(n):
  kn > cn — let k = 2c:
  2cn > cn
• This isn’t true everywhere. We want to
know that it becomes true somewhere
and stays true as n→∞
Eg. 2 (cont.)
• Solve the inequality:
2cn > cn
cn > 0
n > 0 (c is strictly positive, so we can divide it out)
• cn gets above T(n) at
n=0 and stays there
as n gets large
• So, T(n) grows no
faster than a line
Eg. 3
• Let’s make the previous example a little
more interesting:
• Say T(n) = 2cn + 13c
• T(n) = O(n)
• So, find some k such that
kn > 2cn + 13c (past some n0)
• Let k = 3c ⇒
  3cn > 2cn + 13c
Eg. 3 (cont.)
• Find values of n for which the inequality is true:
3cn > 2cn + 13c
cn > 13c
n > 13
• 3cn gets above T(n)
at n=13, and stays
there.
• T(n) grows no faster
than a line
Eg 4
• Nested loops:
for( i=0; i<n; ++i )
for( j=1; j<=n; ++j )
++cnt;
• Runtime given by:
T(n) = ∑_{i=0}^{n−1} ( ∑_{j=1}^{n} c ) = ∑_{i=0}^{n−1} cn = cn²
Eg. 4 (cont.)
• Claim: T(n) = O(n²)
⇒ there exists a constant k such that
  kn² > cn² — let k = 2c:
  2cn² > cn²
• Where is this true?
  cn² > 0
  n² > 0
  n > 0
Eg. 5
• Let’s say
  T(n) = 2cn² + 2cn + 3c
• Claim: T(n) = O(n²)
• We just need to choose a k larger than the coefficient of the leading term.
• Let k = 3c
⇒ 3cn² > 2cn² + 2cn + 3c
• Where? (Gather like terms, move everything to one side)
  cn² − 2cn − 3c > 0
Eg. 5 (cont.)
• This one could be solved directly, but it is usually easier to find the roots of the parabola on the left (which are where our original two parabolas intersect): cn² − 2cn − 3c = c(n + 1)(n − 3)
• So the roots are n = −1 and n = 3
• So, plug something like n=4 into our original inequality. Is it true?
• Then it’s true for all n > 3 (since we know the parabolas don’t intersect again)
Eg. 5 (cont.)
• 3cn² gets above T(n) at n=3 and stays there
• T(n) grows no faster than a parabola
• T(n) can be bounded above by a parabola