Algorithm Complexity L3
Recursion
A subroutine which calls itself, with different parameters.
Need to evaluate factorial(n)
factorial(n) = n · (n-1) · ... · 2 · 1
             = n · factorial(n-1)
Suppose routine factorial(p) can find the factorial of p for all p ≤ m.
Then factorial(m+1) can be computed as follows:
factorial(m+1) = (m+1) · factorial(m)
Anything missing?
int Factorial(int m)
{
    if (m == 1) return 1;            /* base case: no recursive call needed */
    return m * Factorial(m - 1);     /* general case: recurse towards the base case */
}
Rules:
Have a base case for which the subroutine need not call itself.
For the general case, the subroutine does some operations, calls itself, gets the result, and does some operations with that result.
To get the result, the subroutine should progressively move towards the base case.
Towers of Hanoi
Source peg, destination peg, auxiliary peg.
k disks sit on the source peg; a bigger disk can never be on top of a smaller disk.
Need to move all k disks to the destination peg, using the auxiliary peg, without ever placing a bigger disk on a smaller disk.
Why can this be solved by recursion? See the sketch below.
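Because the problem reduces to a smaller copy of itself: to move k disks, first move the top k-1 disks to the auxiliary peg, move the largest disk directly, then move the k-1 disks onto it; k = 1 is the base case. A minimal C sketch, assuming pegs are identified by single characters (the function name Hanoi and the peg labels are illustrative, not from the slides):

#include <stdio.h>

/* Move k disks from peg src to peg dst, using peg aux as scratch space. */
void Hanoi(int k, char src, char dst, char aux)
{
    if (k == 1) {                        /* base case: a single disk moves directly */
        printf("Move disk 1 from %c to %c\n", src, dst);
        return;
    }
    Hanoi(k - 1, src, aux, dst);         /* park the top k-1 disks on the auxiliary peg */
    printf("Move disk %d from %c to %c\n", k, src, dst);
    Hanoi(k - 1, aux, dst, src);         /* bring the k-1 disks onto the largest disk */
}

Calling Hanoi(k, 'S', 'D', 'A') prints the full sequence of moves; the recursion makes 2^k - 1 moves in total.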
Efficient Algorithms
A city has n view points.
Buses ply from one view point to another.
A bus driver wishes to follow the shortest path (in terms of travel time).
Every view point is connected to every other by a road.
However, some roads are less congested than others.
Also, roads are one-way, i.e., the road from view point 1 to 2 is different from the road from view point 2 to 1.
The number of possible routes grows roughly like (n/e)^n, so brute-force enumeration of all routes is infeasible even for moderate n.
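The (n/e)^n figure comes from Stirling's approximation of the factorial (a standard bound, not derived on the slide):

% Stirling: n! grows like (n/e)^n up to lower-order factors.
n! \;\sim\; \sqrt{2\pi n}\,\left(\frac{n}{e}\right)^{n},
\qquad n! \;\ge\; \left(\frac{n}{e}\right)^{n} \ \text{for all } n \ge 1 .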
What is the efficiency of an algorithm?
V ← a
W ← b
While W > 1
    V ← V + a; W ← W - 1
Output V
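This loop computes the product a·b by repeated addition (assuming b ≥ 1), performing b - 1 additions. A small C sketch with an explicit step counter added for illustration (the counter is not part of the slide's pseudocode):

#include <stdio.h>

/* Multiply a by b using repeated addition, counting loop iterations. */
long multiply_by_addition(long a, long b, long *steps)
{
    long V = a;
    long W = b;
    *steps = 0;
    while (W > 1) {
        V = V + a;        /* one basic step: an addition */
        W = W - 1;
        (*steps)++;
    }
    return V;             /* V now equals a * b (for b >= 1) */
}

int main(void)
{
    long steps;
    long product = multiply_by_addition(7, 1000, &steps);
    printf("7 * 1000 = %ld computed in %ld iterations\n", product, steps);
    return 0;
}

The number of iterations grows linearly in b, so the running time of this method is O(b).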
Order of Increase
We worry about the speed of our algorithms for large input sizes.
Note that for large n, log(n)/n and n/exp(n) are very small.
However, n/(2n) is the constant 1/2 for all n.
[Plot: log n, n, and exp(n) plotted against n]
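To see these ratios numerically, here is a short C sketch (the sample values of n are chosen here purely for illustration):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* How log(n)/n, n/exp(n), and n/(2n) behave as n grows. */
    double ns[] = {10.0, 100.0, 500.0};
    for (int i = 0; i < 3; i++) {
        double n = ns[i];
        printf("n = %-6.0f log(n)/n = %-10.4g n/exp(n) = %-10.4g n/(2n) = %g\n",
               n, log(n) / n, n / exp(n), n / (2.0 * n));
    }
    return 0;
}

Compile with -lm; the first two ratios shrink rapidly while n/(2n) stays at 0.5.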
Function Orders
Informally, a function f(n) is O(g(n)) if the growth of f(n) is not faster than that of g(n).
Formally, f(n) is O(g(n)) if there exist a number n0 and a nonnegative constant c such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.
If lim(n→∞) f(n)/g(n) exists and is finite, then f(n) is O(g(n)).
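As a concrete instance of the formal definition, with witness constants chosen here for illustration:

% 3n + 5 is O(n): take c = 4 and n0 = 5.
0 \;\le\; 3n + 5 \;\le\; 3n + n \;=\; 4n \quad \text{for all } n \ge 5 .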
Example Functions
sqrt(n), n, 2n, ln n, exp(n), n + sqrt(n), n + n²
lim(n→∞) sqrt(n)/n = 0, so sqrt(n) is O(n), but n is not O(sqrt(n)).
n is O(2n), and 2n is O(n).
lim(n→∞) ln(n)/n = 0, so ln(n) is O(n), but n is not O(ln(n)).
lim(n→∞) n/exp(n) = 0, so n is O(exp(n)).
lim(n→∞) (n + sqrt(n))/n = 1, so n + sqrt(n) is O(n).
lim(n→∞) n/(sqrt(n) + n) = 1, so n is O(n + sqrt(n)).
n + n² is not O(n), but n is O(n + n²).
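The negative claims hold because the corresponding ratio diverges, so no constant c can work; for example (a standard argument, not spelled out on the slide):

% n is not O(sqrt(n)): the ratio n/sqrt(n) is unbounded.
\frac{n}{\sqrt{n}} = \sqrt{n} \to \infty \quad (n \to \infty),
\quad\text{so there is no } c \text{ with } n \le c\,\sqrt{n} \text{ for all } n \ge n_0 .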
Implication of O notation
Suppose we know that our algorithm uses at most O(f(n)) basic steps on any input of size n. Then, for all sufficiently large n, the algorithm terminates after executing at most a constant times f(n) basic steps.
We know that a basic step takes a constant amount of time on a machine.
Hence, our algorithm terminates within a constant times f(n) units of time, for all large n.
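In symbols, with c1 and n0 the constants from the O bound and t_step the machine-dependent time of one basic step (the symbol names are chosen here, not taken from the slides):

% Total running time is at most a constant times f(n) for large n.
T(n) \;\le\; c_1\, f(n)\cdot t_{\mathrm{step}} \;=\; c_2\, f(n)
\quad\text{for all } n \ge n_0, \qquad c_2 = c_1\, t_{\mathrm{step}} .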
Example Functions
sqrt(n), n, 2n, ln n, exp(n), n + sqrt(n), n + n²
lim(n→∞) sqrt(n)/n = 0, so sqrt(n) is o(n) and n is ω(sqrt(n)).
n is Θ(2n) and Ω(2n); 2n is Θ(n) and Ω(n).
lim(n→∞) ln(n)/n = 0, so ln(n) is o(n) and n is ω(ln(n)).
lim(n→∞) n/exp(n) = 0, so n is o(exp(n)) and exp(n) is ω(n).
lim(n→∞) (n + sqrt(n))/n = 1, so n + sqrt(n) is Θ(n) and Ω(n).
lim(n→∞) n/(sqrt(n) + n) = 1, so n is Θ(n + sqrt(n)) and Ω(n + sqrt(n)).
n + n² is ω(n), and n is o(n + n²).
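For reference, the limit criteria behind these classifications (sufficient conditions when the limit exists; standard definitions, not stated explicitly on the slides):

\lim_{n\to\infty}\frac{f(n)}{g(n)} = 0 \;\Rightarrow\; f(n)\in o(g(n)),\ g(n)\in\omega(f(n)) \\
\lim_{n\to\infty}\frac{f(n)}{g(n)} = c,\ 0 < c < \infty \;\Rightarrow\; f(n)\in\Theta(g(n)) \\
\lim_{n\to\infty}\frac{f(n)}{g(n)} = \infty \;\Rightarrow\; f(n)\in\omega(g(n))\subseteq\Omega(g(n))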
Complexity of a Problem vs. Algorithm
Saying that a problem is O(f(n)) means that there is some O(f(n)) algorithm that solves it.
Reading Assignment
Section 1.3
Section 2.1