Unit-1 Introduction
Chapter-I: Introduction
Text Books:
1. Ellis Horowitz, Sartaj Sahni, Sanguthevar Rajasekaran, "Fundamentals of Computer Algorithms", 2nd Edition, University Press.
2. Thomas H. Cormen, "Introduction to Algorithms", PHI Learning.
UNIT I: Introduction to Problem Solving Concepts
1. What is an Algorithm?
2. Algorithm specification
3. Performance analysis
4. Performance Measurement (Time and Space
Complexity)
5. Amortized complexity
6. Asymptotic notation
7. Practical Complexities
What is an Algorithm?
An algorithm is a finite sequence of unambiguous instructions that, when followed, accomplishes a particular task. Many everyday tasks can be posed as algorithmic problems, for example:
• Understand speech
• Translate to Chinese
Algorithm Specification
Space complexity
How much memory the algorithm requires.
Time complexity
How much time the algorithm takes to run.
Often, we deal with estimates rather than exact measurements!
Space Complexity
[Figure: plot of running time versus input size, measured experimentally]
Use a Theoretical Approach
Based on a high-level description of the algorithm, rather than a language-dependent implementation.
Makes possible an evaluation of the algorithm that is independent of the hardware and software environment.
Pseudo-code: a description of an algorithm that is more structured than ordinary prose but less formal than a programming language.
Example: find the maximum element of an array.
Algorithm arrayMax(A, n):
  Input: An array A storing n integers.
  Output: The maximum element in A.
  currentMax ← A[0]
  for i ← 1 to n − 1 do
    if currentMax < A[i] then
      currentMax ← A[i]
  return currentMax
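As a sketch, the pseudocode above translates almost line for line into Python (the function name and the test array below are ours, not from the textbook):

def array_max(A, n):
    """Return the maximum element of A[0..n-1] (direct translation of arrayMax)."""
    current_max = A[0]           # currentMax <- A[0]
    for i in range(1, n):        # for i <- 1 to n-1 do
        if current_max < A[i]:   # if currentMax < A[i] then
            current_max = A[i]   #     currentMax <- A[i]
    return current_max           # return currentMax

print(array_max([31, 26, 58, 41, 97, 53], 6))   # prints 97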
Counting Primitive Operations
Example: find the maximum element of an array.
Algorithm arrayMax(A, n)                      No. of operations
// Input: An array A storing n integers.      0
// Output: The maximum element in A.          0
currentMax ← A[0]                             1
for i ← 1 to n − 1 do                         n
  if currentMax < A[i] then                   n − 1
    currentMax ← A[i]                         n − 1 (worst case)
return currentMax                             1
Total:                                        3n (worst case)
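To make the tally concrete, here is a small instrumented sketch of our own (not from the textbook) that counts the same primitive operations; on an ascending array every assignment executes, giving the worst-case total of 3n:

def array_max_count(A, n):
    """arrayMax with a primitive-operation counter (worst case: 3n operations)."""
    ops = 1                        # currentMax <- A[0]
    current_max = A[0]
    for i in range(1, n):
        ops += 1                   # loop test (true), executed n - 1 times here
        ops += 1                   # comparison currentMax < A[i]
        if current_max < A[i]:
            current_max = A[i]
            ops += 1               # assignment (executes every time in the worst case)
    ops += 1                       # final loop test that exits the loop
    ops += 1                       # return
    return current_max, ops

_, ops = array_max_count(list(range(10)), 10)   # ascending input = worst case
print(ops)                                      # 30, i.e. 3 * n for n = 10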
Estimating Running Time
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that
            0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
Example: 10n² − 3n = Θ(n²)
What constants for n0, c1, and c2 will work?
Make c1 a little smaller than the leading coefficient, and c2 a little bigger.
To compare orders of growth, look at the leading term.
Solution: we need c1·n² ≤ 10n² − 3n ≤ c2·n², i.e. c1 ≤ 10 − 3/n ≤ c2.
For n0 = 1, c1 ≤ 7 and c2 ≥ 10 work.
Exercise: Prove that n²/2 − 3n = Θ(n²).
We need c1·n² ≤ n²/2 − 3n ≤ c2·n², i.e. c1 ≤ 1/2 − 3/n ≤ c2.
For n0 = 7, c1 ≤ 1/14 and c2 ≥ 1/2 work.
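A quick numerical sanity check of those constants (a sketch of our own, not part of the text; it only tests a finite range of n):

def theta_holds(f, c1, c2, n0, n_max=10000):
    """Check c1*n^2 <= f(n) <= c2*n^2 for every n in [n0, n_max)."""
    return all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, n_max))

print(theta_holds(lambda n: 10 * n * n - 3 * n, 7, 10, 1))          # True
print(theta_holds(lambda n: n * n / 2 - 3 * n, 1 / 14, 1 / 2, 7))   # True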
Big-Oh (O) - notation
For a function having only an asymptotic upper bound, the Big-Oh 'O' notation is used. For a given function g(n), O(g(n)) is the set of functions f(n) defined as
O(g(n)) = { f(n) : there exist positive constants c and n0 such that
            0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
Example: 10n² + 4n + 2 = O(n²)
We want 10n² + 4n + 2 ≤ c·n². Since 4n + 2 ≤ n² once n is large enough,
10n² + 4n + 2 ≤ 10n² + n² = 11n².
For n = 1: 16 ≤ 11 is not true.
For n = 5: 272 ≤ 275 is true.
So with c = 11 and n0 = 5, 10n² + 4n + 2 = O(n²).
Example: 6·2ⁿ + n² = O(2ⁿ)
We want 6·2ⁿ + n² ≤ c·2ⁿ. Since n² ≤ 2ⁿ for n ≥ 4,
6·2ⁿ + n² ≤ 6·2ⁿ + 2ⁿ = 7·2ⁿ.
For n = 3: 57 ≤ 56 is not true.
For n = 4: 112 ≤ 112 is true.
So with c = 7 and n0 = 4, 6·2ⁿ + n² = O(2ⁿ).
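The same kind of finite spot check (again a sketch of our own) finds the smallest n at which each bound starts to hold, matching n0 = 5 and n0 = 4 above:

def smallest_n0(f, g, c, limit=100):
    """Smallest n >= 1 such that f(m) <= c*g(m) for all m in [n, limit)."""
    for n in range(1, limit):
        if all(f(m) <= c * g(m) for m in range(n, limit)):
            return n
    return None

print(smallest_n0(lambda n: 10 * n * n + 4 * n + 2, lambda n: n * n, 11))   # 5
print(smallest_n0(lambda n: 6 * 2 ** n + n * n, lambda n: 2 ** n, 7))       # 4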
Big Omega (Ω) - Notation
For a function having only an asymptotic lower bound, the Ω notation is used. For a given function g(n), Ω(g(n)) is the set of functions f(n) defined as
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
            0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
g(n) is an asymptotic lower bound for f(n).
Example: f(n) = 3n² + n
Since 3n² + n ≥ 3n² for all n ≥ 1, take c = 3 and n0 = 1; hence 3n² + n = Ω(n²).
Little-oh (o) - notation
Definition (Little-oh, o()): f(n) = o(g(n)) if for every real constant c > 0 there exists an integer constant n0 ≥ 1 such that f(n) < c·g(n) for every integer n ≥ n0 (g is a strict, non-tight upper bound).
Example: If f(n) = n² and g(n) = n³, check whether f(n) = o(g(n)) or not.
For any c > 0 we have n² < c·n³ as soon as n > 1/c, so n² = o(n³).
Definition (Little-Omega, ω()): Let f(n) and g(n) be functions that map positive integers to positive real numbers. We say that f(n) is ω(g(n)) (or f(n) ∈ ω(g(n))) if for any real constant c > 0, there exists an integer constant n0 ≥ 1 such that f(n) > c·g(n) for every integer n ≥ n0.
Examples
• 2 ≠ ω(1)
• 4x + 2 ≠ ω(x)
• 4x + 2 = ω(1)
• 3x² + 4x + 2 ≠ ω(x²)
• 3x² + 4x + 2 = ω(x)
Amortized complexity
• Amortized analysis is used for algorithms where an occasional operation is very slow, but most of the other operations are faster.
• In amortized analysis, we analyze a sequence of operations and guarantee a worst-case average time that is lower than the worst-case time of a particularly expensive operation.
• Example data structures whose operations are analyzed using amortized analysis are hash tables, disjoint sets, and splay trees.
• Amortized analysis is a technique used in computer science to analyze the time complexity of algorithms that perform a sequence of operations, where some operations may be more expensive than others. The idea is to spread the cost of these expensive operations over the whole sequence, so that the average cost per operation stays small (often constant).
Amortized analysis is useful for designing efficient algorithms for data structures such as dynamic arrays, priority queues, and disjoint-set data structures. It guarantees a bound on the average cost per operation over any sequence of operations, even if some individual operations are expensive.
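As a concrete sketch of the dynamic-array case mentioned above (the class and counter below are our own illustration, not from the textbook), appending n items to a doubling array costs fewer than 3n element writes in total, so each append is amortized O(1) even though an append that triggers a resize is O(n) on its own:

class DynamicArray:
    """Doubling array: append is amortized O(1) despite occasional O(n) resizes."""
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.writes = 0                    # counts element writes/copies

    def append(self, x):
        if self.size == self.capacity:     # the occasional expensive operation
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):     # copy every element: O(n) this time
                new_data[i] = self.data[i]
                self.writes += 1
            self.data = new_data
        self.data[self.size] = x           # the cheap common case
        self.size += 1
        self.writes += 1

arr = DynamicArray()
n = 1000
for i in range(n):
    arr.append(i)
print(arr.writes / n)    # about 2.0 here, and always below 3: amortized O(1)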