Analysis of Algorithms
Fall 23
CSE 221: Design and Analysis of Algorithms
Instructor: Prof. Hala Zayed
Professor of Computer Science/Engineering, Faculty of Computers
Email: Hala.zayed@eui.edu.eg
Office hours: Sun. 01:00 - 02:00 pm, Tues. 01:00 - 02:00 pm, Thurs. 09:00 - 11:00 am
Location: Second Floor, Room 226

Instructor: Prof. Amr El-Masry
Office hours: Mon. 12:00 - 2:00 pm, Wed. 12:00 - 2:00 pm
Location: Third Floor, Room 345

TA: Eng. Rameez
Email: Rameez.barakat@eui.edu.eg
Course Contents
• Techniques for analyzing the time and space requirements of algorithms; amortized analysis; randomization; the fundamental design strategies of divide-and-conquer, dynamic programming, and greedy methods; and an introduction to intractable (NP-hard) problems.
• Prerequisites: PHM 114, CSE 123
Mapping with Purdue
CSE 221: Design and Analysis of Algorithms maps to Purdue's CS 38100: Introduction to the Analysis of Algorithms.
CSE 221: Design and Analysis of Algorithms
Grading Policy:
• Assignments 10%
• Tutorial 5%
• Project 15%
• Quizzes 10%
• Midterm 20%
• Final 40%
(Students who score less than 40% on the final exam receive an F in the course. You must attend at least 75% of lectures and labs to be admitted to the final exam.)
• Bonus: in-lecture participation, 3%
CSE 221: Design and Analysis of Algorithms
Handouts:
• Lectures
• Tutorials
Textbooks:
○ Foundations of Algorithms, 5th Edition, Richard Neapolitan, Jones & Bartlett Learning, 2015 [FA]
○ Algorithms Illuminated, Omnibus Edition, Tim Roughgarden (Columbia University), New York, January 2023 [AI]
○ Introduction to Algorithms, 4th Edition, Cormen, Leiserson, Rivest, and Stein, The MIT Press, 2022 [COR]
Course Contents
1. Algorithms: Efficiency, Analysis, and Order
2. Divide-and-Conquer
3. Recurrences - Master Theorem
4. Trees
5. Graphs
6. Greedy Algorithms
7. Dynamic Programming
8. Shortest path algorithms using DP
9. Amortized analysis
10. Randomized algorithms
11. A brief introduction to NP-Hard Problems
Lec 1 - Algorithms: Efficiency, Analysis, and Order
Contents
• Algorithms
• The Importance of Developing Efficient Algorithms
• Analysis of Algorithms
• Complexity Analysis
• Analysis of Correctness
• Order
• Runtime Analysis
• Big-O
• Big-Ω
• Big-Θ
Algorithms?
• A computer program is composed of individual modules, understandable by a computer, that solve specific tasks (such as sorting). The concentration here is not on the design of entire programs, but rather on the design of the individual modules that accomplish these specific tasks. These specific tasks are called problems.
• A problem is a question to which we seek an answer.
• To produce a computer program that can solve all instances of a problem, we must specify a general step-by-step procedure for producing the solution to each instance.
• Applying a technique to a problem results in a step-by-step procedure for solving it. This step-by-step procedure is called an algorithm, and we say that the algorithm solves the problem.
Analysis Aspects
• Correctness
• Does the input/output relation match the algorithm's requirement?
General approaches for algorithm design
The Importance of Developing Efficient Algorithms
Sequential Search vs Binary Search
• Example: find x = 37 in a sorted array of n items.
• Worst Case:
• Sequential Search: n operations
• Binary Search: ⌊lg n⌋ + 1 operations
Sequential Search vs Binary Search – Worst Case
• The sequential search algorithm begins at the first position in the array and looks at each value in turn until the item is found.
• The binary search algorithm first compares x with the middle item of the array. If they are equal, the algorithm is done. If x is smaller than the middle item, then x must be in the first half of the array (if it is present at all), and the algorithm repeats the searching procedure on the first half of the array. (That is, x is compared with the middle item of the first half of the array. If they are equal, the algorithm is done, etc.) If x is larger than the middle item of the array, the search is repeated on the second half of the array. This procedure is repeated until x is found or it is determined that x is not in the array.
• Input: a sorted array S of size n
• Sequential Search: n operations
• Binary Search: ⌊lg n⌋ + 1 operations
A Python sketch of both algorithms follows.
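A minimal sketch of both searches in Python, assuming S is a sorted list; the function names and the -1 "not found" convention are illustrative, not the textbook's:

def sequential_search(S, x):
    # Look at each value in turn: up to n comparisons in the worst case.
    for i, item in enumerate(S):
        if item == x:
            return i
    return -1  # x is not in S

def binary_search(S, x):
    # Halve the search range each step: at most floor(lg n) + 1 comparisons.
    low, high = 0, len(S) - 1
    while low <= high:
        mid = (low + high) // 2
        if S[mid] == x:
            return mid
        elif x < S[mid]:
            high = mid - 1  # x can only be in the first half
        else:
            low = mid + 1   # x can only be in the second half
    return -1  # x is not in S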
Fibonacci: Iterative vs Recursive
• Fib(0) = 0
• Fib(1) = 1
• Fib(n) = Fib(n-1) + Fib(n-2) for n ≥ 2
Recursion Tree for the 5th Fibonacci Term
Fibonacci: Recursive
• Although the algorithm was easy to create and is understandable, it is extremely inefficient.
• If T(n) is the number of terms in the recursion tree corresponding to Algorithm 1.6, then, for n ≥ 2, T(n) > 2^(n/2) (proof by induction, p. 15 [FA]).
A recursive Python sketch follows.
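A direct Python transcription of the recursive definition (a sketch; [FA]'s Algorithm 1.6 is given in pseudocode):

def fib_recursive(n):
    # Solves the same subinstances over and over: the recursion tree
    # contains more than 2^(n/2) terms for n >= 2.
    if n <= 1:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)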
Fibonacci: Iterative
n:      0  1  2  3  4  5  6  7   8   9  10  11  12  13
fib(n): 0  1  1  2  3  5  8  13  21  34  55  89  …   …
Fibonacci: Iterative
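A minimal iterative sketch in Python; it computes each term bottom-up, keeping only the last two values rather than an array of all terms:

def fib_iterative(n):
    # Computes fib(0), fib(1), ..., fib(n) bottom-up: n + 1 terms in total.
    if n == 0:
        return 0
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr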
Comparison of Recursive and Iterative Solutions
• Calculate the nth Fibonacci term:
• Recursive: calculates more than 2^(n/2) terms
• Iterative: calculates n + 1 terms
Quick Check
1. The step-by-step procedure to solve a problem is called
a(n) _____.
Answer: algorithm
2. A problem can be solved by only one algorithm (True or
False).
Answer: False
3. Sequential search appears to be much more efficient than
binary search (True or False)
Answer: False
Analysis of Algorithms
Complexity Analysis
• Define the basic operation.
• Count the number of times the basic operation executes for each value of the input size.
• The count may be:
• dependent on the input values (e.g., sequential search)
• independent of the input values (every-case time complexity analysis)
g(n): every-case time complexity
• Time complexity analysis of an algorithm determines how many times the basic operation is done for each value of the input size.
• In some cases, the number of times it is done depends not only on the input size, but also on the input's values, e.g., sequential search.
• In other cases, such as Algorithm 1.2 (Add Array Members), the basic operation is always done the same number of times for every instance of size n.
• When this is the case, g(n) is defined as the number of times the algorithm does the basic operation for an instance of size n.
• g(n) is called the every-case time complexity of the algorithm, and its determination is called an every-case time complexity analysis.
g(n): every-case time complexity
• Add Array Members: g(n) = n
• Exchange Sort: g(n) = (n-1) + (n-2) + (n-3) + … + 1 = n(n-1)/2
A sketch of exchange sort follows.
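A sketch of exchange sort in Python, instrumented to count the basic operation (the comparison of S[j] with S[i]); the counter is illustrative, not part of the algorithm:

def exchange_sort(S):
    # Basic operation: the comparison S[j] < S[i].
    # It executes (n-1) + (n-2) + ... + 1 = n(n-1)/2 times for every input of size n.
    n = len(S)
    count = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            count += 1
            if S[j] < S[i]:
                S[i], S[j] = S[j], S[i]  # exchange the out-of-order pair
    return count

print(exchange_sort([3, 1, 4, 1, 5]))  # 5 * 4 / 2 = 10 comparisons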
W(n): Worst-Case Time Complexity
• Some algorithms do not have an every-case time complexity (g(n)).
• There are three other analysis techniques that can be tried.
• W(n), the worst-case time complexity of the algorithm, is defined as the maximum number of times the algorithm will ever do its basic operation for an input size of n.
• Worst-case time complexity (sequential search): W(n) = n
• A(n), the average-case time complexity of the algorithm, is defined as the average (expected value) of the number of times the algorithm does the basic operation for an input size of n.
• Average-case time complexity (sequential search): A(n) = (n+1)/2, assuming the item is present and all positions are equally likely
• B(n), the best-case time complexity of the algorithm, is defined as the minimum number of times the algorithm will ever do its basic operation for an input size of n.
• Best-case time complexity (sequential search): B(n) = 1
A short empirical sketch follows.
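A sketch in Python that measures sequential search's comparison count over every possible position of the key, illustrating B(n) = 1, A(n) = (n+1)/2, and W(n) = n (this assumes the key is always present):

def comparisons(S, x):
    # Count the comparisons sequential search makes before finding x.
    count = 0
    for item in S:
        count += 1
        if item == x:
            break
    return count

n = 8
S = list(range(n))
counts = [comparisons(S, x) for x in S]
print(min(counts))       # best case:  B(n) = 1
print(sum(counts) / n)   # average:    A(n) = (n+1)/2 = 4.5
print(max(counts))       # worst case: W(n) = n = 8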
Complexity Analysis – Large n
• Worst Case W(n)
• Every Case g(n)
• Average Case A(n)
• Best Case B(n)
Applying the Theory of Algorithm Analysis
• When applying the theory of algorithm analysis, three important factors should
be taken into account:
• the time it takes to execute the basic operation
• the overhead instructions
• the control instructions on the actual computer on which the algorithm is implemented.
• Overhead instructions mean instructions such as initialization instructions
before a loop. The number of times these instructions execute does not
increase with input size.
• Control instructions mean instructions such as incrementing an index to
control a loop. The number of times these instructions execute increases with
input size.
• The basic operation, overhead instructions, and control instructions are all
properties of an algorithm and the implementation of the algorithm.
Applying the Theory of Algorithm Analysis
• Suppose we have two algorithms for the same problem:
• every-case time complexity n for the 1st algorithm
• every-case time complexity n² for the 2nd algorithm
• Suppose, however, a given computer takes 1,000 times as long to process the basic operation once in the 1st algorithm as it takes to process the basic operation once in the 2nd algorithm.
• For simplicity, assume the time taken by the overhead instructions is negligible in both algorithms.
• If t is the time to process the 2nd algorithm's basic operation once, the 1st algorithm is more efficient when:
• n × 1,000t < n² × t, that is, when n > 1,000
A quick numeric check follows.
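A quick numeric check of the crossover in Python, assuming t = 1 and the hypothetical 1,000× cost ratio from the slide:

t = 1
for n in (100, 1_000, 10_000):
    cost1 = n * 1_000 * t    # 1st algorithm: n basic operations at 1,000t each
    cost2 = n * n * t        # 2nd algorithm: n^2 basic operations at t each
    print(n, cost1 < cost2)  # the 1st algorithm wins only once n > 1,000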
Analysis of Correctness
Quick Check
1. Algorithm analysis measures the efficiency of an algorithm as the _____ becomes
large.
Answer: input size
2. _____ is defined as the number of times the algorithm does the basic operation for
an instance of size n.
Answer: g(n)
3. W(n) is defined as the minimum number of times the algorithm will ever do its basic
operation for an input size of n (True or False).
Answer: False
4. A(n) is called the average-case time complexity of the algorithm (True or False).
Answer: True
5. Analyzing the _____ of an algorithm is developing a proof that the algorithm actually
does what it is supposed to do.
Answer: correctness
Introduction to Order
• Suppose the time complexities are:
• 100n for the 1st algorithm
• 0.01n² for the 2nd algorithm
• The 1st algorithm (order n) will eventually be more efficient than the 2nd one (order n²):
• 100n < 0.01n² when n > 10,000
• Algorithms with time complexities such as n and 100n are called linear-time algorithms.
• Algorithms with time complexities such as n² and 0.01n² are called quadratic-time algorithms.
A NOTE ON RUNTIME ANALYSIS
BIG-O NOTATION
Big O
• g(n) ∈ O(f(n)) if there exist a positive real constant c and a nonnegative integer N such that g(n) ≤ c × f(n) for all n ≥ N.
• Examples:
• n grows more slowly than n³ => n ∈ O(n³)
• n grows more slowly than n²/2 => n ∈ O(n²/2)
• n grows at the same rate as n/2 => n ∈ O(n/2)
PROVING BIG-O BOUNDS: EXAMPLE
Goal: show 3n² + 5n ∈ O(n²). Find values for c and N such that for all n ≥ N:
3n² + 5n ≤ c · n²
Rearrange this inequality to see things a bit more clearly:
5n ≤ (c - 3) · n²
Now cancel out an n:
5 ≤ (c - 3) · n
Let's choose c = 4 and N = 5. (Other choices work too, e.g., c = 10 and N = 10.)
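The same choice, written as one chain of inequalities (a sketch in LaTeX; the constants c = 4, N = 5 are from the slide above):

\[
3n^2 + 5n \;\le\; 3n^2 + n \cdot n \;=\; 4n^2 \qquad \text{for all } n \ge 5,
\]

since n ≥ 5 implies 5n ≤ n · n. Hence 3n² + 5n ≤ 4n² for all n ≥ 5, satisfying the Big-O definition.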
DISPROVING BIG-O BOUNDS
• If you’re ever asked to formally disprove that g(n) is O(f(n)),
use proof by contradiction!
DISPROVING BIG-O: EXAMPLE
Prove that 3n² + 5n is not O(n).
For the sake of contradiction, assume that 3n² + 5n is O(n). This means that there exist positive constants c and N such that 3n² + 5n ≤ c · n for all n ≥ N. Then we would have the following:
3n² + 5n ≤ c · n
3n + 5 ≤ c
n ≤ (c - 5)/3
However, since (c - 5)/3 is a constant, we have arrived at a contradiction: n cannot be bounded above by a constant for all n ≥ N. Thus our original assumption was incorrect, which means that 3n² + 5n is not O(n).
BIG-O EXAMPLES
Polynomials: say p(n) = a_k n^k + a_(k-1) n^(k-1) + ··· + a_1 n + a_0 is a polynomial of degree k ≥ 1. Then:
i. p(n) = O(n^k)
ii. p(n) is not O(n^(k-1))
Other examples:
• 6n³ + n log₂ n = O(n³)
• 25 = O(1)
• [any constant] = O(1)
BIG-Ω NOTATION
g(n) ∈ Ω(f(n))
Big Ω (Omega)
• g(n) ∈ Ω(f(n)) if there exist a positive real constant c and a nonnegative integer N such that g(n) ≥ c × f(n) for all n ≥ N.
• Examples:
• 5n² ∈ Ω(n²)
• n² + 10n ∈ Ω(n²)
• n³ ∈ Ω(n²)
Examples
• Show that 5n² ∈ Ω(n²):
• 5n² ≥ 1 × n², so we can take c = 1 and N = 0.
• Show that n² + 10n ∈ Ω(n²):
• n² + 10n ≥ n², so we can take c = 1 and N = 0.
• Show that n³ ∈ Ω(n²):
• n³ ≥ 1 × n² for n ≥ 1, so we can take c = 1 and N = 1.
BIG-Θ (Theta)
• For a given complexity function f(n):
• Θ(f(n)) = O(f(n)) ∩ Ω(f(n))
• This means that Θ(f(n)) is the set of complexity functions g(n) for which there exist some positive real constants c and d and some nonnegative integer N such that, for all n ≥ N,
• c × f(n) ≤ g(n) ≤ d × f(n)
• If g(n) ∈ Θ(f(n)), we say that g(n) is order of f(n).
BIG-Θ NOTATION
Example: we show that ½ n(n-1) ∈ Θ(n²).
We need to find constants such that c · f(n) ≤ g(n) ≤ d · f(n) for all n ≥ N.
• Upper bound: ½ n(n-1) ≤ ½ n² for all n ≥ 0, so d = ½.
• Lower bound: for n ≥ 2 we have n - 1 ≥ n/2, so ½ n(n-1) ≥ ¼ n², giving c = ¼ and N = 2.
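Both bounds together, as a sketch in LaTeX (the constants c = ¼, d = ½, N = 2 match the derivation above):

\[
\tfrac{1}{4}\,n^2 \;\le\; \tfrac{1}{2}\,n(n-1) \;\le\; \tfrac{1}{2}\,n^2 \qquad \text{for all } n \ge 2 .
\]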
Asymptotic Notation Cheat Sheet
Bound      Definition (how to prove)                   What it represents
O(f(n))    g(n) ≤ c · f(n) for all n ≥ N               upper bound on growth
Ω(f(n))    g(n) ≥ c · f(n) for all n ≥ N               lower bound on growth
Θ(f(n))    c · f(n) ≤ g(n) ≤ d · f(n) for all n ≥ N    tight (upper and lower) bound
Examples
• The sets O(n²), Ω(n²), and Θ(n²); some exemplary members g(n) are shown relative to f(n) = n². (figure)
Some common Big-O run times (figure)
Growth Rates of Common Functions (figures)
Quick Check
1. Algorithms with time complexities such as n and 100n are called _____ algorithms.
Answer: linear-time
2. Algorithms with time complexities such as n^2 are called quadratic-time algorithms
(True or False).
Answer: True
3. Any quadratic-time algorithm is eventually more efficient than any linear-time
algorithm (True or False).
Answer: False
4. Functions such as 5n^2 and 5n^2 +100 are called _____ functions.
Answer: pure quadratic
5. _____ assumes something is true, and then does manipulations that lead to a result that is not true.
Answer: Proof by contradiction
Properties of Order
1. g(n) ∈ O(f(n)) if and only if f(n) ∈ Ω(g(n))
2. g(n) ∈ Θ(f(n)) if and only if f(n) ∈ Θ(g(n))
3. If b > 1 and a > 1, then log_a n ∈ Θ(log_b n)
• This implies that all logarithmic complexity functions are in the same complexity category. We will represent this category by Θ(lg n). (A one-line derivation follows below.)
4. If b > a > 0, then a^n ∈ O(b^n), but the membership does not hold in the other direction.
• This implies that all exponential complexity functions are not in the same complexity category.
5. For all a > 0, a^n ∈ O(n!)
• This implies that n! is worse than any exponential complexity function.
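Property 3 is just the change-of-base identity for logarithms; a one-line derivation (a standard fact, not from the slides):

\[
\log_a n \;=\; \frac{\log_b n}{\log_b a} ,
\]

so log_a n is a positive constant multiple of log_b n, which places the two functions in the same Θ category.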
Properties of Order
6. Consider the following ordering of complexity categories, where j < k and 1 < a < b (see [FA], ch. 1):
Θ(lg n), Θ(n), Θ(n lg n), Θ(n²), Θ(n^j), Θ(n^k), Θ(a^n), Θ(b^n), Θ(n!)
If one category appears to the left of another, any function in the first is eventually smaller than any function in the second.
References
• Foundations of Algorithms [FA], ch. 1.
• Design and Analysis of Algorithms, Stanford University.
Questions