
DATA STRUCTURE AND ALGORITHMS
Algorithmic Complexity
Overview

Computational complexity
• How much time will it take a program to run?
• How much memory will it need to run?
• Need to balance minimizing computational complexity with
conceptual complexity

Measuring complexity
• Goals in designing programs
1. It returns the correct answer on all legal inputs
2. It performs the computation efficiently

• Typically (1) is most important, but sometimes (2) is also critical


• Even when (1) is most important, it is valuable to understand and
optimize (2)

How do we measure complexity?
• Given a function, we would like to answer: “How long will this take to run?”
• The problem is that this depends on:
  1. Speed of the computer
  2. Specifics of the programming language implementation
  3. Value of the input
• Avoid (1) and (2) by measuring time in terms of the number of basic steps executed
• For point (3), measure time in terms of the size of the input

Cases for measuring complexity
• Best case: minimum running time over all possible inputs of a given
size
• For linearSearch – constant, i.e. independent of size of inputs
• Worst case: maximum running time over all possible inputs of a given
size
• For linearSearch – linear in the size of the list (see the sketch after this list)
• Average (or expected) case: average running time over all possible
inputs of a given size
• We will focus on worst case – a kind of upper bound on running time
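
The slides refer to linearSearch without listing it. As a point of reference, here is a minimal sketch of what such a function typically looks like (the exact code is an assumption, not taken from the slides), with the best and worst cases marked:

def linear_search(lst, target):
    """Return the index of target in lst, or -1 if it is not present."""
    for i in range(len(lst)):
        if lst[i] == target:
            return i        # best case: target is the first element -> constant time
    return -1               # worst case: target is absent -> all len(lst) elements examined

# Best case: one comparison, independent of the size of the input.
print(linear_search([7, 3, 5], 7))   # 0
# Worst case: len(lst) comparisons.
print(linear_search([7, 3, 5], 9))   # -1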

Example

def fact(n):
    answer = 1
    while n > 0:
        answer *= n
        n -= 1
    return answer

• Number of steps:
  - 1 (for the assignment)
  - 5*n (1 for the test, plus 2 for the first assignment, plus 2 for the
    second assignment in the while loop; repeated n times through the while)
  - 1 (for the return)
• 5*n + 2 steps
• But as n gets large, the 2 is irrelevant, so basically 5*n steps
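
As a quick sanity check (a sketch, not part of the original slides), the same counting convention can be instrumented directly in code, confirming the 5*n + 2 figure:

def fact_with_steps(n):
    steps = 1            # "answer = 1": one step
    answer = 1
    while n > 0:
        steps += 5       # loop test (1) + "answer *= n" (2) + "n -= 1" (2)
        answer *= n
        n -= 1
    steps += 1           # "return answer": one step
    return answer, steps

for n in (1, 5, 10, 100):
    _, steps = fact_with_steps(n)
    print(n, steps, 5 * n + 2)   # the two step counts agree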

Big-Oh notation!
• Gives us a meaningful way to talk about the running time of an
algorithm, independent of programming language, computing
platform, etc., without having to count all the operations.
• Focus on how the runtime scales with n (the input size).

Big-Oh Example

Why is this a good idea?
• Suppose the running time of an algorithm is:

Formal definition of O(…)
• Let T(n), g(n) be functions of positive integers.
• Think of T(n) as a runtime: positive and increasing in n.

• Formally, T(n) = O(g(n)) if and only if there exist constants c > 0 and n0
  such that T(n) ≤ c·g(n) for all n ≥ n0.

Example
• 2n^2 + 10 = O(n^2)
  Choose c = 3
  Choose n0 = 4
  Then:
  ∀n ≥ 4, 2n^2 + 10 ≤ 3n^2
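
The algebra above is the actual proof; as an illustration only, a small sketch (not from the slides) can spot-check the chosen witnesses c = 3, n0 = 4 over a finite range of n:

# Spot-check T(n) <= c*g(n) for n0 <= n < 10_000, where T(n) = 2n^2 + 10 and g(n) = n^2.
def T(n):
    return 2 * n**2 + 10

def g(n):
    return n**2

c, n0 = 3, 4
assert all(T(n) <= c * g(n) for n in range(n0, 10_000))
print("2n^2 + 10 <= 3n^2 holds for every n checked (4 <= n < 10000)")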

Growth of functions commonly used in big-O
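
The growth-rate comparison shown on this slide did not survive the export. As a rough stand-in, the sketch below tabulates how the commonly used growth rates pull apart as n increases (the particular functions and input sizes here are illustrative choices, not taken from the slide):

import math

# Tabulate common big-O growth functions for a few input sizes.
functions = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("2^n",     lambda n: 2 ** n),
]

header = f"{'n':>6}" + "".join(f"{name:>14}" for name, _ in functions)
print(header)
for n in (10, 15, 20, 25):
    row = "".join(f"{f(n):>14.0f}" for _, f in functions)
    print(f"{n:>6}" + row)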

Exercise
• Determine whether each of these functions is O(x).
  a) f(x) = 10
  b) f(x) = 3x + 7
  c) f(x) = x^2 + x + 1
  d) f(x) = 5·log(x)

Ω(…) means a lower bound
• Let T(n), g(n) be functions of positive integers.
• Think of T(n) as a runtime: positive and increasing in n.

• Formally, T(n) = Ω(g(n)) if and only if there exist constants c > 0 and n0
  such that T(n) ≥ c·g(n) for all n ≥ n0.

Example
• n·log2(n) = Ω(3n)

• Choose c = 1/3
• Choose n0 = 2
• Then ∀n ≥ 2, 3n/3 ≤ n·log2(n)

Θ(…) means both!
• T(n) is Θ(g(n)) iff both:

  T(n) = O(g(n))
  and
  T(n) = Ω(g(n))
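
For instance, one way to see this in action (reusing the earlier example rather than the exercise on the next slide) is to exhibit both bounds for 2n^2 + 10:

  2n^2 + 10 = O(n^2):  take c = 3 and n0 = 4, as shown earlier.
  2n^2 + 10 = Ω(n^2):  take c = 2 and n0 = 1, since 2n^2 + 10 ≥ 2n^2 for all n ≥ 1.
  Hence 2n^2 + 10 = Θ(n^2).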

Exercise: Show that
• 3x^2 + 8x·log(x) = Θ(x^2)

Exercise
• Arrange the functions log(n), n, n·log(n), n^m (where m > 1), 2^n, and n! in a
  list so that each function is big-O of the next function.
