Complexity Theory

Lecture 1

Anuj Dawar

University of Cambridge Computer Laboratory

Easter Term 2011

http://www.cl.cam.ac.uk/teaching/1011/Complexity/


Texts

The main texts for the course are:

• Computational Complexity. Christos H. Papadimitriou.
• Introduction to the Theory of Computation. Michael Sipser.

Other useful references include:

• Computers and Intractability: A guide to the theory of NP-completeness. Michael R. Garey and David S. Johnson.
• Computational complexity: a conceptual perspective. O. Goldreich.
• Computability and Complexity from a Programming Perspective. Neil Jones.
A rough lecture-by-lecture guide, with relevant sections from the text by Papadimitriou (or Sipser, where marked with an S):

• Algorithms and problems. 1.1–1.3.
• Time and space. 2.1–2.5, 2.7.
• Time Complexity classes. 7.1, S7.2.
• Nondeterminism. 2.7, 9.1, S7.3.
• NP-completeness. 8.1–8.2, 9.2.
• Graph-theoretic problems. 9.3.
• Sets, numbers and scheduling. 9.4.
• coNP. 10.1–10.2.
• Cryptographic complexity. 12.1–12.2.
• Space Complexity. 7.1, 7.3, S8.1.
• Hierarchy. 7.2, S9.1.
• Descriptive Complexity. 5.6, 5.7.
Lectures here on MWF 12:00, except:

• Monday, 9th May, at 9:00
• Monday, 16th May, at 11:00 (swapping slots with Databases)

Complexity Theory seeks to understand what makes certain problems algorithmically difficult to solve.

In Algorithms I and II, we have seen how to measure the complexity of specific algorithms, by asymptotic measures of the number of steps.
Insertion Sort runs in time O(n²), while Merge Sort is an O(n log n) algorithm.

The first half of this statement is short for:

    If we count the number of steps performed by the Insertion Sort algorithm on an input of size n, taking the largest such number, from among all inputs of that size, then the function of n so defined is eventually bounded by a constant multiple of n².

It makes sense to compare the two algorithms, because they seek to solve the same problem.

But, what is the complexity of the sorting problem?

What is the running time complexity of the fastest algorithm that sorts a list? By the analysis of the Merge Sort algorithm, we know that this is no worse than O(n log n).

The complexity of a particular algorithm establishes an upper bound on the complexity of the problem. To establish a lower bound, we need to show that no possible algorithm, including those as yet undreamed of, can do better.

In the case of sorting, we can establish a lower bound of Ω(n log n), showing that Merge Sort is asymptotically optimal.

Sorting is a rare example where known upper and lower bounds match.
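To make the asymptotic claims concrete, here is an informal Python sketch (not from the course texts) that counts the comparisons each algorithm performs on a reverse-sorted list, a worst case for Insertion Sort:

```python
def insertion_sort(xs):
    """Insertion Sort; returns (sorted list, number of comparisons made)."""
    xs = list(xs)
    comparisons = 0
    for i in range(1, len(xs)):
        j = i
        while j > 0:
            comparisons += 1
            if xs[j - 1] > xs[j]:
                xs[j - 1], xs[j] = xs[j], xs[j - 1]
                j -= 1
            else:
                break
    return xs, comparisons


def merge_sort(xs):
    """Merge Sort; returns (sorted list, number of comparisons made)."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, cl = merge_sort(xs[:mid])
    right, cr = merge_sort(xs[mid:])
    merged, comparisons, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons


n = 256
worst = list(range(n, 0, -1))  # reverse-sorted: a worst case for Insertion Sort
_, ins = insertion_sort(worst)
_, mer = merge_sort(worst)
print(ins, mer)  # Insertion Sort makes n(n-1)/2 = 32640 comparisons here
```

On this input Insertion Sort makes exactly n(n−1)/2 comparisons, the quadratic behaviour described above, while Merge Sort stays within n log₂ n.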
The complexity of an algorithm (whether measuring number of steps, or amount of memory) is usually described asymptotically:

Definition
For functions f : IN → IN and g : IN → IN, we say that:

• f = O(g) if there are positive constants c and n0 such that f(n) ≤ c · g(n) for all n ≥ n0;
• f = Ω(g) if g = O(f); and
• f = θ(g) if f = O(g) and f = Ω(g).

Consider an algorithm A sorting a list of n distinct numbers a1, . . . , an by comparisons.

[Decision tree: each internal node is a comparison; the root asks ai < aj?, its children ask further comparisons such as ak < al? and ap < aq?, and each leaf corresponds to one possible ordering of the list.]

Since n distinct numbers can appear in n! different orders, the tree must have at least n! leaves, so its depth (the worst-case number of comparisons) is at least log2(n!) = Ω(n log n).
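The counting at the heart of this argument is easy to check numerically: a binary tree with n! leaves has depth at least ⌈log2(n!)⌉, which grows like n log n. A small Python sketch (the function name is ours, not from the course text):

```python
import math


def comparison_lower_bound(n):
    """Minimum worst-case comparisons for any comparison sort of n items.

    A comparison-based sorting algorithm is a binary decision tree whose
    leaves are the n! possible orderings, so its depth is at least
    ceil(log2(n!)), which is Omega(n log n).
    """
    # Compute log2(n!) as a sum of logs, avoiding huge intermediate factorials.
    return math.ceil(sum(math.log2(k) for k in range(2, n + 1)))


for n in (4, 16, 256):
    print(n, comparison_lower_bound(n))
```

For n = 3 this gives 3, which is tight: three comparisons suffice to sort three elements in the worst case.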
Given

• V — a set of nodes, and
• c : V × V → IN — a cost matrix,

find an ordering v1, . . . , vn of V for which the total cost

    c(vn, v1) + Σ_{i=1}^{n−1} c(vi, vi+1)

is the smallest possible. (This is the optimisation version of the Travelling Salesman Problem.)

Obvious algorithm: Try all possible orderings of V and find the one with lowest cost. The worst case running time is θ(n!).

Lower bound: An analysis like that for sorting shows a lower bound of Ω(n log n).

Upper bound: The currently fastest known algorithm has a running time of O(n² 2ⁿ).
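Both bounds can be illustrated in code. The sketch below is ours, not part of the course text: the first function is the obvious θ(n!) enumeration; the second is the Held–Karp dynamic-programming algorithm, the source of the O(n² 2ⁿ) bound. The 4-node cost matrix is a made-up example.

```python
import itertools


def tsp_brute_force(cost):
    """Try all orderings of the nodes -- Theta(n!) time."""
    n = len(cost)
    best = None
    for perm in itertools.permutations(range(1, n)):  # fix node 0 as the start
        tour = (0,) + perm
        total = sum(cost[tour[i]][tour[i + 1]] for i in range(n - 1))
        total += cost[tour[-1]][tour[0]]  # close the cycle
        best = total if best is None else min(best, total)
    return best


def tsp_held_karp(cost):
    """Held-Karp dynamic programming -- O(n^2 * 2^n) time."""
    n = len(cost)
    # dp[(S, j)]: cheapest path from node 0 visiting exactly the set S
    # (a bitmask over nodes 1..n-1) and ending at node j.
    dp = {(1 << (j - 1), j): cost[0][j] for j in range(1, n)}
    for size in range(2, n):
        for S in (s for s in range(1 << (n - 1)) if bin(s).count("1") == size):
            for j in range(1, n):
                bit = 1 << (j - 1)
                if not S & bit:
                    continue
                prev = S ^ bit
                dp[(S, j)] = min(dp[(prev, k)] + cost[k][j]
                                 for k in range(1, n) if prev & (1 << (k - 1)))
    full = (1 << (n - 1)) - 1
    return min(dp[(full, j)] + cost[j][0] for j in range(1, n))


cost = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
print(tsp_brute_force(cost), tsp_held_karp(cost))  # both report the same optimum
```

The dynamic program trades the factorial blow-up for exponential memory, storing one value per (subset, endpoint) pair.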
Formalising Algorithms