Whereas traditional complexity theory classifies problems as polynomial-time solvable (easy) or NP-hard, a more modern strand of complexity theory aims for a finer-grained classification: for example, is the time complexity of a problem near-linear or quadratic? This question is motivated by the fact that, as data sizes grow, even quadratic time can be impractical.
This course will present the current approach to obtaining fine-grained complexity results: we start from a small set of conjectures (in the spirit of P ≠ NP) about the exact complexity of certain core problems, and we devise a host of combinatorial reductions to derive fine-grained lower bounds for many other problems.
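As one concrete illustration (the formulation below is the standard one from the literature, not part of the original course text), here is one such conjecture, about the 3-SUM problem, together with the way a fine-grained reduction transfers its hardness to another problem:

```latex
% Example core conjecture and how a fine-grained reduction uses it.
\textbf{Conjecture (3-SUM).} For every constant $\varepsilon > 0$, no algorithm
decides, given a set of $n$ integers, whether some three of them sum to zero
in $O(n^{2-\varepsilon})$ time.

\textbf{Hardness transfer.} If every 3-SUM instance of size $n$ can be
transformed in $O(n^{2-\delta})$ time into an instance of a problem $P$ of size
$O(n)$, then an $O(n^{2-\varepsilon})$-time algorithm for $P$ (for small enough
$\varepsilon > 0$) would refute the 3-SUM conjecture; hence, under the
conjecture, $P$ requires essentially quadratic time.
```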
The course will cover:
- The main conjectured-to-be-hard problems, what we know about them algorithmically, and how to reduce them to other problems. These problems include: k-SAT, 3-SUM (see the baseline quadratic-time algorithm sketched after this list), All-Pairs Shortest Paths, and k-Clique.
- Various examples of fine-grained results for important problems on strings (e.g. Edit-Distance), graphs (e.g. Diameter), and geometric data (e.g. Closest Pair).
- Other topics such as hardness of approximation, parameterized complexity, and barriers for reductions.
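To make the quadratic barrier concrete, below is a minimal, self-contained sketch (not taken from the course materials) of the textbook sort-and-two-pointer algorithm for 3-SUM. It runs in quadratic time, and the 3-SUM conjecture asserts that no algorithm improves on this by a polynomial factor.

```python
def has_3sum(nums):
    """Return True if some three entries (at distinct indices) of nums sum to zero.

    Classic sort + two-pointer approach: O(n log n) for sorting plus O(n^2)
    for the nested scan, i.e. quadratic overall. The 3-SUM conjecture asserts
    that no O(n^{2 - eps}) algorithm exists for this problem.
    """
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        target = -a[i]
        lo, hi = i + 1, n - 1          # search for a pair to the right of index i
        while lo < hi:
            s = a[lo] + a[hi]
            if s == target:
                return True
            elif s < target:
                lo += 1                # pair sums too small: move left pointer up
            else:
                hi -= 1                # pair sums too large: move right pointer down
    return False

# Example usage:
print(has_3sum([-5, 1, 4, 7, 12]))    # True:  -5 + 1 + 4 == 0
print(has_3sum([1, 2, 3, 4]))         # False: all entries are positive
```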