Etscheid, Michael: Beyond Worst-Case Analysis of Max-Cut and Local Search. - Bonn, 2018. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online-Ausgabe in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5n-51585
@phdthesis{handle:20.500.11811/7613,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5n-51585},
author = {Etscheid, Michael},
title = {Beyond Worst-Case Analysis of Max-Cut and Local Search},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2018,
month = aug,

note = {Since the formal definition of NP-completeness, there has been a huge discrepancy between theory and practice. From a theoretical point of view, we do not know of any efficient, that is, polynomial-time algorithm for an NP-complete problem. On the other hand, many of these problems are quite tractable in practice. For any such problem there is a family of instances on which every known algorithm takes exponential time and is therefore not suitable for larger instances. But these instance families are often very contrived and artificial. The real-world instances, which we would actually like to solve, often tend to be much easier than the worst-case running times suggest. This motivates research in settings other than the worst case. In this work, we focus on two popular analysis techniques: smoothed analysis and kernelization.
In a smoothed analysis, a small amount of random noise is added to the instances before analyzing the expected running time of an algorithm. We use this framework to analyze local search, a technique that is perhaps the most famous example for which worst-case analysis fails to make reasonable running time predictions. Despite its exponential worst-case running time, we show that local search for the Maximum-Cut problem terminates after a quasi-polynomial number of steps in the smoothed setting. If we consider instances in which the nodes are points in a d-dimensional space and the edge weights are given by the squared Euclidean distances between these points, the smoothed running time is even polynomial in n and 2^d.
Furthermore, we analyze local search for a standard scheduling problem in which jobs with different processing requirements are assigned to related machines. Local search corresponds to the best-response dynamics that arises when jobs selfishly try to minimize their costs. We assume that each machine runs a coordination mechanism that determines the order in which the jobs assigned to it are executed. We obtain various new polynomial and pseudo-polynomial bounds for the convergence time of local search with respect to the coordination mechanisms Makespan, FIFO, and Shortest-Job-First. Almost all of the pseudo-polynomial bounds can be translated into smoothed polynomial running times.
In the second part of this thesis, we focus on kernelization. This technique is a formalization of efficient preprocessing for NP-hard problems within the framework of parameterized complexity. The first application is again the Maximum-Cut problem and some generalizations of it. Several of these cut problems are known to admit polynomial kernels when parameterized above the tight lower bound measured by the size and order of the graph. We continue this line of research and show how to find kernels with only O(k) vertices in O(km) time, where k is our parameter and m is the number of edges.
Finally, we use an algorithm for compressing numbers due to Frank and Tardos (Combinatorica 1987) to derive polynomial kernels for weighted versions of several well-studied parameterized problems such as d-Hitting Set and d-Set Packing. It is also useful for obtaining kernels for various formulations of Subset Sum and Knapsack as well as for polynomial integer programs.},

url = {https://hdl.handle.net/20.500.11811/7613}
}
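The abstract refers to local search for the Maximum-Cut problem with the single-node flip neighbourhood, whose smoothed number of steps the first part of the thesis bounds. As a rough illustration of that dynamics only, here is a minimal Python sketch; the function name flip_local_search, the dict-based edge-weight representation, and the default weight of 0 for missing edges are assumptions made for this example and are not taken from the thesis.

```python
# Minimal sketch of single-flip local search for Max-Cut. The thesis analyzes
# how many improving flips such a run can take; this sketch only executes the
# dynamics on a small, explicitly given weighted graph.

def flip_local_search(n, weights, side=None):
    """Run flip local search on an n-node graph until no single node can be
    moved to the other side of the cut with a strict improvement.

    weights: dict mapping frozenset({u, v}) -> edge weight (missing edges = 0)
    side:    optional initial assignment, node index -> 0 or 1
    Returns the locally optimal assignment and the number of improving flips.
    """
    if side is None:
        side = {v: 0 for v in range(n)}
    flips = 0
    improved = True
    while improved:
        improved = False
        for v in range(n):
            # Gain of flipping v: edges to nodes on the same side start to
            # cross the cut (+w), edges that already cross stop doing so (-w).
            gain = 0.0
            for u in range(n):
                if u == v:
                    continue
                w = weights.get(frozenset((u, v)), 0.0)
                gain += w if side[u] == side[v] else -w
            if gain > 0:
                side[v] = 1 - side[v]
                flips += 1
                improved = True
    return side, flips


if __name__ == "__main__":
    # Tiny example: a 4-cycle with unit weights. The local optimum found here
    # alternates the two sides and cuts all four edges.
    w = {frozenset(e): 1.0 for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}
    print(flip_local_search(4, w))
```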

The following license files are associated with this item:

InCopyright