We describe three results in this thesis. The first is a heuristic improvement for a shortest path problem which we term the single-source many-targets shortest path problem. In this problem, we need to compute a shortest path from a source node to a node that belongs to a designated target set. Dijkstra's algorithm can be used to solve this problem. We are interested in the single-source many-targets shortest path problem because matching algorithms solve it repeatedly in order to compute a maximum weighted matching in a bipartite graph. The heuristic is easy to implement and, as our experiments show, considerably reduces the running time of the matching algorithm. We also provide an average-case analysis showing that the heuristic saves Dijkstra's algorithm a substantial fraction of its queue operations. (Corresponding paper: A heuristic for Dijkstra's algorithm with many targets and its use in weighted matching algorithms. Algorithmica, 36(1):75-88, 2003.)

The second and third results extend smoothed complexity to the area of online algorithms. Smoothed complexity was introduced by Spielman and Teng to explain the behaviour of algorithms that perform well in practice despite having a poor worst-case complexity. The idea is to add some noise to the input instances by perturbing the input values slightly at random and to analyze the performance of the algorithm on these perturbed instances. In this work, we apply this notion to two well-known online algorithms. The first is the multi-level feedback algorithm (MLF), which minimizes the average flow time of a sequence of jobs released over time when the processing times of these jobs are not known in advance. MLF is known to work very well in practice, even though it has a poor competitive ratio. As it turns out, the smoothed competitive ratio of MLF improves exponentially with the amount of random noise that is added; on average, MLF even admits a constant competitive ratio. We also prove that our bound is asymptotically tight. (Corresponding paper: Average case and smoothed competitive analysis of the multi-level feedback algorithm. In Proceedings of the Forty-Fourth Annual IEEE Symposium on Foundations of Computer Science (FOCS), pages 462-471, 2003.)

The second algorithm we consider is the work function algorithm (WFA) for metrical task systems, a general framework for modelling online problems. WFA is known to have a poor competitive ratio, but because of its generality we believe it is interesting to analyze its smoothed competitive ratio. Our analysis reveals that the smoothed competitive ratio of WFA is much better than its worst-case competitive ratio and that it depends on certain topological parameters of the underlying metric. We present asymptotic upper bounds and matching lower bounds on the smoothed competitive ratio of WFA.
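To make the problem of the first result concrete, the following is a minimal Python sketch of a plain Dijkstra search that stops as soon as any node of the designated target set is settled. It illustrates the single-source many-targets problem itself, not the heuristic studied in the thesis, and the adjacency-dictionary graph format and toy example are assumptions made for illustration only.

    import heapq

    def dijkstra_to_targets(graph, source, targets):
        # Plain Dijkstra search that returns (distance, node) for the closest
        # member of `targets`, or None if no target is reachable.
        # `graph` maps a node to a list of (neighbor, edge_weight) pairs.
        dist = {source: 0}
        heap = [(0, source)]          # priority queue of (distance, node)
        settled = set()
        while heap:
            d, u = heapq.heappop(heap)
            if u in settled:
                continue
            settled.add(u)
            if u in targets:          # the first settled target is the closest one
                return d, u
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return None

    # Hypothetical toy graph: the closest node of the target set {"t1", "t2"} is found.
    graph = {"s": [("a", 2), ("b", 5)], "a": [("t1", 3)], "b": [("t2", 1)]}
    print(dijkstra_to_targets(graph, "s", {"t1", "t2"}))   # -> (5, 't1')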
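The smoothing step behind the second and third results is also simple to illustrate. Below is a hedged sketch of one possible experiment: each processing time is perturbed with additive Gaussian noise, and the algorithm's cost on the perturbed instance is compared to the optimum. The names run_algorithm and run_optimum are hypothetical placeholders, and the thesis uses its own smoothing models rather than this particular noise distribution.

    import random

    def empirical_smoothed_ratio(processing_times, sigma, trials,
                                 run_algorithm, run_optimum):
        # Estimate the expected cost ratio of an online algorithm over instances
        # obtained by randomly perturbing the given processing times.
        ratios = []
        for _ in range(trials):
            perturbed = [max(1e-9, p + random.gauss(0.0, sigma))
                         for p in processing_times]
            ratios.append(run_algorithm(perturbed) / run_optimum(perturbed))
        return sum(ratios) / trials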

Universität des Saarlandes
doi.org/10.22028/D291-23756

Schäfer, G. (2004, April 30). Worst case instances are fragile: average case and smoothed competitive analysis of algorithms. Retrieved from http://dx.doi.org/10.22028/D291-23756