The Hypertext Bibliography Project at MIT also includes listings of my FOCS papers.
The FOCS and Positano versions of this paper merged my results on a dynamic program used for RNA secondary structure prediction, with Raffaele's on sequence comparison. The Bethesda talk and the TR version both used the longer title "Speeding up dynamic programming with application to the computation of RNA structure", and included only the RNA result, which used a mild convexity assumption on certain costs to save two orders of magnitude in total time. This work incited a boom in computational biology within the theory community that is still going strong. But the RNA results were quickly improved by a log factor [Aggarwal et al. at the same FOCS] and never made it into a journal paper.
In this paper, we construct triangulations of point sets and polygons by using quadtrees to add extra vertices to the input. As a result we can guarantee that all triangles have angles bounded away from zero, using a number of triangles within a constant factor of optimal; this was the first paper to provide simultaneous bounds of this form on mesh element quality and mesh complexity, and therefore the first to provide finite element mesh generation algorithms that guarantee both the robustness of the algorithm against unexpected input geometries and the quality of its output.
In the same paper we also use quadtrees to triangulate planar point sets so that all angles are non-obtuse, using linearly many triangles, and to triangulate higher-dimensional point sets with no small solid angles and a number of simplices within a constant factor of optimal. Also, we can augment any higher-dimensional point set so that its Delaunay triangulation has linear complexity.
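As a rough illustration of the quadtree ingredient alone (not the balancing, warping, and triangulation steps that give the paper's angle and size guarantees), the sketch below builds a quadtree over a planar point set, splitting each square until it contains at most one input point; the class names and point data are invented for the example.

    # Toy sketch: quadtree over a 2D point set, split until each leaf square
    # holds at most one point.  Only the quadtree construction is shown; the
    # refinement and triangulation steps of the paper's mesh-generation
    # algorithm are not implemented here.

    class QuadtreeNode:
        def __init__(self, x, y, size, points, depth=0, max_depth=20):
            self.x, self.y, self.size = x, y, size   # lower-left corner, side length
            self.points = points
            self.children = []
            if len(points) > 1 and depth < max_depth:
                half = size / 2
                for dx in (0, half):
                    for dy in (0, half):
                        sub = [p for p in points
                               if x + dx <= p[0] < x + dx + half
                               and y + dy <= p[1] < y + dy + half]
                        self.children.append(QuadtreeNode(
                            x + dx, y + dy, half, sub, depth + 1, max_depth))

        def leaves(self):
            if not self.children:
                yield self
            else:
                for child in self.children:
                    yield from child.leaves()

    def build_quadtree(points):
        # bounding square, slightly enlarged so the half-open containment
        # tests above keep every input point strictly inside
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        size = 1.01 * max(max(xs) - min(xs), max(ys) - min(ys)) + 1e-9
        return QuadtreeNode(min(xs), min(ys), size, list(points))

    if __name__ == "__main__":
        pts = [(0.1, 0.2), (0.15, 0.22), (0.8, 0.9), (0.5, 0.1)]
        root = build_quadtree(pts)
        print(sum(1 for _ in root.leaves()), "leaf squares")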
In follow-up work, I showed that the same technique can also be used to find a triangulation whose edges have total length within a constant factor of optimal. Bern, Mitchell, and Ruppert showed that alternative methods can be used to triangulate any polygon without obtuse angles; see "Faster circle packing with application to nonobtuse triangulation" for an algorithmic improvement to their technique. Additionally, Bern, Chew, Ruppert, and I showed that any point set in higher dimensions can be triangulated with nonobtuse simplices. Bern and I surveyed these and related results in our paper "Mesh generation and optimal triangulation".
(Preliminary copy of journal version)
Uses Dobkin-Kirkpatrick hierarchies to perform linear programming queries in the intersection of several convex polyhedra. Maintaining the collection of halfspaces as several subsets, each represented by its own polyhedron, leads to algorithms for a dynamic linear program in which updates change the set of constraints. The fully dynamic results have largely been subsumed by Agarwal and Matoušek, but this paper also includes polylog time results for semi-online problems, and uses them to give a fast randomized algorithm for the planar 2-center problem (later improved by various authors, most recently in "Faster Construction of Planar Two-Centers", which re-uses the data structures described here).
This conference paper merged my results from "Dynamic Euclidean minimum spanning trees" with results of my co-authors on nearest neighbors and halfspace range searching.
Uses a divide-and-conquer strategy on the edge set of a graph, together with the idea of replacing subgraphs by sparser certificates, to make various dynamic algorithms as fast on dense graphs as they are on sparse graphs. Applications include random generation of spanning trees as well as finding the k minimum weight spanning trees for a given parameter k.
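As a small illustration of the certificate idea only (nothing here resembles the paper's dynamic data structures): for minimum spanning trees, the minimum spanning forest of any subset of the edges is a sparse certificate, so the MST of the whole graph can be recovered from the union of the MSFs of the two halves of its edge set. The sketch below, with a plain Kruskal subroutine and a made-up example graph, demonstrates that property.

    # Toy illustration of the sparse-certificate idea behind sparsification
    # (not the paper's dynamic algorithm): the minimum spanning forest of any
    # subset of the edges is a certificate, so MST(G) can be recomputed from
    # the union of the MSFs of the two halves of the edge set.

    def msf(n, edges):
        """Kruskal's algorithm: minimum spanning forest of a list of (u, v, w) edges."""
        parent = list(range(n))
        def find(a):
            while parent[a] != a:
                parent[a] = parent[parent[a]]
                a = parent[a]
            return a
        forest = []
        for u, v, w in sorted(edges, key=lambda e: e[2]):
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                forest.append((u, v, w))
        return forest

    if __name__ == "__main__":
        n = 5
        edges = [(0, 1, 3), (1, 2, 1), (2, 0, 7), (2, 3, 2),   # first half has a cycle 0-1-2
                 (3, 4, 5), (4, 0, 6), (0, 3, 9), (1, 4, 4)]   # second half has a cycle 0-3-4
        half1, half2 = edges[:4], edges[4:]
        # Certificates: MSFs of the two halves (each has at most n - 1 edges).
        certificate = msf(n, half1) + msf(n, half2)
        # With distinct weights, the MST of the certificate equals the MST of G.
        assert sorted(msf(n, certificate)) == sorted(msf(n, edges))
        print("certificate size:", len(certificate), "of", len(edges), "edges")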
This paper presents an algorithm that finds multiple short paths connecting two terminals in a graph (allowing repeated vertices and edges in the paths) in constant time per path after a preprocessing stage dominated by a single-source shortest path computation. The paths it finds are the k shortest in the graph, where k is a parameter given as input to the algorithm.
The k shortest paths problem has many important applications, including finding alternative routes in geographic path planning, network routing, hypothesis generation in computational linguistics, and sequence alignment and metabolic pathway finding in bioinformatics. Although there have been many papers on the k shortest paths problem before and after this one, it has become frequently cited in those application areas. Additionally, it marks a boundary in the theoretical study of the problem: prior theoretical work largely concerned how quickly the problem could be solved, a line of research that was closed off by the optimal time bounds of this paper. Subsequent work has focused instead on devising efficient algorithms for more complex alternative formulations of the problem that avoid the repeated vertices and other shortcomings of the alternative paths produced by this formulation.
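For concreteness, here is a minimal sketch of the problem being solved (source-target walks, repeated vertices allowed, enumerated in order of length), using a simple best-first search over partial paths rather than the paper's constant-time-per-path construction; the graph format and names are invented for the example, and the sketch assumes nonnegative edge lengths and a reachable target.

    import heapq

    def k_shortest_walks(graph, source, target, k):
        """Enumerate the k shortest source-target walks (vertices may repeat).

        `graph` maps each vertex to a list of (neighbor, edge_length) pairs.
        This brute-force best-first search is only for illustration; the
        paper's algorithm produces the same sequence of paths in constant
        time per path after one single-source shortest path computation.
        """
        found = []
        queue = [(0, [source])]            # (walk length, walk as vertex list)
        while queue and len(found) < k:
            length, walk = heapq.heappop(queue)
            if walk[-1] == target:
                found.append((length, walk))
            for neighbor, weight in graph.get(walk[-1], []):
                heapq.heappush(queue, (length + weight, walk + [neighbor]))
        return found

    if __name__ == "__main__":
        graph = {"a": [("b", 1), ("c", 4)], "b": [("a", 1), ("c", 1)], "c": []}
        for length, walk in k_shortest_walks(graph, "a", "c", 3):
            print(length, "->".join(walk))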
The journal version also includes material from a separate 1995 technical report, "Finding common ancestors and disjoint paths in DAGs".
(Full paper – Graehl implementation – Jiménez-Marzal implementations – Shibuya implementation – Martins implementation – Cliff OpenStreetMap demo)
Speeds up 3-coloring by solving a harder problem: constraint satisfaction in which each variable can take on one of three values and each constraint forbids a pair of variable assignments. The detailed solution involves several long hairy case analyses. Similar methods apply also to 3-list-coloring, 3-edge-coloring, and 3-SAT. The 3-SAT algorithm is fixed-parameter tractable in that it is polynomial time when the number of 3-variable clauses is O(log n). Merged into "3-coloring in time O(1.3289^n)" for the journal version.
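A sketch of the encoding only (none of the paper's case analysis or running-time improvements): 3-coloring is the special case of this constraint satisfaction problem in which each vertex becomes a three-valued variable and each edge contributes three constraints, one forbidding each pair of equal values at its endpoints. The brute-force solver and example graph below are just to make the encoding concrete.

    from itertools import product

    def coloring_to_csp(edges):
        """Encode 3-coloring as constraints, each forbidding one pair of
        variable assignments (the form of instance solved in the paper)."""
        constraints = []
        for u, v in edges:
            for color in range(3):
                constraints.append(((u, color), (v, color)))  # forbid u = v = color
        return constraints

    def brute_force_csp(n, constraints):
        """Exhaustive 3^n search, standing in for the paper's faster algorithm."""
        for assignment in product(range(3), repeat=n):
            if all(not (assignment[u] == cu and assignment[v] == cv)
                   for (u, cu), (v, cv) in constraints):
                return assignment
        return None

    if __name__ == "__main__":
        edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # a 4-cycle plus one chord
        print(brute_force_csp(4, coloring_to_csp(edges)))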
(DIMACS abstract and slides)
We describe algorithms for maintaining the minimum spanning tree in a graph in which the edge weights are piecewise linear functions of time that may change unpredictably. We solve the problem in time O(n^{2/3} polylog n) per combinatorial change to the tree for general graphs, and in time O(n^{1/4} polylog n) per combinatorial change to the tree for planar graphs.
We introduce a class of "inverse parametric optimization" problems, in which one is given both a parametric optimization problem and a desired optimal solution; the task is to determine parameter values that lead to the given solution. We use low-dimensional linear programming and geometric sampling techniques to solve such problems for minimum spanning trees, shortest paths, and other optimal subgraph problems, and discuss applications in multicast routing, vehicle path planning, resource allocation, and board game programming.
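As a hedged, one-parameter toy version of the idea (the paper treats more general low-dimensional parameter spaces with linear programming and sampling): if each edge weight is a linear function a_e + lambda * b_e of a single parameter lambda, then a given spanning tree is a minimum spanning tree exactly when every non-tree edge is at least as heavy as every tree edge on the tree path between its endpoints; each such inequality is linear in lambda, so the feasible parameter values form an interval that can be intersected directly, as in the sketch below (all names and the example are invented).

    def feasible_lambda(n, edges, tree_edges):
        """Interval of parameter values lambda for which tree_edges is a
        minimum spanning tree, when edge e has weight a_e + lambda * b_e.

        `edges` maps an edge name to (u, v, a, b); `tree_edges` is a set of
        names forming a spanning tree.  Returns (lo, hi), or None if empty.
        """
        # adjacency list of the tree, used to find tree paths
        adj = {v: [] for v in range(n)}
        for name in tree_edges:
            u, v, _, _ = edges[name]
            adj[u].append((v, name))
            adj[v].append((u, name))

        def tree_path(s, t):
            # DFS from s, then walk parent pointers back from t
            stack, prev = [s], {s: None}
            while stack:
                x = stack.pop()
                for y, name in adj[x]:
                    if y not in prev:
                        prev[y] = (x, name)
                        stack.append(y)
            path = []
            while t != s:
                t, name = prev[t]
                path.append(name)
            return path

        lo, hi = float("-inf"), float("inf")
        for f, (u, v, af, bf) in edges.items():
            if f in tree_edges:
                continue
            for e in tree_path(u, v):
                _, _, ae, be = edges[e]
                # need ae + lam * be <= af + lam * bf, i.e. (be - bf) * lam <= af - ae
                c, d = be - bf, af - ae
                if c > 0:
                    hi = min(hi, d / c)
                elif c < 0:
                    lo = max(lo, d / c)
                elif d < 0:
                    return None          # violated for every lambda
        return (lo, hi) if lo <= hi else None

    if __name__ == "__main__":
        # triangle; ask for the lambdas making {e1, e2} the minimum spanning tree
        edges = {"e1": (0, 1, 1.0, 0.0), "e2": (1, 2, 2.0, 1.0), "e3": (0, 2, 3.0, 0.0)}
        print(feasible_lambda(3, edges, {"e1", "e2"}))   # -> (-inf, 1.0)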