The total complexity of the cells in an arrangement of n lines that are cut by another line is at most 15n/2. The complexity of the cells cut by a convex k-gon is O(n α(n,k)). The first bound is tight; for the second, it remains open whether the bound is tight or whether the true complexity is only linear.
Discusses the expected behavior of Delaunay triangulations for points chosen uniformly at random (without edge effects). The main result is that within a region containing n points, the expected maximum degree is O(log n / log log n).
Any simple polygon can be triangulated with quadratically many nonobtuse triangles. Mostly subsumed by the recent results of Bern et al. described in "Faster circle packing".
One standard way of constructing Delaunay triangulations is by iterated local improvement, in which each step flips the diagonal of some quadrilateral. For many other optimal triangulation problems, flipping is insufficient, but the problems can instead be solved by a more general local improvement step in which a new edge is added to the triangulation, cutting through several triangles, and the region it cuts through is retriangulated on both sides.
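As a concrete illustration (not taken from the paper, whose contribution is the more general edge-insertion step), the basic Delaunay flip reduces to the standard incircle determinant test on the two triangles sharing a diagonal. A minimal Python sketch, with hypothetical function names:

```python
import numpy as np

def incircle(a, b, c, d):
    """Return a positive value if point d lies inside the circle through
    a, b, c (assumed in counterclockwise order): the standard 3x3
    lifted determinant test."""
    m = np.array([
        [a[0] - d[0], a[1] - d[1], (a[0] - d[0])**2 + (a[1] - d[1])**2],
        [b[0] - d[0], b[1] - d[1], (b[0] - d[0])**2 + (b[1] - d[1])**2],
        [c[0] - d[0], c[1] - d[1], (c[0] - d[0])**2 + (c[1] - d[1])**2],
    ])
    return np.linalg.det(m)

def should_flip(a, b, c, d):
    """For a convex quadrilateral a-b-c-d currently triangulated by the
    diagonal a-c: flip to diagonal b-d when d violates the empty-circle
    condition for triangle (a, b, c)."""
    return incircle(a, b, c, d) > 0
```

Iterating this test over all interior diagonals until no flip applies yields the Delaunay triangulation; the point of the paper is that for other quality criteria this local step gets stuck, while edge insertion with retriangulation does not.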
An investigation of 3d visibility problems in which the viewing position moves along a straight flight path, with various assumptions on the complexity of the viewed scene.
In this paper, we construct triangulations of point sets and polygons by using quadtrees to add extra vertices to the input. As a result we can guarantee that all triangles have angles bounded away from zero, using a number of triangles within a constant factor of optimal. This was the first paper to provide simultaneous bounds of this form on mesh element quality and mesh complexity, and therefore the first to provide finite element mesh generation algorithms that guarantee both robustness against unexpected input geometries and the quality of the output mesh.
In the same paper we also use quadtrees to triangulate planar point sets so that all angles are non-obtuse, using linearly many triangles, and to triangulate higher-dimensional point sets with no small solid angles and a number of simplices within a constant factor of optimal. We also show how to augment any higher-dimensional point set so that its Delaunay triangulation has linear complexity.
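A toy sketch of the quadtree refinement underlying this approach: split squares until each leaf contains at most one input point. The actual algorithms add balancing and warping steps not shown here, and the function name is illustrative only. Assumes distinct points inside a half-open root square.

```python
def build_quadtree(points, x, y, size, max_pts=1):
    """Recursively split the square with lower-left corner (x, y) and
    side length size until each leaf holds at most max_pts points.
    Returns the leaf squares as (x, y, size) triples."""
    if len(points) <= max_pts:
        return [(x, y, size)]
    half = size / 2
    leaves = []
    for dx in (0, half):
        for dy in (0, half):
            # half-open subsquare [x+dx, x+dx+half) x [y+dy, y+dy+half)
            sub = [p for p in points
                   if x + dx <= p[0] < x + dx + half
                   and y + dy <= p[1] < y + dy + half]
            leaves.extend(build_quadtree(sub, x + dx, y + dy, half, max_pts))
    return leaves
```

Triangulating the resulting leaf squares (after 2-to-1 balancing) is what yields the angle bounds cited above.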
In follow-up work, I showed that the same technique can also be used to find a triangulation whose total edge length is within a constant factor of optimal. Bern, Mitchell, and Ruppert showed that alternative methods can be used to triangulate any polygon without obtuse angles; see "Faster circle packing with application to nonobtuse triangulation" for an algorithmic improvement to their technique. Additionally, with Bern, Chew, and Ruppert, we showed that any higher-dimensional point set can be triangulated with nonobtuse simplices. Bern and I surveyed these and related results in our paper "Mesh generation and optimal triangulation".
Given a d-dimensional set of n points, the number of combinatorially different minimum spanning trees that can be formed by adding one more point is within a polylogarithmic factor of n^d.
Follows up "Polynomial size non-obtuse triangulation of polygons"; improves the number of triangles by relaxing the requirements on their angles. Again mostly subsumed by the results of Bern et al. described in "Faster circle packing".
Considers both heuristics and theoretical algorithms for finding good triangulations and tetrahedralizations for surface interpolation and unstructured finite element meshes. Note that the online copy here omits the figures; also online is this paper's bibliography.
For many geometric graph problems for points in the unit square, such as minimum spanning trees, matchings, and traveling salesman tours, the sum of edge lengths is O(sqrt n) and the sum of dth powers of edge lengths is O(log n). We provide a "gap theorem" showing that, if these bounds do not hold for a class of graphs, both sums will instead be Omega(n). For traveling salesman tours the O(log n) bound is tight, but for some other graphs the sum of dth powers of edge lengths is O(1).
A parallelization of the quadtree constructions in "Provably good mesh generation", in an integer model of computation, based on a technique of sorting the input points using values formed by shuffling the binary representations of the coordinates. A side-effect is an efficient construction for the "fair split tree" hierarchical clustering method used by Callahan and Kosaraju for various nearest neighbor problems.
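The shuffle ordering can be made concrete: interleaving the bits of the coordinates, most significant first, produces a key (a Z-order or Morton key) whose sorted order is the order described here. A small sketch for nonnegative integer coordinates; names are illustrative:

```python
def shuffle_key(coords, bits=32):
    """Interleave the binary representations of the coordinates,
    taking one bit from each coordinate in turn, high bits first."""
    key = 0
    for b in range(bits - 1, -1, -1):   # from the high bit down
        for c in coords:
            key = (key << 1) | ((c >> b) & 1)
    return key

points = [(3, 5), (4, 4), (0, 7), (6, 1)]
points.sort(key=lambda p: shuffle_key(p, bits=3))
```

Sorting by this key visits the points in the order of a space-filling curve traversal of the quadtree leaves, which is what makes the parallel construction possible.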
Any d-dimensional point set can be triangulated with O(n^ceil(d/2)) simplices, none of which has an obtuse dihedral angle. No bound depending only on n is possible if we require the maximum dihedral angle to measure at most 90-epsilon degrees or the minimum dihedral to measure at least epsilon. Includes a classification of simplices in terms of their bad angles.
Given a set of points with weights that are not known precisely, but are known to fall within some range, considers the possible weighted centroids arising from different choices of weights in each range. The combinatorics of this problem are closely connected with those of zonotopes.
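To make the setup concrete: the weighted centroid w -> (sum of w_i p_i) / (sum of w_i) is a fractional-linear function of the weight vector, so (for positive weights) every extreme achievable centroid arises at a corner of the box of weight vectors. A brute-force sketch, exponential in the number of points and for illustration only; names are hypothetical:

```python
from itertools import product
import numpy as np

def centroid_region_corners(points, weight_ranges):
    """Centroids at all 2^n corners of the box of weight vectors.
    Every extreme point of the region of achievable centroids is among
    these, since the centroid map is fractional-linear in the weights."""
    pts = np.asarray(points, dtype=float)
    corners = []
    for w in product(*weight_ranges):   # each w_i is lo_i or hi_i
        w = np.asarray(w, dtype=float)
        corners.append(w @ pts / w.sum())
    return np.array(corners)

corners = centroid_region_corners(
    [(0, 0), (1, 0), (0, 1)],
    [(1, 2), (1, 3), (2, 2)])   # (lo, hi) weight range per point
```

Taking the convex hull of these corner centroids outlines the full region; the zonotope connection mentioned above governs how many distinct hull vertices can arise.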
Considers problems for which no polynomial-time exact algorithms are known, and concentrates on bounds for worst-case approximation ratios, especially those depending intrinsically on geometry rather than on more general graph theoretic or metric space formulations. Includes sections on the traveling salesman problem, Steiner trees, minimum weight triangulation, clustering, and separation problems.
We study finite element mesh smoothing problems in which we move vertex locations to optimize the shapes of nearby triangles. Many such problems can be solved in linear time using generalized linear programming; we also give efficient algorithms for some non-LP-type mesh smoothing problems. One lemma may be of independent interest: the locus of points in R^d from which a (d-1)-dimensional convex set subtends a given solid angle is convex.
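As a toy numerical stand-in (using a general-purpose optimizer rather than the paper's generalized linear programming framework), one smoothing step moves a single vertex to maximize the minimum angle of its incident triangles; the names below are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def min_incident_angle(v, tris):
    """Smallest angle (radians) over triangles formed by v with each
    opposite edge (a, b) in tris; all three angles of each triangle."""
    angles = []
    for a, b in tris:
        for p, q, r in ((v, a, b), (a, b, v), (b, v, a)):
            u1, u2 = q - p, r - p
            c = u1 @ u2 / (np.linalg.norm(u1) * np.linalg.norm(u2))
            angles.append(np.arccos(np.clip(c, -1.0, 1.0)))
    return min(angles)

def smooth_vertex(v0, opposite_edges):
    """Heuristically relocate v0 to maximize its minimum incident angle.
    opposite_edges: list of (a, b) pairs, each forming a triangle with v0.
    Nelder-Mead is used because the max-min objective is nonsmooth."""
    tris = [(np.asarray(a, float), np.asarray(b, float))
            for a, b in opposite_edges]
    res = minimize(lambda v: -min_incident_angle(v, tris),
                   np.asarray(v0, float), method='Nelder-Mead')
    return res.x
```

The point of the paper is that this inner optimization, for many quality measures, is an LP-type problem solvable exactly in linear time rather than by such a heuristic search.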
We consider the problem of "connect the dots": if we have an unknown smooth curve from which sample points have been selected, we would like to find a curve through the sample points that approximates the unknown curve. We show that if the local sample density is sufficiently high, a simple algorithm suffices: form the Delaunay triangulation of the sample points together with their Voronoi vertices, and keep only those Delaunay edges connecting original sample points. There have been many follow-up papers suggesting alternative methods, generalizing the problem to the reconstruction of curves with sharp corners or to curves and surfaces in higher dimensions, etc.
We use circle-packing methods to generate quadrilateral meshes for polygonal domains, with guaranteed bounds both on the quality and the number of elements. We show that these methods can generate meshes of several types.
We apply techniques from "Quadrilateral meshing by circle packing" to a magic trick of Houdini: fold a piece of paper so that with one straight cut, you can form your favorite polygon.
We consider several variations of the problem of coloring the squares of a quadtree so that no two adjacent squares are colored alike. We give simple linear time algorithms for 3-coloring balanced quadtrees with edge adjacency, 4-coloring unbalanced quadtrees with edge adjacency, and 6-coloring balanced or unbalanced quadtrees with corner adjacency. The number of colors used by the first two algorithms is optimal; for the third algorithm, 5 colors may sometimes be needed.
We show that, for any set of n points in d dimensions, there exists a hyperplane with regression depth at least ceiling(n/(d+1)), as had been conjectured by Rousseeuw and Hubert. Dually, for any arrangement of n hyperplanes in d dimensions there exists a point that cannot escape to infinity without crossing at least ceiling(n/(d+1)) hyperplanes. We also apply our approach to related questions on the existence of partitions of the data into subsets such that a common plane has nonzero regression depth in each subset, and to the computational complexity of regression depth problems.
We prove the existence of polyhedra in which all faces are convex, but which cannot be cut along edges and folded flat.
Note variations among the different versions: the CCCG version was by Bern, Demaine, Eppstein, and Kuo only, and the WCG version had the title "Ununfoldable polyhedra with triangular faces". The journal version uses the title "Ununfoldable polyhedra with convex faces" and combines the results from both conference versions.
This is the report from the ACM Workshop on Computational Topology, run by Marshall and me in Miami Beach, June 1999. It details goals, current research, and recommendations in this emerging area of collaboration between computer science and mathematics.
We generalize regression depth to k-flats. The k=0 case gives the classical notion of center points. We prove that for any set of n points in R^d there always exists a k-flat with depth at least a constant fraction of n. As a consequence, we derive a linear-time (1+epsilon)-approximation algorithm for the deepest flat. The full version of this paper also includes results from "Computing the Depth of a Flat".
We compute the regression depth of a k-flat in a set of n d-dimensional points, in time O(n^(d-2)), an order of magnitude faster than the best known algorithms for computing the depth of a point or of a hyperplane. The results from this conference paper have been merged into the full version of "Multivariate Regression Depth".
We give linear-time quasiconvex programming algorithms for finding a Möbius transformation of a set of spheres in a unit ball or on the surface of a unit sphere that maximizes the minimum size of a transformed sphere. We can also use similar methods to maximize the minimum distance among a set of pairs of input points. We apply these results to vertex separation and symmetry display in spherical graph drawing, viewpoint selection in hyperbolic browsing, and element size control in conformal structured mesh generation.
We use the ellipsoid method to develop (theoretically) efficient algorithms for optimizing linear functions on intersections of zonotopes, and show how to apply this to train soft-margin support vector classifiers.
We examine flips in which a set of mesh cells connected in a similar pattern to a subset of faces of a cube or hypercube is replaced by cells in the pattern of the complementary subset. We show that certain flip types preserve geometric realizability of a mesh, and use this to study the question of whether every topologically meshable domain is geometrically meshable. We also study flip graph connectivity, and prove that the flip graph of quadrilateral meshes has exactly two connected components.
Note that the Meshing Roundtable version was by Bern and Eppstein. Erickson was added as a co-author during the revisions for the journal version.
Natural neighbor interpolation is a well-known technique for fitting a surface to scattered data, with some nice properties including smoothness everywhere except at the data points and exact fitting of linear functions. The interpolated surface is formed from a weighted combination of data values at the "natural neighbors" (neighbors in the Delaunay triangulation), with weights related to Voronoi cell areas. We describe a variation of natural neighbor interpolation, using different weights based on Delaunay circle angles, that remains invariant when the data is transformed by Möbius transformations, and reconstructs harmonic functions in the limit of dense data on a circle.
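For intuition, the classical area-based weights can be approximated on a pixel grid: the weight of each neighbor is the share of the query point's would-be Voronoi cell that it steals from that neighbor. This sketches the standard area-based scheme, not the angle-based variant introduced here; names are illustrative:

```python
import numpy as np

def natural_neighbor_estimate(q, sites, values, lo=0.0, hi=1.0, res=400):
    """Approximate area-based natural neighbor interpolation at q by
    rasterizing Voronoi cells on a res x res grid over [lo, hi]^2."""
    xs = np.linspace(lo, hi, res)
    gx, gy = np.meshgrid(xs, xs)
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
    sites = np.asarray(sites, dtype=float)
    # nearest site for each pixel, before inserting q
    d_old = ((grid[:, None, :] - sites[None, :, :]) ** 2).sum(axis=2)
    owner = d_old.argmin(axis=1)
    # pixels that q's Voronoi cell would steal after insertion
    d_q = ((grid - np.asarray(q, dtype=float)) ** 2).sum(axis=1)
    stolen = d_q < d_old.min(axis=1)
    counts = np.bincount(owner[stolen], minlength=len(sites))
    if counts.sum() == 0:
        return None                     # grid too coarse near q
    weights = counts / counts.sum()     # stolen area per neighbor
    return weights @ np.asarray(values, dtype=float)
```

The stolen-area weights vary smoothly as q moves, which is the source of the smoothness property mentioned above; the paper replaces these area-based weights with angle-based ones to gain Möbius invariance.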
We consider the problem of finding a large color space that can be generated by all units in multi-projector tiled display systems. Viewing the problem geometrically as one of finding a large parallelepiped within the intersection of multiple parallelepipeds, and using colorimetric principles to define a volume-based objective function for comparing feasible solutions, we develop an algorithm for finding the optimal gamut in time O(n^3), where n denotes the number of projectors in the system. We also discuss more efficient quasiconvex programming algorithms for alternative objective functions based on maximizing the quality of the color space extrema.