Delaunay Triangulation & Simplification Algorithms
Project Goals
The goal of this project was both to develop a Delaunay triangulation algorithm and to examine quality simplification algorithms.

Phase 1: Basic Simplification and Delaunay Triangulation

Delaunay Triangulation
As one of the main goals of the project, a Delaunay triangulation algorithm was written. The first implementation was based on one published by Paul Bourke. It used a simple process: build a "super triangle", iteratively add all of the vertices to the triangulation, then remove any triangles connected to the super triangle's vertices. See Phase 2 for discussion of the second implementation.
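
For illustration, below is a minimal sketch of this incremental scheme (often called Bowyer-Watson). The function names and the super-triangle sizing are assumptions made for the sketch, not the project's actual code:

    def circumcircle_contains(tri, p, pts):
        # True if point p lies inside the circumcircle of triangle tri.
        (ax, ay), (bx, by), (cx, cy) = (pts[i] for i in tri)
        px, py = p
        d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
        ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
              + (cx * cx + cy * cy) * (ay - by)) / d
        uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
              + (cx * cx + cy * cy) * (bx - ax)) / d
        return (px - ux) ** 2 + (py - uy) ** 2 <= (ax - ux) ** 2 + (ay - uy) ** 2

    def delaunay(points):
        pts = list(points)
        # "Super triangle": three synthetic vertices far outside the data set.
        m = 3 * max(max(abs(x), abs(y)) for x, y in pts) + 1
        pts += [(-m, -m), (m, -m), (0, m)]
        n = len(pts)
        tris = [(n - 3, n - 2, n - 1)]
        for i in range(n - 3):                    # add vertices one at a time
            bad = [t for t in tris if circumcircle_contains(t, pts[i], pts)]
            # Cavity boundary = edges belonging to exactly one "bad" triangle.
            edges = {}
            for t in bad:
                for e in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0])):
                    key = tuple(sorted(e))
                    edges[key] = edges.get(key, 0) + 1
            tris = [t for t in tris if t not in bad]
            tris += [(a, b, i) for (a, b), count in edges.items() if count == 1]
        # Finally, drop every triangle still touching a super-triangle vertex.
        return [t for t in tris if all(v < n - 3 for v in t)]

This brute-force form rescans every triangle per insertion; a production engine would accelerate the triangle search, but the overall structure is the same.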

"One-pass feature" simplification
Garland and Heckbert would call the algorithm developed in Phase 1 a "one-pass feature method": in a single pass, a number of local simplification criteria are applied to each vertex to decide whether it should be included.

Initial Simplification Algorithm
A simple algorithm was used to throw out the non-essential vertices: any vertex whose height is within a threshold of the last used vertex (in X and Y) is thrown out. A simple threshold of 3 height units reduced the terrain greatly. All border vertices are kept so that the algorithm can be used in a tiled height map engine without causing T-vertices and cracks.
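
A sketch of how such a threshold test might look is below; the exact scan order and the bookkeeping of the "last used" vertex per row and column are assumptions:

    def simplify_grid(height, w, h, threshold=3):
        # height[y][x] gives the sample height; returns the kept (x, y) vertices.
        kept = []
        last_row = [None] * h            # last kept height along each row (X)
        last_col = [None] * w            # last kept height along each column (Y)
        for y in range(h):
            for x in range(w):
                z = height[y][x]
                border = x in (0, w - 1) or y in (0, h - 1)  # always keep edges
                changed = (last_row[y] is None or last_col[x] is None
                           or abs(z - last_row[y]) >= threshold
                           or abs(z - last_col[x]) >= threshold)
                if border or changed:
                    kept.append((x, y))
                    last_row[y], last_col[x] = z, z
        return kept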

The terrain to the right consisted of 22,500 vertices and was reduced to 4,827 with a threshold of 3. This cut the triangle count from 44k to 9k, roughly one fifth of the original number of triangles. No information other than differences in height values was used for this metric.

It was recognized that an error metric which takes more feature data into account would be necessary, but this was the first metric used to test the Delaunay triangulation engine.

A section of DEM terrain whose triangle count was reduced to 1/5th of the original by the project's simplification algorithm and Delaunay triangulation.
Including Feature Data in the Simplification
To improve the potential for a more usable triangulation and simplification, the simplification engine was enhanced. Any number of simplification objects can be run on each vertex, each outputting a decision to accept the vertex, reject the vertex, or continue evaluating it with the next simplification object. The initial simplification algorithm was adapted to this new architecture. Coloring based upon height values was also added so that we can evaluate how well the algorithm handles feature boundaries.
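
A minimal sketch of this accept/reject/continue chain is below, with illustrative names (this captures the idea, not the project's actual API):

    from enum import Enum

    class Vote(Enum):
        ACCEPT = 1     # vertex must be in the triangulation
        REJECT = 2     # vertex is discarded
        CONTINUE = 3   # no opinion; ask the next simplification object

    class SimplificationObject:
        def evaluate(self, x, y, z, terrain):
            return Vote.CONTINUE    # base class abstains

    def keep_vertex(x, y, z, terrain, objects):
        # Run the chain: the first ACCEPT or REJECT wins. What happens when
        # every object says CONTINUE is an assumption; here we reject.
        for obj in objects:
            vote = obj.evaluate(x, y, z, terrain)
            if vote is not Vote.CONTINUE:
                return vote is Vote.ACCEPT
        return False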

To the right is a larger piece of the terrain run with no simplification. The terrain is rendered as solid Gouraud-shaded triangles, without lighting.

A height-colored section of terrain showing the original feature definition. Notice the shoreline and the small island.
Results from the Basic Simplification
Next, the basic simplification algorithm from the first section was run on the terrain. The image to the right illustrates the results with a threshold of 8, an increase over the threshold used earlier simply to make the differences more obvious. This simulates scenarios where a lower polygon count is required. In this case we have reduced the vertex count from 40k to 3328, but at what cost? The shoreline and island have lost quite a bit of definition, which would be obvious in a simulation due to the differences in the features.
Basic simplification (Threshold of 8). No feature detection. 3328 Verts.
Feature Identification of Coastline
The simplification process needs to be able to recognize important features: shorelines, roads, mountain peaks, etc. The loss of definition in these features when generating lower levels of detail is obvious and distracting. To improve the feature representation, I added a simplification object which detects the transition from the shore to the water and causes these vertices to be automatically accepted into the triangulation.
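
Using the chain from the earlier sketch, such a detector might look like the following; the waterline constant and the 4-neighbor test are assumptions:

    class ShorelineObject(SimplificationObject):
        def __init__(self, waterline):
            self.waterline = waterline           # height of the water surface

        def evaluate(self, x, y, z, terrain):
            wet = z <= self.waterline
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= ny < len(terrain) and 0 <= nx < len(terrain[0]):
                    if (terrain[ny][nx] <= self.waterline) != wet:
                        return Vote.ACCEPT       # vertex lies on the transition
            return Vote.CONTINUE                 # not a shoreline vertex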

The result of adding this simplification object to the process is shown to the right. The vert count has increased from 3328 to 4921, but we are still a far cry from the 40k we started with. Notice that the other transitions between colors still show the effects of the basic simplification algorithm. This is because we have not tried to preserve them.

Basic simplification (Threshold of 8). Feature detection for the coastline. 4921 Verts.
Feature Identification of All Height Color Transitions
To examine the vertex impact of this technique, simplification objects were added for all of the height color transitions, as sketched below. The vertex count increased from 4921 to 6366 verts. Though this is a substantial increase, we have preserved the features we desire while using only 16% of the original vertices!
Basic simplification (Threshold of 8). Feature detection for all height color transitions. 6366 Verts.
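
Because the detector above only depends on a boundary height, generalizing to every color transition is just a matter of registering one instance per band edge. The band heights here are made up for illustration:

    # Hypothetical height-color boundaries; one detector per transition.
    band_edges = [0, 10, 25, 60, 120]
    objects = [ShorelineObject(edge) for edge in band_edges]
    # ...followed by the basic threshold object, exactly as before.
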
Phase 1: Summary
This simple architecture of multiple simplification objects executing on each vertex to determine whether it contributes to the terrain proved to work quite effectively. Simplification objects for maintaining continuity, mountain peaks, roads, building locations, etc., could easily be added. These objects could also draw upon information from other databases (such as where buildings are located) when making their determinations. With these techniques, restrictions on acceptable vertices could be automatically (or manually) loosened for lower and lower levels of detail of the terrain.

Phase 2: Constrained Delaunay Triangulation
Phase 2 consisted of adding support for constrained Delaunay triangulation. The algorithm no longer uses Bourke's super triangle but instead supports constraints for boundaries. This support is important because we need to be able to guarantee that specific edges are kept in the final triangulation, since the triangulation must meet nicely with other geometry in the scene.
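
One common way to realize this guarantee (a sketch of the general idea, not necessarily this project's implementation) is to exempt constrained edges from the empty-circumcircle flip test in a Lawson-style pass, so a required boundary edge can never be flipped away. The helpers `fails_delaunay` and `flip` below are hypothetical:

    def lawson_pass(edges, constrained, fails_delaunay, flip):
        # edges: iterable of (i, j) vertex-index pairs in the triangulation;
        # constrained: set of sorted (i, j) pairs that must survive.
        flips = 0
        for e in list(edges):
            if tuple(sorted(e)) in constrained:
                continue                  # never flip a constraint edge
            if fails_delaunay(e):         # circumcircle test fails -> swap diagonal
                flip(e)
                flips += 1
        return flips                      # caller repeats passes until this is 0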

Phase 3: Iterative TIN, Local Error Metric, & RMS
The simplification algorithm from Phase 1 works on a local basis, testing whether each vertex should be included or not. This doesn't lend itself to producing triangulations with exactly a specified number of vertices, or minimal triangulations for a given error bound. The next phase of this project addressed these issues by using Garland's local error metric (the vertical distance between the height interpolated from the current TIN and the true value in the heightmap) to determine which vertices will eliminate the most error first, so that generation of the TIN can be stopped when the error reaches a given threshold. These topics and this phase of the project were part of the iTIN Project, which is documented separately.
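
As an illustration of the metric, the per-sample error can be computed by interpolating the current TIN triangle's plane at the sample's (x, y) and comparing against the heightmap; greedy insertion then repeatedly inserts the worst-error sample until the maximum error drops below the threshold. The barycentric form below is a generic sketch:

    def vertical_error(x, y, z, tri):
        # tri: three (x, y, z) TIN vertices whose 2-D projection contains (x, y);
        # z: the true heightmap value at (x, y).
        (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri
        det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
        l1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
        l2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
        l3 = 1.0 - l1 - l2
        approx = l1 * z1 + l2 * z2 + l3 * z3   # height the TIN currently predicts
        return abs(z - approx)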

References:

  • Garland, M. and Heckbert, P. S., "Fast Polygonal Approximation of Terrains and Height Fields," Technical Report CMU-CS-95-181, Carnegie Mellon University, 1995.

glenn@raudins.com