History: Adler da Silva's Master Thesis

Consistent Line Simplification in Cartographic Maps

Summary

Line simplification is the Cartographic Generalization operation that reduces the complexity of a line while preserving its main shape features. It is one of the most frequently used operations in this area, because lines correspond to almost 80% of the objects in common maps. Line simplification has been widely studied since the 1960s and many algorithms have been published. A brief review of these algorithms can be found in the PDF document Selection of Basic Algorithms.

Consistent line simplification involves not only the reduction of line complexity, but also the consistency between the original and the simplified maps. A map is said to be consistent with the original one if it preserves the topology and maintains a perceptible distance between its objects. Consistent line simplification is a recent field in cartographic generalization; the first works were published in the mid-1990s. Most of them apply the so-called local simplification, where a single line is simplified at a time and the simplification takes the objects in its vicinity into consideration. The result is a collection of simplified lines that are rejoined in the simplified map.

This work presents a new algorithm for handling consistency throughout the simplification of a map. It takes the relationships among all spatial objects into consideration during the process, in a global simplification. This approach has many advantages and results in excellent performance in comparison with local simplification. This work also introduces an interesting technique for generating scale-independent maps that are consistent at all levels of detail. This technique is very useful for multi-scale GIS and can be used in progressive vector data loading or transmission.

Problem Statement

A planar map is basically formed by three types of objects: points, lines and areas. In the context of simplification, the algorithms are applied over every line and every area contour of the map. They do not need to distinguish between these two types of object; instead, they see lines and area contours simply as a set of lines. The result is a set of simplified lines and contours, which should replace their correspondents in the original map to form the simplified map. The inconsistent simplification process is usually called isolated simplification, since it simplifies one single line at a time and does not take any objects in its vicinity into consideration. Figure 1 shows the data flow of the isolated simplification process.

Figure 1: Isolated simplification data flow.

In the context of consistent simplification, the algorithms should take the consistency among the objects into consideration. In this case, the input of the algorithms is the whole data of the map. As in isolated simplification, however, the areal objects can be treated as a particular case of lines. Thus, besides the set of lines and area contours, the algorithms receive as input a set of points. The points are not processed, however; they are only used to avoid inconsistent simplification. The outcome of a consistent simplification algorithm is the set of simplified lines together with the original set of points, which form the simplified map.
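The inputs and outputs described above can be modeled with a few plain data structures. The following is a minimal C++ sketch; the type and field names are ours, chosen for illustration only:

```cpp
#include <vector>

// A map vertex or an isolated point object.
struct Point {
    double x, y;
};

// A polyline: open for linear features, closed (first vertex equals the
// last) when it is the contour of an areal object.
using Polyline = std::vector<Point>;

// A planar map as seen by a consistent simplification algorithm: the lines
// (including area contours, treated uniformly) are the objects to be
// simplified, while the points are only consulted to avoid inconsistencies
// and are copied unchanged to the output.
struct PlanarMap {
    std::vector<Polyline> lines;  // lines and contours of areas
    std::vector<Point> points;    // point objects (never modified)
};
```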

There are currently two main approaches to consistent simplification: local simplification and global simplification. In local simplification, the lines are simplified one at a time, but the algorithm takes the objects in the vicinity of each line into consideration to avoid inconsistencies. Figure 2 shows the data flow of the local simplification process.

Figure 2: Local simplification data flow.

In global simplification, all lines are simplified together, taking the points into consideration. The main difference of this approach from local simplification is that it is possible to consider only the simplified versions of the lines in the consistency treatment. This will be explained in detail in the Algorithm Overview. Figure 3 shows the data flow of the global simplification process.

Figure 3: Global simplification data flow.

Algorithm Overview

Our algorithm consists of a new technique to handle consistency throughout the simplification of a map. It basically preserves topological and proximity relationships among objects. In other words, it avoids intersections, self-intersections, wrong sidedness and exaggerated proximity. By doing this, it ensures that the information in the simplified map will not collapse due to consistency errors. The algorithm is composed of three steps: simplification, initialization and correction, as shown in Figure 4.

Figure 4: Algorithm data flow.

1) The Simplification Step

The simplification step is very similar to the isolated simplification process, as can be seen in Figure 5. An important advantage of our algorithm over the other algorithms presented in the literature is that we may use any isolated algorithm that modifies a polyline in a well-defined order and whose result is a subset of the vertices of the original polyline. Applicable algorithms include the Ramer-Douglas-Peucker and Visvalingam techniques.

Figure 5: Simplification step data flow.

2) The Initialization Step

The algorithm is based on an object called Consistency Recovery Unit (CRU). After simplifying the collection of lines from the original map, the algorithm creates CRUs for monitoring the consistency between each line segment of the simplified lines and its correspondent in the original lines. In the initialization step, the CRUs and their interdependencies are computed. We demonstrated that these units can ensure the maintenance of consistency along a simplification process. Figure 6 shows the initialization step data flow, in which the CRUs are highlighted.

Figure 6: Initialization step data flow.

3) The Correction Step

In the correction step, topological inconsistencies with respect to the original lines are detected, recovered from with the use of the CRUs, and the CRUs are updated. These three operations are performed iteratively until all inconsistencies are removed.

Figure 7: Correction step data flow.

Implementation

The implementation of our proposed algorithm may be downloaded by clicking here. The code, developed in C/C++, can be classified into three functional groups:
  1. Data cleaning: All data redundancies are removed.
  2. Consistent Simplification: The proposed simplification procedure is carried out.
  3. Data visualization: The output data, in the inf vector data format, may be visualized.
Together they build a map simplification pipeline which has, at one end, the maps in ESRI Shapefile (shp) or Arc/Info Export (e00) format and, at the other end, the images of the simplified maps. All intermediate data are represented in a simple data format, called inf, that we have defined.
    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License along
    with this program.  If not, see http://www.gnu.org/licenses/.

Publications