We (I have some partners in crime now) have recently been exploring the application of generalization routines in Arc to one of my excessively detailed published geologic maps. As part of a larger mapping effort (ND2MP: The Nevada Digital Dirt Mapping Project), I am walking the fine line between the rationality of automated generalization and the impracticality of manually generalizing detailed mapping that I have already completed.
A lot of basic concepts of cartography in general, and geologic mapping in particular, come to the fore when you start visualizing blotch maps (i.e., those based on polygons) at different scales. Some interesting complexities involving the transition from the analog map world to the digital one also arise...those issues will eventually be aired on the ND2MP blog. For now, I will show some of the results of automated generalization routines in Arc.
The detailed map in question is NBMG Map 156, a map of Ivanpah Valley, Nevada, that was compiled at ~1:12k but was released in Dead Tree Edition at 1:50k so it would fit on the plotter/tree killer.
After perusing various options, we decided that the 'aggregate' generalization tool was the closest to what we wanted...but not exactly what we wanted. This tool melds like polys/blotches together on the basis of only a couple of criteria: how close together two like polys can be before they meld into one, and how small the resulting polys (or holes) can be. Both of these criteria involve deciding on a minimum mappable unit (MMU) dimension (a post and discussion for another day, fellow mappers).
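For the arcpy-inclined, the call looks roughly like the sketch below. This is not our exact script: the geodatabase path, layer names, and the UNIT_GROUP field are hypothetical stand-ins, and the 40 m / 5 ha values are the ones discussed further down. Because the tool ignores attributes, each group of like units gets pulled out and aggregated separately.

```python
import arcpy

arcpy.env.workspace = r"C:\gis\ivanpah.gdb"  # hypothetical geodatabase

# Pull out one group of like units so only those polys meld with each other
# (field name and value are placeholders for illustration)
washes = arcpy.management.MakeFeatureLayer(
    "surficial_units", "wash_lyr", "UNIT_GROUP = 'active_wash'")

# Meld like polys closer than 40 m; drop resulting polys (and holes)
# smaller than 5 ha (50,000 sq. meters)
arcpy.cartography.AggregatePolygons(
    washes, "wash_agg", "40 Meters",
    minimum_area="50000 SquareMeters",
    minimum_hole_size="50000 SquareMeters")
```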
The map below is an ungeneralized version of a part of the Ivanpah Valley map (in this case the Jean 7.5 Quad) shown at (roughly) 1:150k:
A generalized version, wherein two groups of the most intricately mapped surficial units (the yellow and red ones) are aggregated, is shown below at the same scale:
At face value, the lower map is a bit more legible. In this instance we aggregated like-polys that were less than 40m apart and eliminated polys (in the same group) that were smaller than 5 ha (50,000 sq. meters). We are considering an MMU of 9 ha for a final compilation of Clark County surficial geology to print (yes...I said print) at 1:150k. Note that the centroids of the eliminated polys will be retained as a point data set in case it actually matters that they are gone.
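One way to bank those centroids (again, a sketch only, using the same hypothetical layer names as above rather than our actual workflow) is to flag the original polys that no longer fall within the aggregated output and convert them to points:

```python
import arcpy

arcpy.env.workspace = r"C:\gis\ivanpah.gdb"  # hypothetical geodatabase

# Original detailed polys for the same unit group that was aggregated
orig = arcpy.management.MakeFeatureLayer(
    "surficial_units", "orig_wash_lyr", "UNIT_GROUP = 'active_wash'")

# Select polys that touch the generalized layer, then switch the
# selection to isolate the ones the aggregation dropped outright
arcpy.management.SelectLayerByLocation(orig, "INTERSECT", "wash_agg")
arcpy.management.SelectLayerByAttribute(orig, "SWITCH_SELECTION")

# Keep their centroids as a point data set for posterity
arcpy.management.FeatureToPoint(orig, "dropped_poly_centroids", "CENTROID")
```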
The generalization routine shown above essentially eliminated numerous reaches of narrow, active desert washes. We are interested in retaining these for various reasons, but maybe only as lines. If anyone has a suggestion for how to extract lines from the eliminated wash reaches as part of the generalization process (or has a suggestion for a better generalization routine), please speak up!
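One possible starting point we have been mulling (untested, and assuming the same hypothetical layer names as above plus a license level that allows Erase) would be to erase the aggregated output from the original wash polys to isolate the dropped reaches, then collapse those slivers to centerlines with whatever polygon-to-centerline tool is at hand:

```python
import arcpy

arcpy.env.workspace = r"C:\gis\ivanpah.gdb"  # hypothetical geodatabase

# Same hypothetical wash group as in the earlier sketches
washes = arcpy.management.MakeFeatureLayer(
    "surficial_units", "wash_src", "UNIT_GROUP = 'active_wash'")

# Whatever the aggregation removed: original wash area minus aggregated area
arcpy.analysis.Erase(washes, "wash_agg", "dropped_wash_reaches")

# A polygon-to-centerline (medial axis / skeletonization) step could then
# turn these narrow dropped reaches into a line data set for the washes.
```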
Here are the maps side by side for better comparison: