Tuesday, February 3, 2009

Fun (?) with geoGeneralization, Part 1

We (I have some partners in crime now) have recently been exploring the application of generalization routines in Arc to one of my excessively detailed published geologic maps. As part of a larger mapping effort (ND2MP: The Nevada Digital Dirt Mapping Project) I am walking the fine line between the rationality of automated generalization and the impracticality of manually generalizing detailed mapping that I have already completed.

A lot of basic concepts of cartography in general and geologic mapping in particular come to the fore when you start visualizing blotch maps (i.e. those based on polygons) at different scales. Some interesting complexities involving the analog-to-digital map world also arise...those issues will eventually be aired on the ND2MP blog. For now, I will show some of the results of automated generalization routines in Arc.

The detailed map in question is NBMG Map 156, a map of Ivanpah Valley, Nevada that was compiled at ~1:12k but was released in Dead Tree Edition at 1:50k so it would fit on the plotter/tree killer.

After perusing various options, we decided that the 'aggregate' generalization tool was the closest to what we wanted...but not exactly what we wanted. This tool melds polys/blotches together on the basis of just two criteria: how close together two like polys can be before they meld into one, and how small the resulting polys (or holes) can be. Both of these thresholds involve deciding on a minimum mappable unit (MMU) dimension (a post and discussion for another day, fellow mappers).
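For anyone wanting to script this rather than click through the dialog, the aggregate tool is exposed in Python. This sketch uses the modern arcpy form of the call (this post predates arcpy; in the 9.x era you would go through arcgisscripting, but the tool and its two key parameters are the same). The feature class names here are hypothetical placeholders, not from our actual project:

```python
# Hypothetical sketch: scripting the Aggregate Polygons tool.
# Requires an ArcGIS install; feature class names are made up for illustration.
import arcpy

arcpy.cartography.AggregatePolygons(
    "surficial_units",       # input: the detailed (like-poly) group to generalize
    "surficial_units_agg",   # output feature class
    "40 Meters",             # aggregation distance: meld like polys < 40 m apart
    "50000 SquareMeters",    # minimum area: drop resulting polys smaller than 5 ha
    "50000 SquareMeters",    # minimum hole size: fill holes smaller than 5 ha
)
```

Note that the tool takes real-world distances and areas, not map-sheet dimensions, so the same call produces the same result regardless of the display scale; the MMU decision is where the target scale actually enters.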

The map below is an ungeneralized version of a part of the Ivanpah Valley map (in this case the Jean 7.5 Quad) shown at (roughly) 1:150k:

A generalized version wherein two groups of the most intricately mapped surficial units (the yellow and red ones) are aggregated is shown below at the same scale:

At face value, the lower map is a bit more legible. In this instance we aggregated like-polys that were less than 40m apart and eliminated polys (in the same group) that were smaller than 5 ha (50,000 sq. meters). We are considering an MMU of 9 ha for a final compilation of Clark County surficial geology to print (yes...I said print) at 1:150k. Note that the centroids of the eliminated polys will be retained as a point data set in case it actually matters that they are gone.
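The minimum-area step and the centroid retention are simple enough to sketch outside of Arc. None of the names below come from any Arc tool; this is just a plain-Python illustration (shoelace formula for area, area-weighted centroid) of splitting polys into "kept" and "eliminated, but remembered as a point":

```python
def shoelace_area(ring):
    """Signed area of a simple polygon ring given as [(x, y), ...]."""
    n = len(ring)
    s = 0.0
    for i in range(n):
        x0, y0 = ring[i]
        x1, y1 = ring[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return s / 2.0

def centroid(ring):
    """Area-weighted centroid of a simple polygon ring."""
    a = shoelace_area(ring)
    cx = cy = 0.0
    n = len(ring)
    for i in range(n):
        x0, y0 = ring[i]
        x1, y1 = ring[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    return (cx / (6.0 * a), cy / (6.0 * a))

def apply_mmu(polys, min_area_sq_m=50_000.0):
    """Keep polys at or above the MMU; return centroids of the eliminated ones."""
    kept, dropped_points = [], []
    for ring in polys:
        if abs(shoelace_area(ring)) >= min_area_sq_m:
            kept.append(ring)
        else:
            dropped_points.append(centroid(ring))
    return kept, dropped_points

# A 300 m x 300 m poly (9 ha, kept) and a 100 m x 100 m poly (1 ha, eliminated).
big = [(0, 0), (300, 0), (300, 300), (0, 300)]
small = [(1000, 1000), (1100, 1000), (1100, 1100), (1000, 1100)]
kept, points = apply_mmu([big, small])
# kept holds only the 9 ha poly; points holds (1050.0, 1050.0), the small poly's centroid
```

The centroid point set is cheap insurance: if a later compilation needs to know where a sub-MMU wash or deposit used to be, the point (ideally carrying the original unit label as an attribute) can stand in for the lost polygon.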

The generalization routine shown above essentially eliminated numerous reaches of narrow, active desert washes. We are interested in retaining these for various reasons, but maybe only as lines. If anyone has a suggestion for how to extract the lines from the eliminated wash reaches as part of the generalization process (or has a suggestion for a better generalization routine) please speak up!

Here are the maps side by side for better comparison:

2 comments:

Ryan said...

Don't know how many of these you're going to have to generalize, but if you do end up "manually" generalizing any, I've found that some tools in ArcMap make this pretty easy: Streaming and snapping.

Set up streaming so that as you move your mouse along, it puts down vertices at an interval appropriate to the scale you want to map at. Then set a rather large snapping tolerance so that your cursor will stay snapped to the fine-scale lines you want to generalize. As you move your mouse, the cursor stays on the fine line, and kind of automatically generalizes it by setting vertices a bit further apart. You're also dealing directly with the data, so it's easy to give a little more or less detail where appropriate.

Also play with the trace tool for copying lines or segments of lines directly.

Only other advice for making generalization "funner" would be to think as carefully as you can about the map units BEFORE you start generalizing. Which units will be lumped in the final map? This directly influences which lines you need to transfer/generalize.

Dr. Jerque said...

Ryan...you are always there for some input...thanks for that.

It is probably inevitable that some manual generalization will take place, and I will certainly try your suggestion out. The only exception is that I will use my Wacom digitizing tablet or, better yet, my spanking-new Cintiq interactive pen display (!). These little babies will make your suggested approach even easier. Check out an early post entitled: 'Can you write your name with a mouse? (hint: no)'.

The other constraint on the generalization issue is that the example I provided is a small subset of a map containing more than 6000 polys. Hence the desire for some automation.

Your last comment is well taken, to say the least; we had devised just such a scheme (though it will probably change anyway).