[Xastir] Map Load Speed

Tom Russo russo at bogodyn.org
Sat May 24 00:09:05 EDT 2008


On Fri, May 23, 2008 at 10:25:53PM -0400, we recorded a bogon-computron collision of the <n1mie at mac.com> flavor, containing:
> Is there a way to speed up the loading of maps? Specifically, when I  
> use the line maps and dbfawk conversions it is wicked slow anytime I  
> am zoomed out beyond 64. The further out, the slower it is. It is  
> about intolerable beyond 128. So generally I keep it zoomed in to 64  
> or less. Often that is good enough, but for a large scale event (like  
> the one I am to attend this weekend) I would like to see a broader  
> map without sacrificing speed every time I make an adjustment.

Yes, but you have to do some work.

The problem is that as you zoom out, more and more of the shapes in the
shapefile fall within the screen area, but the more you zoom out, the fewer
of them are ultimately displayed, due to settings in the dbfawk file.  What
is happening is that every one of those shapes is read, matched against the
dbfawk rules, and then discarded without ever being drawn.


To speed up rendering at high zoom levels, you'll have to create multiple
shapefiles containing only the features that are meant to render at close
zooms, and then use the map properties dialog to exclude those shapefiles
from being loaded at high zooms.

For example, the TIGER/Line 2006_SE files with the stock tgr2shp.dbfawk
file render roads with CFCC values A1 to A3 at all zoom levels at or below
8192.  CFCC values A31 through A36 are only rendered at or below zoom 256,
and CFCC values A37 through A38 only at zoom 128 or below.  There are
still other features that are only displayed at zoom levels of 64 or 96.
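Before splitting anything, it can help to see which CFCC codes a given
county file actually contains.  Here's a sketch using GDAL's ogrinfo --- the
filename is hypothetical, and for shapefiles the OGR layer name in the SQL
is just the file's basename:

```shell
# Hypothetical county polyline file; substitute one of your own.
in=tgr35001lkA.shp
layer=${in%.shp}
# Build a query that lists every distinct CFCC code in the file.
cmd="ogrinfo -sql \"SELECT DISTINCT CFCC FROM $layer\" $in"
echo "$cmd"   # dry run: inspect the command, then run it by hand
```

The codes that come back tell you which display_level tiers in the dbfawk
file are actually in play for that county.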

You could use ogr2ogr with a "where" clause to create new shapefiles that
contain only those features that should be rendered at 128 or below, at
256 or below, and above 256, then use the min and max zoom settings to
exclude the out-of-range files completely --- they wouldn't even be scanned.

This is a good bit of work, but if you really want fast map rendering,
you have to make the job Xastir is doing easier for it.  It's doing a lot
of file access and processing for every shapefile, and the farther you're
zoomed out, the more work it's doing.  The issue is that it's doing all that
work only to decide not to display something, and the above technique will
fix it.  But the price is that you'll have to split the monolithic county
shapefiles yourself.

As a quick example, you could separate out all of those roads that are
supposed to be displayed at the high zoom levels (those with CFCC values
A1x and A2x) with:

 ogr2ogr -where "CFCC like 'A1%' or CFCC like 'A2%'" output1.shp input.shp

where input.shp is one of the TIGER 2006_SE polyline files.  You could select
all of the features that are NOT in that group with:

 ogr2ogr -where "CFCC like 'A%' and not (CFCC like 'A1%' or CFCC like 'A2%')" output2.shp input.shp

Then tell Xastir to load output1.shp at all zoom levels, but output2.shp only
at zoom levels less than 8192.  That would eliminate all the local roads from
those wide zooms, where they wouldn't be displayed anyway.

Obviously, you'd have to do a *lot* more work than just those two extractions,
since there are many more things that get decided by the dbfawk file.  But
that's how you'd go about it, using the dbfawk's "display_level" setting
to guide you.
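To apply the two extractions above across a whole directory of county files,
a loop along these lines would do it.  This is only a sketch: the
tgr*lkA.shp filename pattern is an assumption about how your TIGER polyline
files are named, and the commands are echoed rather than executed so you
can review them first (remove the echo to run them, assuming GDAL's
ogr2ogr is installed):

```shell
for in in tgr*lkA.shp; do
  base=${in%.shp}
  # Tier 1: primary roads (CFCC A1x/A2x), to be loaded at all zoom levels
  echo ogr2ogr -where "CFCC like 'A1%' or CFCC like 'A2%'" \
       "${base}_major.shp" "$in"
  # Tier 2: the remaining A-class roads, to be excluded above zoom 8192
  echo ogr2ogr -where "CFCC like 'A%' and not (CFCC like 'A1%' or CFCC like 'A2%')" \
       "${base}_minor.shp" "$in"
done
```

You'd extend the same pattern with one extraction per display_level tier
you find in the dbfawk file.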

-- 
Tom Russo    KM5VY   SAR502   DM64ux          http://www.swcp.com/~russo/
Tijeras, NM  QRPL#1592 K2#398  SOC#236 AHTB#1 http://kevan.org/brain.cgi?DDTNM
 "It's so simple to be wise: just think of something stupid to say and
  then don't say it."  --- Sam Levinson



