Wednesday, August 24, 2011
I'm still not 100% sure what these datasets do, so if you know, feel free to post a comment.
So if you need to work with this format, here are two options:
And if you're into LiDAR, you are probably already aware of PyLAS for the LAS format.
Tuesday, August 23, 2011
These two checks are relatively simple; however, whether they are necessary is up to you and how you apply this test. There are some cases where a boundary point should not be considered for inclusion. Either way, now you have an option. This function could even be modified to make the boundary check optional.
# Improved point in polygon test which includes edge
# and vertex points

def point_in_poly(x, y, poly):

    # check if point is a vertex
    if (x, y) in poly:
        return "IN"

    # check if point is on a boundary
    for i in range(len(poly)):
        if i == 0:
            p1 = poly[0]
            p2 = poly[1]
        else:
            p1 = poly[i - 1]
            p2 = poly[i]
        if p1[1] == p2[1] and p1[1] == y and x > min(p1[0], p2[0]) and x < max(p1[0], p2[0]):
            return "IN"

    n = len(poly)
    inside = False

    p1x, p1y = poly[0]
    for i in range(n + 1):
        p2x, p2y = poly[i % n]
        if y > min(p1y, p2y):
            if y <= max(p1y, p2y):
                if x <= max(p1x, p2x):
                    if p1y != p2y:
                        xints = (y - p1y) * (p2x - p1x) / (p2y - p1y) + p1x
                        if p1x == p2x or x <= xints:
                            inside = not inside
        p1x, p1y = p2x, p2y

    if inside:
        return "IN"
    else:
        return "OUT"

# Test a vertex for inclusion
poligono = [(-33.416032, -70.593016),
            (-33.415370, -70.589604),
            (-33.417340, -70.589046),
            (-33.417949, -70.592351),
            (-33.416032, -70.593016)]
lat = -33.416032
lon = -70.593016
print point_in_poly(lat, lon, poligono)

# test a boundary point for inclusion
poly2 = [(1, 1), (5, 1), (5, 5), (1, 5), (1, 1)]
x = 3
y = 1
print point_in_poly(x, y, poly2)

You can download this script here.
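As mentioned above, the function could be modified to make the vertex and boundary checks optional. Here is a minimal sketch of that idea (the keyword argument name `boundary_counts` is my own invention, and the inner logic has been compacted but is otherwise the same ray-casting test):

```python
# Sketch: same ray-casting test, with the vertex and horizontal-boundary
# checks made optional via a keyword argument.
def point_in_poly(x, y, poly, boundary_counts=True):
    if boundary_counts:
        # check if the point is a vertex
        if (x, y) in poly:
            return "IN"
        # check if the point lies on a horizontal boundary segment
        for i in range(len(poly)):
            p1 = poly[i - 1]
            p2 = poly[i]
            if p1[1] == p2[1] == y and min(p1[0], p2[0]) < x < max(p1[0], p2[0]):
                return "IN"

    # standard ray-casting test
    n = len(poly)
    inside = False
    p1x, p1y = poly[0]
    for i in range(n + 1):
        p2x, p2y = poly[i % n]
        if y > min(p1y, p2y) and y <= max(p1y, p2y) and x <= max(p1x, p2x):
            if p1y != p2y:
                xints = (y - p1y) * (p2x - p1x) / (p2y - p1y) + p1x
                if p1x == p2x or x <= xints:
                    inside = not inside
        p1x, p1y = p2x, p2y
    return "IN" if inside else "OUT"

poly2 = [(1, 1), (5, 1), (5, 5), (1, 5), (1, 1)]
print(point_in_poly(3, 1, poly2, boundary_counts=True))   # -> IN
print(point_in_poly(3, 1, poly2, boundary_counts=False))  # -> OUT
```

With the flag off, the boundary point (3, 1) falls back to the plain ray-casting behavior and tests "OUT".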
Saturday, August 20, 2011
Python's "zipfile" module allows you to write files straight from buffer objects, including Python's StringIO or cStringIO modules. For web applications where you will return the zipped shapefile as part of an HTTP response, you can write the zip file itself to a file-like object without writing it to disk. In this post, the example writes the zip file to disk.
In Python, file-like objects provide a powerful way to re-route complex data structures from the disk to other targets such as a database, in-memory data structures, or serialized objects. In most other programming languages, file-like objects are called "streams" and work in a similar fashion. So this post also demonstrates writing shapefiles to file-like objects, using a zip file as a target.
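To illustrate the web-application case mentioned above, the zip archive itself can live entirely in a file-like object. Here is a minimal sketch using only the standard library (I use io.BytesIO here, which behaves like the StringIO buffers used below; the entry name and payload are just placeholders):

```python
import io
import zipfile

# Build a zip archive entirely in memory -- nothing touches the disk.
buf = io.BytesIO()
z = zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED)
# A placeholder entry; in the shapefile case this would be shp.getvalue()
z.writestr("myshape.txt", b"placeholder payload")
z.close()

# buf now holds the complete archive, ready for an HTTP response body
zip_bytes = buf.getvalue()
print(zipfile.ZipFile(io.BytesIO(zip_bytes)).namelist())  # -> ['myshape.txt']
```

The same ZipFile API works whether you hand it a filename or a buffer, which is exactly the "stream" idea described above.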
Normally when you save a shapefile you call the writer.save method, which writes three files to disk. To use file-like objects instead, you call a separate save method for each file: writer.saveShp, writer.saveShx, and writer.saveDbf.
import zipfile
import StringIO
import shapefile

# Set up buffers for saving
shp = StringIO.StringIO()
shx = StringIO.StringIO()
dbf = StringIO.StringIO()

# Make a point shapefile
w = shapefile.Writer(shapefile.POINT)
w.point(90.3, 30)
w.point(92, 40)
w.point(-122.4, 30)
w.point(-90, 35.1)
w.field('FIRST_FLD')
w.field('SECOND_FLD', 'C', '40')
w.record('First', 'Point')
w.record('Second', 'Point')
w.record('Third', 'Point')
w.record('Fourth', 'Point')

# Save shapefile components to buffers
w.saveShp(shp)
w.saveShx(shx)
w.saveDbf(dbf)

# Save shapefile buffers to zip file
# Note: zlib must be available for
# ZIP_DEFLATED to compress. Otherwise
# just use ZIP_STORED.
z = zipfile.ZipFile("myshape.zip", "w", zipfile.ZIP_DEFLATED)
z.writestr("myshape.shp", shp.getvalue())
z.writestr("myshape.shx", shx.getvalue())
z.writestr("myshape.dbf", dbf.getvalue())
z.close()
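It's also easy to sanity-check the resulting archive before shipping it. A sketch using only the standard library (dummy byte strings stand in for the real shp/shx/dbf buffer contents so this example runs without PyShp, and the file name is mine):

```python
import zipfile

# Dummy payloads standing in for the real shp/shx/dbf buffer contents
parts = {"myshape.shp": b"\x00" * 10,
         "myshape.shx": b"\x00" * 10,
         "myshape.dbf": b"\x00" * 10}

z = zipfile.ZipFile("myshape_check.zip", "w", zipfile.ZIP_STORED)
for name, data in parts.items():
    z.writestr(name, data)
z.close()

# Re-open and verify all three shapefile components made it in
z = zipfile.ZipFile("myshape_check.zip")
names = sorted(z.namelist())
ok = z.testzip() is None  # None means no corrupt entries
z.close()
print(names)  # -> ['myshape.dbf', 'myshape.shp', 'myshape.shx']
print(ok)     # -> True
```

testzip() reads every member and checks its CRC, which is a cheap guard against a truncated buffer sneaking into the archive.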
If you've been using PyShp for a while, make sure you have the latest version. The file-like object save feature was uploaded to the PyShp subversion repository on Aug. 20, 2011 at revision 30.
You can download PyShp here.
You can download the sample script above here.
Tuesday, August 16, 2011
[Image caption: The Tribune's "Englewood" module helps you create very scalable dot-density maps and is named after a well-known Chicago neighborhood.]
In a recent post on the Tribune's "News App Blog", they published a module for creating elaborate dot-density maps. They referenced my post "Dot Density Maps with Python and OGR" and turned that sample into the "Englewood" module, named after the beleaguered Chicago neighborhood which often appears in the news.
The Tribune team pulls in several other tools and walks through all the details of going from raw census data to online dot-density maps. In addition to the basic how-to of producing the data, they cover how they made production really fast and deployed the result to massively scalable Amazon S3 storage. The blog gives a lot of insight into how a newspaper applies geospatial technology in support of the news. Way more info than you get from your typical code-snippet blog. Fascinating stuff.
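For readers who haven't seen the original dot-density post: the core trick is rejection sampling — throw random points into a polygon's bounding box, keep the ones that land inside the polygon, one dot per so-many people. A minimal pure-Python sketch of that idea (no OGR; the polygon, ratio, and function names here are my own):

```python
import random

def point_in_poly(x, y, poly):
    """Standard ray-casting point-in-polygon test."""
    inside = False
    n = len(poly)
    p1x, p1y = poly[0]
    for i in range(n + 1):
        p2x, p2y = poly[i % n]
        if min(p1y, p2y) < y <= max(p1y, p2y) and x <= max(p1x, p2x):
            if p1y != p2y:
                xints = (y - p1y) * (p2x - p1x) / (p2y - p1y) + p1x
                if p1x == p2x or x <= xints:
                    inside = not inside
        p1x, p1y = p2x, p2y
    return inside

def dot_density(poly, population, per_dot=100, seed=0):
    """One random dot inside poly for every `per_dot` people."""
    rng = random.Random(seed)
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    dots = []
    while len(dots) < population // per_dot:
        x = rng.uniform(min(xs), max(xs))
        y = rng.uniform(min(ys), max(ys))
        if point_in_poly(x, y, poly):  # rejection sampling
            dots.append((x, y))
    return dots

tract = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]
dots = dot_density(tract, population=2500, per_dot=100)
print(len(dots))  # -> 25
```

In production you would run this per census tract against real geometries and populations; the Tribune's Englewood module is essentially this loop made fast and scalable.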