Codemash was a great time. I always run into some of my favorite people in the development community. This year was no different, and to add to the fun I was able to share some of my experience using Free and Open Source GIS tools by giving one of the talks.
I am making the slide deck available here for anyone who wants to snag a few of the links. In case you don't want to dig through the PowerPoint looking for them, here they are:
Server Side Tile Generation
Importing Spatial Data
Tools / Tutorials
A couple years ago my wife bought me an electric smoker. Really nothing fancy or complicated; it's essentially a metal drum with an electric heating element and a few racks. Now that I've had it for a while I recommend it to everyone. You can get some great tasting meat from some really cheap cuts, and on top of that it's fun to experiment with flavors.
So far I've smoked 5 cuts: beef brisket, pork shoulder, chicken breasts, a whole turkey (Thanksgiving!) and a whole chicken. The pork shoulder is what I started out with because it's got so much fat interwoven throughout the cut that it's really hard to dry out and ruin. The brisket took a few tries to get right, and I've yet to successfully smoke a chicken (more on that later).
I got the smoker for Christmas and thought it might be a little impractical in cold weather, so I waited until things warmed up a little - into the 50s maybe (if that). It was fine. Since then I've used the smoker in light rain and in the 40s. I think the cold adds some cook time, but it works.
Yesterday we had a record-breaking 101 degrees in Columbus, Ohio, where we live, and I smoked a brisket for 20 hours. It really didn't seem to cook much differently than one I smoked a few weeks ago at around 75-80 degrees.
One thing I recommend spending a little money on is a probe thermometer. Keeping the lid on the smoker is crucial to keeping the heat in. Many probe thermometers have remotes that you can read from indoors to keep you from having to continually open the smoker and check temperature. Another reason for a probe thermometer is that the smoker colors the meat so much that it's easy to think the meat is done when it's still got some time left to cook. With turkey and chicken the smoke can add a pink tinge to the meat color which makes it appear under-done.
The model of smoker I have does not have a thermostat. I've often thought that might be a nice feature - but keeping an eye on the meat temperature seems to be enough to turn out a solid product without over- or undercooking.
The first couple times I tried to smoke a whole chicken I did so without a thermometer. The first time I undercooked the meat and the second time it dried out so much it wasn't really any good to eat.
Many smokers come with a bowl for water that sits somewhere above the heat source. It is really important to keep water in the bowl during cooking. It makes a noticeable difference in the juiciness of the meat.
The first few times I smoked I put the meat on the bare rack and let the drips fall wherever they happened to go. This lets a lot of smoke into the meat, but the cleanup is not really worth it. Instead I get disposable aluminum cooking trays and put the meat inside. The smoke is still able to penetrate through the open top, and I can also add a couple cups of water in the pan for added moisture. For briskets and pork shoulder I take the meat out after the first hour and a half and cover it with aluminum foil. Since 90% of the smoking is already done at this point, it just serves to keep moisture locked in.
Experimenting with wood flavor is a lot of fun. I've tried apple, mesquite, hickory, and peach. My favorite is hickory for pork shoulder and apple for brisket. I didn't care at all for the mesquite. The key with the wood is to not use too much. The smoker will really only "smoke" for about the first hour and a half; the rest of the time is cooking. Too much smoke will make the meat taste like a hot dog. Of course another way to get different flavors into the meat is by concocting rubs. What I do is get a bowl and start dumping stuff in. I like a combination of cayenne, black pepper, ground cumin, salt, garlic powder, onion powder, and paprika. I've read that leaving the rub on for a day in the fridge helps lock the flavor in - I tried it and it really doesn't make much difference.
For Thanksgiving turkeys we smoke with bacon laid out over the top of the bird. This keeps the turkey basted for the majority of the cook time. My suggestion here is to find unsmoked bacon; smoked bacon gives the outside layer of meat an over-smoked flavor that is actually somewhat unpleasant.
Any of the meat I've smoked I enjoy eating straight - no buns or sauces (needing sauce is a sign of failure!). As soon as the meat comes out of the smoker I prepare it for storage in the fridge. With the pork shoulder and brisket I pull it apart and remove the fat, but then at the end pour the juices it was cooking in over the meat. The juice is full of the flavors of the rub you used as well. Keeps it nice and juicy when warmed up in the microwave or over the stove.
For visualizing spatial queries there are several open source tools that can do the job. The sweet spot for these tools seems to be when you need to view an entire "layer". Layer here is synonymous with either a spatial database table or a shapefile (http://en.wikipedia.org/wiki/Shapefile). This worked for the majority of the visualizations I needed, but other times I just wanted to run a spatial query and see the result on a Google map:
This is a screenshot of a simple tool I created to input a SQL query and get a vector overlaid on the Google map. As long as the geometry column being returned in the SELECT is wrapped with ST_AsGeoJSON, the vector will render. GeoJSON is just JSON in a specific format, and it is awesome. As you can see, PostGIS has ST_AsGeoJSON() in there ready to go.
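For example, a query along these lines will render in the tool (the table and column names here are illustrative, not from the actual application):

```sql
-- Wrap the geometry column in ST_AsGeoJSON so the result
-- comes back as GeoJSON the map overlay can draw.
SELECT name, ST_AsGeoJSON(geom) AS geojson
FROM parcels
WHERE ST_Area(geom) > 10000;
```

Any query works as long as one column in the SELECT list is a geometry passed through ST_AsGeoJSON().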
The dropdown in the screenshot shows each of the connection strings defined in the web.config.
The code is written for ASP.Net MVC and PostGIS, and you can get it here.
The past 6 or 8 months have had me deep into geospatial technology. Going from knowing nothing to releasing a production-grade application has taken me through the spectrum of fun and frustrating. I've learned so much that I want to document it here in the hope that it will help someone else.
From the starting line, one of the first decisions to make is a geospatially aware database system. Unless you're only dealing with a few small spatial queries, your DBMS is going to be doing the bulk of the heavy lifting. In the case of the application I am writing, the database wound up being my primary abstraction layer. 80% of the application layer is relegated to parsing HTTP requests and turning them into SQL commands.
In the mainstream there are a few choices:
- PostgreSQL with PostGIS
- Microsoft SQL Server
- Oracle Spatial
Plan on paying to play for SQL Server and Oracle; PostgreSQL with PostGIS is Free and Open Source Software (FOSS).
The shop I'm at develops primarily with SQL Server 2008, so it was a natural first choice, but I quickly became frustrated at the lack of support from some of the open source GIS tools. Many didn't have drivers for SQL Server, and some that did supported it poorly. I'm sure this will change in the future. There are some other deficiencies as well, such as the lack of aggregate spatial functions like STUnion (http://msdn.microsoft.com/en-us/library/bb933914.aspx). This particular functionality was addressed in SQL Server 2012.
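For comparison, PostGIS has long supported ST_Union as an aggregate, so you can dissolve many geometries in plain SQL. A sketch, with illustrative table and column names:

```sql
-- Dissolve all county polygons into a single geometry per state.
-- ST_Union here acts as an aggregate over the GROUP BY.
SELECT state, ST_Union(geom) AS state_geom
FROM counties
GROUP BY state;
```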
PostgreSQL/PostGIS is the de facto standard in FOSS communities for its performance and reliability. I'll go out on a limb and say that any FOSS GIS product will support PostGIS. This was critical for my project, as one goal was to use as little commercial software as possible. In the 8 months since I began developing with PostGIS, I haven't regretted the choice once. GIS is an extremely fun and rewarding niche of computer science, so there are huge numbers of enthusiasts motivated to contribute, and that passion shows in the quality of their work.
Another aspect of PostGIS that helped me tremendously was the book PostGIS in Action (http://www.amazon.com/PostGIS-Action-Regina-Obe/dp/1935182269). This was not only a great book for learning PostGIS, but also the fundamentals and history of mapping, coordinate systems, and cartography.
Since just releasing our first GIS application this week I now have a much better appreciation for cartography and the work that goes into creating beautiful mapping applications. It has been a joy to become involved in the open source GIS community and interact with some extremely talented and passionate people. I owe these folks a ton of props for their contributions to open source projects, and helping me come up to speed.
There's nothing more frustrating than a slow-running query, and that is exactly what I had while working with a PostgreSQL database recently. Actually, it wasn't the query that was slow, but my use of PostgreSQL's functions. I'm documenting a solution here in the hope that someone else will find it when having similar problems.
The query in question was extremely simple, something like this:
SELECT *
FROM some_table t
WHERE name = 'xyz';
When I ran that guy from the command line I was getting response times right around 10 milliseconds on a table with 500k records, and an index on the name column. Pretty much expected.
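If you want to confirm the planner is actually using the index, EXPLAIN ANALYZE shows the chosen plan along with actual timings:

```sql
-- Look for an "Index Scan" node in the output rather than a "Seq Scan".
EXPLAIN ANALYZE
SELECT *
FROM some_table t
WHERE name = 'xyz';
```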
Then I tried wrapping up that query into a function:
CREATE OR REPLACE FUNCTION test_function(name_param character(9))
RETURNS SETOF record AS $$
BEGIN
  RETURN QUERY
  SELECT *
  FROM some_table t
  WHERE name = name_param;
END;
$$ LANGUAGE plpgsql STABLE;
When I ran that function, the response time was around 500 milliseconds. It turned out the problem was that the name column in some_table was defined as character varying(9), while the name_param input parameter was defined as character(9). This type mismatch completely threw off the query planner and caused my index to be ignored.
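The fix is to declare the parameter with a type that matches the column (or cast it explicitly inside the function). Something along these lines, assuming the rest of the function stays the same:

```sql
-- Matching the parameter type to the column type (character varying(9))
-- lets the planner compare like with like and use the index again.
CREATE OR REPLACE FUNCTION test_function(name_param character varying(9))
RETURNS SETOF record AS $$
BEGIN
  RETURN QUERY
  SELECT *
  FROM some_table t
  WHERE name = name_param;
END;
$$ LANGUAGE plpgsql STABLE;
```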