On top of all that I managed to find a few minutes to do some development, which is pretty good by current standards. One thing I wanted to do was simply make a map link from an object record in our Solr index. Now, Solr URLs have their reserved characters as well as normal URL escaping. XSL, too, with which I transform the Solr output, likes escaped characters. Google Maps URLs, of the sort that you make to overlay KML on a map, well, of course they also require characters in the KML URL parameter to be escaped. The end result is a URL for a map with overlay that looks something like this:
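To see why the escaping stacks up, here's a tiny Python illustration (the URLs are invented stand-ins): the Solr URL already contains reserved characters, and embedding it as the map's `q=` parameter means percent-encoding the whole thing again:

```python
from urllib.parse import urlencode

# Hypothetical Solr URL that returns KML via Solr's XSLT response writer
solr_kml_url = "http://example.org/solr/select?q=object_type:coin&wt=xsl&tr=kml.xsl"

# Embedding that as the value of the map's q= parameter escapes it again,
# so '=' becomes %3D, '&' becomes %26, ':' becomes %3A, and so on
map_url = "http://maps.google.com/maps?" + urlencode({"q": solr_kml_url})
print(map_url)
```

Do that by hand inside an XSL transform and it's easy to escape one layer too many or too few.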
Ugly, huh? [BTW, once I put the new multicore index up this URL won't work]
Escape, escape, escape. I've had plenty of fun and games in the past trying to escape stuff in XSL the way I want it without XSL then re-escaping, unescaping or otherwise ballsing up the output, so this time I thought: sod this, I'll just make a page that takes in a nice simple set of parameters and redirects to the map. That makes it a whole lot easier to write the links in XSLT without worrying so much about the escape nightmare. A link like:
[the "+" can be "%20" instead]
I don't know how much time I saved, but I know it only took 5 minutes. It takes in a Solr query, record count and start index, escapes characters as befits GMaps KML URLs, and inserts them into a Solr query URL (including the KML transform bit, of course: wt=xsl&tr=kml.xsl, in our case). This is put into the GMaps URL and we do the Response.Redirect (yes, it's classic ASP). It's brittle: it will break if the GMaps URL format changes, or if the Solr URL or output format changes; but hey, it's simple and it works (for now).
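The real thing is classic ASP, but the logic is small enough to sketch in Python; the base URLs and parameter names here are stand-ins, not the real ones:

```python
from urllib.parse import quote

# Stand-in base URLs -- not the actual site's
SOLR_BASE = "http://example.org/solr/select"
MAPS_BASE = "http://maps.google.com/maps?q="

def map_redirect_url(query, rows, start):
    """Build the GMaps URL that the script redirects to."""
    # Solr query returning KML via the XSLT response writer
    solr_url = (f"{SOLR_BASE}?q={quote(query)}"
                f"&rows={rows}&start={start}"
                "&wt=xsl&tr=kml.xsl")
    # Escape the whole KML URL so it survives as one GMaps parameter
    return MAPS_BASE + quote(solr_url, safe="")

# The ASP version hands this value straight to Response.Redirect
print(map_redirect_url("object_type:coin", 20, 0))
```

All the fiddly escaping happens in one place, so the XSLT only ever has to emit the three plain parameters.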
It was only after making the script for these pragmatic reasons that I realised that having such a page is, of course, good for several other reasons, including:
- it will give us stats on people following the map links
- that same brittleness is a bigger problem if links like this are scattered across lots of scripts and transformations around the site. This way I only need to point all similar links at one script and change that one place
- if I decide to scrap Google Maps and use, say, OpenStreetMap, or if I want to get my KML from somewhere else, again, one script to change
I will probably add a couple of other parameters but don't want to make it heavy. Specifying the data source is one (other than Solr we can get KML out of, for example, our publications database); specifying the target service is another, so that we could use GMaps, OSM, Yahoo! and so on. Shit, anything but Streetmap (how much do you not miss having to use that piece of crap? Best thing about the last few years in mapping is the fact that you never see that anymore).
I've done some further work this morning, along the lines suggested above. It now takes in a data source and a target service parameter (though the latter only works for GMaps at present), which means I can pull in the publications KML and may start getting MOLA sites by site code too. Much more flexible now, and a single point for all map requests is going to be handy. More work to do to use more powerful aspects of pubs search.
It may seem odd to blog about a 5 minute job when I've been doing much more challenging and complex things that take months, but it's very satisfying when it works so quickly, plus my belated realisation of the useful side effects made me think it was worth talking about. Here's the salient part of the code as it now stands, for interest.
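(The original ASP isn't reproduced here; what follows is a rough Python sketch of the shape the script now has, with the data-source and target-service lookups described above. Every name and URL in it is invented for illustration.)

```python
from urllib.parse import quote

# Hypothetical lookup tables -- invented to show the shape, not the specifics
KML_SOURCES = {
    "solr": "http://example.org/solr/select?wt=xsl&tr=kml.xsl&q=",
    "pubs": "http://example.org/pubs/kml.asp?q=",
}
MAP_TARGETS = {
    "gmaps": "http://maps.google.com/maps?q=",
    # adding e.g. an "osm" entry is one line if GMaps is ever swapped out
}

def build_map_url(source, target, query):
    """One script for all map links: pick a KML source and a map target."""
    kml_url = KML_SOURCES[source] + quote(query)
    return MAP_TARGETS[target] + quote(kml_url, safe="")

print(build_map_url("pubs", "gmaps", "roman pottery"))
```

The point of the lookup tables is the single point of change: a new KML source or a new map service is one dictionary entry, and every map link on the site picks it up.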
Here's a link to the new script, looking at publications data:
Follow that and see the GMaps URL I no longer have to write!