MapBrief™

Geography · Economics · Visualization

When A Map Goes Viral

For better or worse (OK, worse), CNN’s John King remains the popular face of choropleth mapping in the US for his election-night wizardry. (Our friends across the pond definitely have the better of it with the BBC’s Emily Maitlis; I could watch this crisp analysis, touching even on the areal unit problem, for hours.) But with mapping tools such as Google Fusion Tables, GeoCommons, and Tableau now well within reach of non-GIS specialists, we are going to see a proliferation of all manner of thematic maps on the web.

A few months back, a map of US passport ownership by state, created by CGP Grey, a blogger with no particular mapping background, went viral, getting picked up by The Huffington Post, Andrew Sullivan, and urban theorist Richard Florida, among many others. Of course my professional eye was immediately drawn to some problems with the cartography, but something larger didn’t seem right: frankly, I didn’t believe the map. To think that in some of our most populous states (CA, NJ, NY), more than 6 out of every 10 people you pass in the street have a valid passport? That didn’t pass the smell test, and when I dug around for answers, some larger themes emerged.

First, the obvious carto-geek stuff. A blue color ramp on a blue background? Not recommended. And the association of lighter hues of blue with higher values should be reversed: increasing saturation should reflect “more” of whatever variable is being mapped. If casual mappers take any tip to heart, it should be to visit Colorbrewer.org for solid color ramps with which to represent your data. The data breaks chosen do work pretty well: round-number increments that don’t confuse the viewer while still showing the variation in the data, especially the states at each extreme of the distribution.
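For casual mappers who want to put that tip into practice, here is a minimal sketch in Python (the breaks and state values are hypothetical; the hex codes are ColorBrewer’s 5-class “Blues” ramp):

```python
# Classify mapped values into round-number breaks and assign a
# ColorBrewer "Blues" ramp where darker (more saturated) = higher value.
BREAKS = [20, 30, 40, 50, 60]  # class upper bounds, in percent (hypothetical)
BLUES = ["#eff3ff", "#bdd7e7", "#6baed6", "#3182bd", "#08519c"]  # light -> dark

def color_for(value):
    """Return the ramp color for a percentage value."""
    for upper, color in zip(BREAKS, BLUES):
        if value <= upper:
            return color
    return BLUES[-1]  # values above the top break get the darkest hue

# Hypothetical state values, just to exercise the classifier
for state, pct in [("MS", 18.0), ("CO", 37.5), ("NJ", 58.0)]:
    print(state, pct, color_for(pct))
```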

As for the data, the blogger should be given major props for explaining his assumptions and linking to a Google spreadsheet of his data. Since state-level data was only available for 2007-2010, he extrapolated backwards another six years (passports are valid for 10 years) to get his estimates. But 2007 brought a major change in US policy: as part of the Western Hemisphere Travel Initiative (Orwell-speak alert), passports were made mandatory for travel to Canada, Mexico, and parts of the Caribbean. Using 2007-2010 as a baseline therefore skews the numbers upwards: indeed, his estimate of the overall number of passport holders (~149M) is about 30% higher than the actual aggregate number (114M) found here. More importantly, the map no longer says what we think it says, since the overestimates would be concentrated in states that border Mexico and Canada, or that have a disproportionate number of residents who travel to the Caribbean (think of the immigrant populations of NJ/NY/FL/IL). This fixation on the data is admittedly pedantic, but the map becomes significantly less compelling once these factors are weighed.
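To make the skew concrete, here is a toy back-of-the-envelope calculation (the annual issuance figures are invented for illustration; only the 149M estimate and the 114M aggregate come from the post and the official data):

```python
# Toy illustration of the extrapolation bias. Passports are valid for
# 10 years, so current holders ~= the last 10 years of issuances.
post_whti = 16.0  # hypothetical annual issuances (millions), 2007-2010
pre_whti = 10.0   # hypothetical annual issuances before the 2007 WHTI mandate

actual = 4 * post_whti + 6 * pre_whti  # 4 high years + 6 lower years
extrapolated = 10 * post_whti          # the inflated rate projected back 10 years

print(f"actual ~{actual:.0f}M, extrapolated ~{extrapolated:.0f}M, "
      f"overestimate {extrapolated / actual - 1:.0%}")
# With these toy numbers: ~124M actual vs. ~160M extrapolated, a 29%
# overestimate, the same ballpark as 149M vs. 114M (~31% too high).
```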

~  ~  ~  ~  ~  ~

But let’s not bury an amateur mapper. Because in the cutthroat competition for attention on the Internet, he absolutely nailed it: he found a dataset others considered compelling and mapped it in a way that let dominant patterns be quickly perceived. Bluntly put, people inferred the story he was trying to tell. In our world of ubiquitous data, the skill of visual storytelling, especially on the ADD savanna of the web, is something we who “do” cartography and GIS on a daily basis would do well to ponder more deeply in our everyday work.

 

—Brian Timoney

 

Open Source on the 21st Century Battlefield

 

“Imagine if only the manufacturer of a rifle were allowed to clean, fix, modify or upgrade that rifle.”

When I was a fire-team leader in the Marine Corps infantry in the early 1990s, a GPS was considered too costly and sophisticated to entrust to a grubby, lowly corporal such as myself. That the collective GPS/wayfinding technology of a battalion twenty years ago is easily bested by the kit bandied about by the typical suburban Boy Scout troop today is another banal example of the pace of technological change. But what is more interesting to ponder is that GPS was a very 20th-century example of a military technology that eventually migrated into the consumer space. What happens when innovation flows in the other direction?

Deputy Defense Secretary William Lynn has a bit where he contrasts the Pentagon’s 81-month IT procurement process with the development of a certain Apple product: after two years, the former may have an approved budget, but “Steve Jobs is talking on his new iPhone. It’s not a fair trade.” It’s this recognition of the importance of rapid iterations of innovation and deployment that makes the recently released document Open Technology Development (OTD): Lessons Learned and Best Practices for Military Software such an interesting read. Part philosophical overview, part detailed hashing-out of gritty issues such as forkability and licensing, it takes as its premise that the US military is in need of a new “…way of developing, deploying and updating software-intensive systems that will match the tempo and ever-changing mission demands of military operations.”


The largest international gathering of open source geospatial professionals coming to Denver this September

 

An ever-quickening tempo and end-user requirements constantly in flux: does that describe your business environment? As a user of open source geospatial software (alongside commercial products) for the last half-dozen years, I’ve been fascinated by a decided shift in the rationale for open source adoption.

Hint: it’s no longer price.

Indeed, in the OTD document’s tick-list of the positives of open technology, cost is mentioned only after Increased Agility/Flexibility, Faster Delivery, Increased Innovation, Reduced Risk, and Information Assurance & Security. These are more than mere white papers, however: the DoD has begun to lay important groundwork for supporting open source development efforts with Forge.Mil (SourceForge-ish collaboration tools) and the more grassroots military/civilian group Mil-OSS. From this outsider’s view, there are substantial attempts to work the problem both from the top-level macro view (procurement, licensing, and support issues) and, just as importantly, from the bottom up, creating a mechanism whereby the women and men in the field can quickly and easily share best practices, specific problems solved, and field-tested hacks.

~  ~  ~  ~  ~  ~

Perhaps the greater glory of the US military is not an ideological priority of yours. Fair enough. But in your favorite sprawling, white-collar bureaucracy, is software part of the problem or part of the solution? More to the point, do internal attitudes towards software (the weary cynicism that nothing is to be done, so let’s just be patient and wait for the Office 2007/Internet Explorer 8 upgrade) highlight a troubling passivity towards problem-solving in general? If the Department of Defense, where the gears grind exceedingly slowly, sees software not as a tedious IT procurement issue but as a strategic advantage to be exploited, then one can only hope that the boardroom (with its evergreen fondness for martial metaphor and analogy) will eventually be equally perceptive.

<beginPitch> If the operational value of open source resonates even faintly within your organization, then we cordially invite you to join us in Denver this September for the FOSS4G International Conference, covering all things geospatial and open. In addition to a broad-ranging program covering both beginner-friendly and deep-dive geeky fare, we’ve designed a standalone one-day program specifically for managers, showing how the different components of geospatial open source mesh together as well as co-exist with commercial assets (e.g. Oracle databases). Just as important, we’ll be discussing the financial value proposition of geospatial FOSS, both in internal contexts and in building profitable businesses on top of open source assets. </endPitch>

The contradiction of the 21st century battlefield is that despite the billions spent on sophisticated weapons systems, success is dependent on small-unit decision-making and tactical improvisation.  This crucial element of improvisation is greatly aided by tools that are open to field mods in response to ever-changing requirements.  Or, as memorably summed up by an Army attendee at the inaugural Gov 2.0 Summit, “only pack it if you can hack it.”

 

—Brian Timoney

Open Government is a Slammed Door at the BLM

So maybe having easy access to natural gas lease polygons in Sublette County, WY isn’t a priority for you. But if phrases such as “open government”, “open data”, “energy independence”, “stewardship of federal lands”, or “government transparency” resonate even a little bit, then what’s going on now at the BLM with the National Integrated Land System (NILS) GeoCommunicator project should concern you. Because it’s an object lesson in which pretty phrases are the first to be slain when a large federal agency and a dominant geospatial vendor bollix up a high-profile public web mapping initiative.

Back in May the BLM started to remove key spatial layers from GeoCommunicator: mining claims, oil and gas leases, rights-of-way, etc. What’s interesting is that the reason given for removal was “concerns of data quality.” Specifically, the spatial boundaries derived from the legal descriptions stored in the agency’s database of record (LR2000) were not accurate. Now, the BLM is also the caretaker of the Geographic Coordinate Data Base (GCDB), which represents the federal government’s version of the Public Land Survey System (PLSS). So if there are data quality issues in how textual legal descriptions are being converted into geospatial boundaries, then the problems here aren’t trivial and speak to fundamental missteps in data management and internal QA/QC.
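To give a flavor of what that conversion involves, here is a deliberately simplified sketch (the description format, helper name, and values are all hypothetical; real legal descriptions add meridians, aliquot parts, lots, and survey irregularities, which is exactly where silent errors creep in without QA/QC):

```python
import re

# Toy parser for a simplified PLSS legal description such as
# "T30N R110W Sec 15" (township, range, section). Parsing is only
# step one: the keyed parts must then be matched against GCDB corner
# coordinates to build the actual lease polygon.
LEGAL = re.compile(
    r"T(?P<twp>\d+)(?P<twp_dir>[NS])\s+"
    r"R(?P<rng>\d+)(?P<rng_dir>[EW])\s+"
    r"Sec\s*(?P<sec>\d+)"
)

def parse_legal(desc):
    """Split a simplified legal description into its keyed parts."""
    m = LEGAL.match(desc.strip())
    if not m:
        raise ValueError(f"unparseable legal description: {desc!r}")
    sec = int(m["sec"])
    if not 1 <= sec <= 36:  # a standard township contains sections 1-36
        raise ValueError(f"section {sec} out of range in {desc!r}")
    return {"township": m["twp"] + m["twp_dir"],
            "range": m["rng"] + m["rng_dir"],
            "section": sec}

print(parse_legal("T30N R110W Sec 15"))
# -> {'township': '30N', 'range': '110W', 'section': 15}
```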


Halcyon days: remixing GeoCommunicator Oil & Gas lease boundaries and web services in Google Earth

 

Now, as someone who uses, modifies, and re-sells GeoCommunicator data to the Oil & Gas industry, I’m far from a disinterested observer. So when I spotted a colleague from the BLM at a recent social gathering, I naturally wanted to chat them up for an update on what was going on. The response:

We’re literally not allowed to talk about it.

No longer are we talking simply about a website being down, but about something that apparently requires BLM management to impose a mafia-like omerta on its own people. I understand institutional arse-covering, except for the inconvenient fact that we’re talking about taxpayer money.

How much money are we talking? Back in 2001, when the GAO was documenting the BLM’s mismanagement of the $411 million ALMRS project, mention was made of NILS’s forecast cost of $16.7 million. Flash forward to 2007, and we see the NILS tab listed at $36.2 million: but this is the proverbial visible tip of the iceberg, as those more proximate to the situation speculate that the current accumulated tab is closer to a nine-figure sum. (By the way, in the same document, Geospatial One-Stop, another spatial web initiative, weighs in at $57.9 million…) It would be unfair to the large number of savvy GIS professionals working at the BLM not to point out the good work being done in primary data collection and analysis, particularly at the state-office level (which has been picking up the slack in data provisioning while GeoCommunicator has been down). But these big, hairy, heavy agency-wide IT initiatives have been, and continue to be, sources of very expensive grief.

Equally unfair would be to omit mention of ESRI: for every mismanaged, cost-overrun step of the way, they are pocketing good taxpayer cash. As the dominant geospatial vendor in the US, their skill in landing the biggest of the federal GIS contracts goes hand-in-hand with a marketing machine equally adept at churning out frequent dispatches of glossy self-regard. But what of these pricey taxpayer-funded projects that go pear-shaped (GeoCommunicator) or, more commonly, simply don’t deliver a compelling ROI to the public (Geospatial One-Stop)? Reading between the lines, with the BLM saying next to nothing but managing to reference “data quality”, one imagines a metaphorical index finger pointed at Redlands. Just supposition, but the larger frustration for the end-user/taxpayer is the lack of mechanisms for accountability in the agency-vendor relationship. Such opacity leads to a certain skepticism when the big vendors/federal contractors bang the drum for a “national GIS”: this 2009 proposal by ESRI and Booz-Allen put forth an ambitious (or grandiose) integration plan for federal, state, and local GIS assets for a tidy $1.2 billion. Top-down, large-scale spatial-data-integration-by-directive simply doesn’t have the track record of success to justify the costs, especially in view of how users actually search for and interact with information on the web (hint: search engines, not portals).

Perhaps GeoCommunicator will magically come back to life next week, with a rich interface and maybe even a REST API. But the off-putting behavior of the BLM is of a piece with a larger suspicion that the recent enthusiasm for open data/open government at the federal level has petered out. At the very least, the misty idealism of the open gov movement (bringing citizens and the government designed to serve them closer together) has failed to take full account of the much more intimate relationship between federal agencies and the quite tangible interests of their largest IT vendors.

 

—Brian Timoney