A thought experiment:
A well-meaning colleague, knowing you’re the “map person”, approaches with a seemingly straightforward request.
“Where can I get the most up-to-date file of the official US state boundaries?”
There are dozens of places to get such a file, but the most up-to-date? Wanting to save professional face, you surreptitiously google “us states shapefile” and see the following top three entries:
- a 163 KB file from ESRI, undetermined date
- a 23 MB file from NOAA, 1:2M scale, “Valid date: 2012”; last modified date: 2007
- two choices from the National Atlas: a 1:2M, 6.8 MB file dated June 2005; a 1:1M, 10.6 MB file dated July 2012
And that, in a nutshell, is a problem that a National Spatial Data Infrastructure (NSDI) should solve, right?
* * * * * * *
The FGDC recently released an NSDI Strategic Plan draft document for public comment, and I was informed I was supposed to have an opinion. So your dutiful blogger reluctantly disturbed his summer reverie, downloaded the PDF, and had a look.
What I found was a combination of geo-is-important boilerplate and marketing copy for a technology called the Geospatial Platform. When I see 10 bullet points for “The Desired Future State of the NSDI”, I wonder if we’ll end up any closer to solving our “most up-to-date US state boundaries” conundrum. Given the documented shortcomings of past FGDC performance, I imagine my skepticism is broadly shared.
Infrastructure? We Have Infrastructure Everywhere
I’ve covered in the past how we’re in an era of unprecedented private sector investment in spatial data infrastructure (see: Google, Apple, Microsoft). So when I read
“The National Spatial Data Infrastructure extends far beyond data”
my reaction is: let’s not extend anywhere until we get the data piece right.
This suspicion of scope creep is deepened when we read that
“the Geospatial Platform initiative is a critical component for the continued development of the NSDI.”
What exactly is this Geospatial Platform?
“The Platform is a Web-based first generation service environment that provides access to a suite of well managed, highly available, and trusted geospatial data, services, applications, and tools for use by Federal agencies and their State, local, Tribal, and regional partners.”
That there is some enterprise-ready commercial vendor marketing copy.
The larger point is the FGDC should only have a single-minded focus on data, and leave the tools and applications to others. I’m happy to pay my Water Utility bill every month, but I don’t need them selling me my bathroom sink as well.
Do a word search of the proposal and you know what terms don’t come up?
The failure of the geospatial community to search-engine optimize its content is why, when you google your address, you’re much more likely to find real estate content than, say, the property boundary from your county GIS department. To Geospatial Professionals™, “authoritative” means accuracy and precision; to normal people, what’s authoritative is whatever sits atop the first page of Google results. An SDI that doesn’t confront that fact is one whose Portal/Platform-centric approach is destined for irrelevance.
The most important step forward in spatial data infrastructure this year has been Github adding visualization support for data in the GeoJSON format. Though originally a platform for managing programming code, Github has evolved into a platform for managing the sharing of data as well.
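To make the format concrete, here is a minimal sketch of the kind of file Github now renders as a map. The feature name and coordinates below are illustrative placeholders, not real boundary data:

```python
import json

# A minimal GeoJSON FeatureCollection: one polygon "state" with a name
# attribute. Name and coordinates are illustrative, not a real boundary.
feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": {"name": "Example State"},
            "geometry": {
                "type": "Polygon",
                # GeoJSON coordinates are [longitude, latitude], and a
                # polygon ring must close on its starting coordinate.
                "coordinates": [[
                    [-104.0, 41.0],
                    [-104.0, 45.0],
                    [-111.0, 45.0],
                    [-111.0, 41.0],
                    [-104.0, 41.0],
                ]],
            },
        }
    ],
}

# Serialize to the text that would live in a .geojson file.
geojson_text = json.dumps(feature_collection, indent=2)
print(geojson_text)
```

Commit output like this to a repository as a file with a `.geojson` extension and Github draws the interactive map for you.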
A Modest Proposal
What does a core set of curated geospatial data files maintained on Github look like?
A lot like Nathaniel Kelso’s Natural Earth project (main site, Github). Kelso states his motivation for the project:
“In a time when the web is awash in geospatial data, cartographers are forced to waste time sifting through confusing tangles of poorly attributed data to make clean, legible maps.”
So here’s my proposal: why don’t we take 5% of the taxpayers’ money we’re poised to hand over to a commercial vendor to create the Geospatial Platform, and use it to post the official, up-to-date US government versions of the data found on Natural Earth to Github as well? That way users of the data can “fork” and “watch” the repository, ensuring that everyone (the government, its citizens, and the private sector busy creating tools/interfaces/services) is working off the latest, most authoritative version. In other words, can the FGDC replicate, for a small sliver of the NSDI budget, the pro bono efforts of Kelso and his small group of volunteers?
Infrastructure is Easy, Culture is Hard
In one respect, throwing money at a vendor to create a white-label version of their cloud-based platform and calling it the national SDI is the easiest option, because the hard part isn’t standing up a portal. It’s ensuring that the agencies that “own” the various datasets in a common authoritative repository make a credible long-term commitment to keeping key data up-to-date in a complete, transparent manner.
Fresh, accurate, and open government data has never been in higher demand. Let’s make that the singular focus and de-emphasize the “infrastructure” bit, which seems a little too much like a mere continuation of two decades of benefits flowing mostly to vendors rather than end-users.