When I recently wrote about the shortcomings of map portals, many of my opinions were shaped by the map usage analytics I collected from the City of Denver, which formed my most popular post of 2012. Given the popularity of the topic and the large volume of feedback, I circled back to gather more statistics, not only from Allan Glen in Denver, but also from Jason Birch in Nanaimo, as well as from a project in Centennial, Colorado that I modeled after Jason’s approach to property information.
And what I found strengthens the case against map portals.
You’re Missing Half Your Audience
One of the major shortcomings of map portals is that by jamming a variety of layers into a single interface, it’s very difficult to elicit a user’s intent. Think of how much “smarter” the web is when comparing the Yahoo homepage of the late 1990s with Google’s fast auto-complete text box of 2013. We fully expect Google to find us the needle in the haystack, yet our map portals do about as well at anticipating what we’re looking for as the old Yahoo home page (which, among those of us north of 40, was regarded with wonder: it’s where we started our online day!). Because Nanaimo, Denver, and Centennial all actively apply SEO to their geographic content, it’s much more search-engine friendly than layers locked away in portals. Last year we found that 60% of all Denver map traffic came from Google searches; that figure has since risen closer to 70%. Meanwhile, Nanaimo rings in at 60%, and the newest initiative in Centennial already draws 50% of its map traffic from search. (When we say “Google” we mean all search engines, of which Google accounts for more than 90% of traffic.)
But here’s the important part: these are feature-specific searches. People don’t go looking for map interfaces or map layers, they Google specific addresses, specific school names, specific libraries, etc.
FACT: If your geographic data is not Google-able at a feature level of specificity, you’re missing half your audience.
Most People Are Looking for A Single Feature. Then They Leave.
Websites like to brag about how “sticky” they are: the amount of time spent on the site, how many pageviews generated, and so on. If you’re providing public information, you are not in the business of being “sticky”: provide what the user is looking for as quickly as possible and let them leave. And we can see this in the Nanaimo stats. Jason was one of the first to push the principles of REST for geographic data, and those stats let us see that 65% of all users retrieve a single piece of information and leave. Only 12% of users browse information for more than three geographic features during a single visit. The vast majority aren’t visiting a public government website to spend time interacting with maps, toggling layers, and clicking on dozens of placemarks.
If you have a cynical cast of mind, you might be thinking, “Maybe Nanaimo’s content is so useless and confusing that people bail right away?” To that I say: over 10% of their map usage comes from bookmarks and emailed hyperlinks. Because every feature has its own unique URL, it’s much easier to save and pass around. Can individual features in your map portal be bookmarked and passed around?
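As a concrete sketch of the unique-URL idea (the route pattern and the `feature_url` helper below are hypothetical illustrations, not Nanaimo’s actual scheme), a REST-style convention gives each geographic feature one stable, human-readable address that can be bookmarked, emailed, and indexed by search engines:

```python
import re

def feature_url(category: str, name: str, feature_id: int) -> str:
    """Build a stable, bookmarkable URL for one geographic feature.

    Hypothetical scheme: /<category>/<slug>-<id>. The slug keeps the URL
    human-readable (good for search engines and for emailing around),
    while the numeric id keeps it unambiguous when names collide.
    """
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    return f"/{category}/{slug}-{feature_id}"

# Each feature gets exactly one canonical address:
print(feature_url("parks", "Bowen Park", 17))                  # /parks/bowen-park-17
print(feature_url("schools", "Departure Bay Elementary", 3))   # /schools/departure-bay-elementary-3
```

Because the URL is derived from the feature itself rather than from map state (zoom level, visible layers), it stays valid no matter how the surrounding map interface changes.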
People Who Ended Up Interacting With A Map Didn’t Go Looking For A Map
Denver’s Recreation Centers map had over 7,000 visits in January, yet only 1% of the keyword searches that led users there included the words “map”, “location”, or “find”. Even in their Google searches users aren’t thinking “map”, yet on average they end up having three interactions with the map per visit. That is more damning evidence against putting all your geographic information inside a single portal and labeling it “MAP”.
Don’t Make People Find Maps, Put Them Where People Already Visit
The City of Denver’s web GIS team is in an enviable position: not only do they have solid evidence that maps enhance traffic, they have an embeddable-widget architecture that lets them easily place map content anywhere on the city’s website (as well as on third-party websites). Most of us have to wrangle with our organization’s jack-booted HTML thugs, er, “web team” to get a single 8pt-font hyperlink on the home page. So as the Presidential election approached, Denver’s team looked at the web stats and saw, unsurprisingly, that the Election Commission homepage was where the majority of the search traffic landed. Guess what happened when they temporarily embedded a polling-location map in that homepage? Election Day and the 24 hours prior blew the doors off all preceding map usage records.
Beautiful things happen when maps are liberated from the Geo Silo.
The New Marching Orders: Optimize For Single-Feature Search & Retrieval
We already knew that single-topic maps generate three times more traffic than the all-in-one portal. This latest round of metrics clearly shows that users want maps that quickly focus them on a single, particular feature of interest. Not only do we have to break our portals down and give the most important layers their own maps, we need to enable users to rapidly pluck the lone feature they’re interested in from the haystack of content. By following the Nanaimo playbook of using REST principles to make every individual feature index-able and bookmark-able, you harness the power of search engines to enable easy discovery and drive traffic to your maps.
As I showed in a previous post, a minimally-viable lookup service based on this individual-feature philosophy can be rolled out at a very low cost.
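To make the minimally-viable lookup idea concrete, here is a sketch of its core operation, assuming a flat in-memory list of features (the data and the `lookup` function are invented for illustration): take the user’s search words and return the single best-matching feature, rather than dropping them into a layered map.

```python
# Minimal single-feature lookup: match a query against a flat list of
# features and return the one best hit. All data here is made up.
FEATURES = [
    {"name": "Central Library",      "type": "library",    "lat": 39.74, "lon": -104.99},
    {"name": "Montbello Rec Center", "type": "recreation", "lat": 39.78, "lon": -104.87},
    {"name": "Washington Park",      "type": "park",       "lat": 39.70, "lon": -104.97},
]

def lookup(query: str):
    """Return the single feature whose name best matches the query,
    scored by how many query words appear in the feature's name."""
    words = query.lower().split()

    def score(feature):
        name = feature["name"].lower()
        return sum(1 for w in words if w in name)

    best = max(FEATURES, key=score)
    return best if score(best) > 0 else None

print(lookup("montbello rec"))  # the single rec-center record, not a map of everything
```

In a real service each result would render at its own URL with the feature name in the page title, which is exactly what makes it discoverable by the feature-specific searches described above.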
Next-level flexibility comes from embeddable map widgets a la Denver, which let you place maps where folks naturally end up when searching for information. That puts the positive feedback loop into high gear, as you capture a significant audience of people who didn’t even know they wanted to interact with a map.
* * * * * * *
Geographic communication is less about technology and more about anthropology: understanding our audience, instead of being preoccupied with our tools, is what will enable us to rightfully command a position of relevance in the information landscape.