Geography · Economics · Visualization

The Massive Potential of Verizon Mapquest

Last week’s announcement of Verizon’s purchase of AOL (which includes Mapquest) was greeted mostly with attempts at humor, referencing either AOL trial-offer CDs or the movie “You’ve Got Mail.”  The first dozen entries were mildly amusing before the Law of Diminishing Returns reared its head.

Most of the analysis of the deal focused on mobile ad tech.

I know nothing about ad tech.

But I do know about maps, and from where I sit the ingredients are in place to make a combined Verizon Mapquest the most compelling value proposition this side of the Google geo empire.

*   *   *   *   *   *   *   *

Ingredient #1: Mobility Data

Critical mass is 100 million US subscribers constantly pinging cell towers and generating spatially referenced information.  And not just your location, but how you’re moving:  we know that movement by auto can be distinguished from pedestrian activity and even bicycle activity.  Sitting still in your office cube agonizing over lunch options can probably be inferred too.
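The mode-inference claim is less exotic than it sounds: average speed between successive pings already separates the cases. Here is a minimal sketch, with the caveat that the thresholds and the (timestamp, lat, lon) ping format are invented for illustration and bear no relation to any actual carrier pipeline:

```python
# Illustrative sketch (not any carrier's actual pipeline): inferring travel
# mode from the average speed between successive location pings.
# The speed cutoffs below are rough assumptions for illustration only.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def infer_mode(pings):
    """pings: list of (timestamp_seconds, lat, lon). Returns a coarse mode guess."""
    if len(pings) < 2:
        return "unknown"
    dist_km = sum(
        haversine_km(a[1], a[2], b[1], b[2]) for a, b in zip(pings, pings[1:])
    )
    hours = (pings[-1][0] - pings[0][0]) / 3600.0
    kmh = dist_km / hours if hours > 0 else 0.0
    if kmh < 1:
        return "stationary"   # agonizing over lunch options, perhaps
    if kmh < 7:
        return "pedestrian"
    if kmh < 25:
        return "bicycle"
    return "auto"
```

Real classifiers would use far richer signals (acceleration profiles, route snapping, dwell times), but the point stands: the raw ingredients are just timestamps and coordinates.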

Real-time traffic, crowd, and parking analysis leap to mind.  But think about larger marketing questions: why settle for customer segmentation based on zip code?


Our mobility patterns speak volumes about our consuming habits, and thin-slicing those habits (already being done by particularly data-savvy outfits) is going to be much more valuable when backed by Verizon-scale data streams than by the coarse geographic aggregations of old.

Ingredient #2: Advances in Map Rendering

Google turned the web mapping world upside down in 2005 when Google Maps gave us the slippy map—the world as millions of 256×256-pixel tiles that loaded in the background to give the user an amazingly responsive experience.

But everyone got the same set of tiles.

Since then, newer approaches such as streaming vectors plus WebGL display technology mean that we are no longer limited to a one-size-fits-all map experience where everyone is looking at the same set of pre-rendered tiles.  Instead, map features can be filtered, modified, and styled “on the fly” based on dynamic data.  We’re all familiar with traffic overlays–imagine every feature on the map as modifiable based on user behavior and intent.

   I'd like a real-time view of multi-modal mobility with a Tron-ish visual design, please.


If you don’t follow the mapping industry closely, this bit can’t be emphasized enough: the tech is in place to usher in an era of “hyper personalization of maps” where you can tailor visual design to communicate so much more than a generic wayfinding experience.
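To make “styled on the fly” concrete, here is a toy sketch of per-feature styling driven by live data. The field names, colors, and thresholds are all invented; a real implementation would live in a vector-tile styling engine rather than Python, but the logic is the same: the style is a function of the data at render time, not baked into a tile months in advance.

```python
# Toy illustration of "styling on the fly": instead of serving pre-rendered
# raster tiles, the client holds vector features and derives each feature's
# display style from live attribute data. All field names here are invented.

def style_road(feature, live_speeds):
    """Return a display style for a road feature based on current traffic."""
    observed = live_speeds.get(feature["id"])           # km/h right now, or None
    limit = feature["properties"]["speed_limit_kmh"]
    if observed is None:
        return {"color": "#888888", "width": 1}         # no data: neutral gray
    congestion = observed / limit
    if congestion < 0.4:
        return {"color": "#d7191c", "width": 3}         # crawling: red, emphasized
    if congestion < 0.8:
        return {"color": "#fdae61", "width": 2}         # slowing: orange
    return {"color": "#1a9641", "width": 2}             # free-flowing: green
```

Swap "current speed" for any dynamic attribute (transit headways, parking availability, foot traffic) and you have the raw material for that Tron-ish multi-modal view.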

Ingredient #3: Noisy Mobile Data in Sufficient Quantity Becomes Valuable Spatial Data

No one should confuse their phone with a survey-grade GPS.  But a fascinating complement to the explosion in data volumes has been the rise of tools that aggregate and algorithmically smooth noisy data, transforming it into useful spatial data.  At Verizon-scale quantities of data, you don’t need a tricked-out vehicle to survey the new subdivision to figure out where the roads are and how many lanes they have.
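The aggregate-and-smooth idea can be shown with synthetic data: any single phone fix is noisy, but binning and averaging many fixes converges on the underlying geometry. A sketch under stated assumptions (the noise level, bin size, and sample counts are arbitrary, and real map-inference pipelines are far more sophisticated):

```python
# Sketch of the aggregation idea: individual phone fixes wander, but averaging
# many fixes that fall in the same along-road bin recovers the centerline.
# Purely synthetic illustration; noise level and bin size are invented.

import random
from statistics import mean

random.seed(42)

def estimate_centerline(fixes, bin_size=0.001):
    """Bin noisy (x, y) fixes by x and average y within each bin."""
    bins = {}
    for x, y in fixes:
        bins.setdefault(round(x / bin_size), []).append(y)
    return {k * bin_size: mean(ys) for k, ys in sorted(bins.items())}

# A true east-west road at y = 40.0, observed with roughly 15 m of GPS noise:
fixes = [
    (x / 10000.0, 40.0 + random.gauss(0, 0.00015))
    for x in range(2000)
    for _ in range(25)   # 25 independent phones passing each location
]
centerline = estimate_centerline(fixes)
# Each averaged bin typically lands within a few meters of the true y = 40.0
```

The per-fix error never improves, but the error of the *average* shrinks with the square root of the sample count, which is exactly why Verizon-scale volumes change the game.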


   Is this the only way to create map data? Not necessarily.


This is especially important now, as the pending sale of Nokia HERE has highlighted the dependence on a small handful of base-map providers: Google, TomTom, Nokia HERE, and OpenStreetMap (Mapquest currently uses TomTom and OpenStreetMap data).  Sure, this data won’t be guiding self-driving cars (thankfully), but I can imagine real-time multi-modal mobility data being very useful for routing the emerging on-demand economy.


Ingredient #4: The Boring and the Obvious

Sales and support. Existing relationships with large customers.  A phone number clients can call that will be answered by a breathing human being.

We all love cool tech.  The blocking and tackling of everyday sales relationships? Probably less so.  But having the sales infrastructure to go big?  Safe to say we can check that box.  What made the deal such a target of the Mocking Class was that AOL and Mapquest are so strongly associated with Web 1.0 (Mapquest is almost 50 years old, actually).  But you know who doesn’t mind so much dealing with recognizable entities that have been around for decades?  Other companies that have been around for decades and are not immediately stoked to cut huge checks to three-year-old companies missing vowels from their names.

If this blog has a unifying meta-theme, it’s that Technology is actually the easy part–it’s the Anthropology that’s really hard. Far be it from me to suggest that mere Bigness magically provides all the answers–but scale provides a sturdy safety net for the trial-and-error of determining which parts of the cutting edge will work in the marketplace.

*   *   *   *   *   *   *   *

From the outside it’s not clear what Verizon’s intentions are for Mapquest, given the scant mention in the press. Whatever the case, the pieces are in place for the original web mapping company to shape the next-generation mapping experience.


Brian Timoney is an information consultant in Denver, Colorado.


Google map cars photo courtesy of Jaap Meijer’s Flickr stream

Hopeful Developments In Government Data Publishing. Yes, I Said “Hopeful”


We’ve been talking Open Data for years here at MapBrief, often with a good deal of exasperation at the publishing priorities of otherwise well-intentioned government authorities.

But today I bring only hope.


Open With Apps – A Promising Syndication Model

Back in March came the announcement of “open with apps,” under which some datasets can now be visualized and analyzed directly using Plotly and/or CartoDB, with the promise of more services to be added.

Why this is good:  it encourages a clean division of labor–government entities focus on data publishing while third party entities focus on visualization and analysis.

A credible, long-term commitment to open data requires resources.  And diverting resources to one-off visualization tools has proved to be an enriching experience only for whatever government contractor built the thing.

And mapping portals? Please, no more.

For a great read on why this single-minded focus on data is so important, see this recent Chris Whong post.


Cloud Economics = Competing Platforms

Instead of handing millions to contractors for suboptimal interfaces, let the many cloud-based visualization platforms compete (‘compete’ in, like, the capitalism sense of the word) to accept open data feeds and try to convert free-tier users into faithful subscribers through great user experiences.

Worst case scenario: none of these 3rd party platforms find it economically viable over the long run and open data publishers provide a download button and call it ‘done’.


Leave Infrastructure to the Experts

Another hopeful development is the announcement that Amazon is hosting Landsat scenes on its S3 infrastructure.

For free.

Again, the economics of the cloud are such that the experts can provide services at very low cost.  Hence my plea a while back that the government focus on data, not ‘infrastructure’.  What we don’t need is a big contractor emulating the same service while soaking the taxpayer with a 15–20x mark-up.


*  *  *  *  *  *

The Open Data movement has already won over those persuadable by the idealism of transparent government, etc.  It’s been less successful articulating a nuts-and-bolts economic rationale.  Here’s hoping the jettisoning of wasteful interfaces in favor of 3rd party syndication will encourage a singular focus on the data and nothing but the data.


—Brian Timoney



The Road to Bad Policy Is Paved By Good Intentions and Misconceived Maps

One blessing/curse of Twitter has been its singular ability to bring all manner of maps to my timeline and seize my attention one 8-second increment at a time.

Some are interesting/cool/fascinating.

Some make me laugh.

Some are so cringe-worthy, I feel compelled to re-tweet.

And a few leave me muttering “no, no, No, No, NO.”


Because sometimes the stakes are much higher than a graphic artist at an elite publication making rookie cartographic mistakes. Sometimes maps influence policy, guide funding decisions, affect real lives.

This is one of those times.

The above map comes from the recently released “Vision Zero Pedestrian Safety Action Plan: Brooklyn”, part of a larger effort in New York City to address the persistent problem of vehicle/pedestrian collisions.

When Is a Heat Map Not a Heat Map?

Kenneth Field has a recent, thorough post on the all-purpose moniker “heat map” being applied to kernel-density maps. Let me give you my oversimplified Cliff’s Notes version in light of the above map:

a)  ‘Heat maps’ obviously take their name from the familiar temperature map where the actual temperature is measured at specific point locations and a temperature for all of the unmeasured areas is interpolated.  But the map above doesn’t make sense in this context because we know the exact data values for every other intersection in Brooklyn without a dot on it: zero pedestrian deaths.

b)  In creating “hot spot” maps we are also strongly implying that the conditions causing the phenomenon being mapped are propagated generally over space. But when we think about the factors contributing to pedestrian deaths–traffic volumes, pedestrian volumes, street width, speed limits, visibility issues–they have such a geographic specificity that a side street intersection could be very safe although it’s merely a block away from a demonstrably dangerous intersection.
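The smearing at issue can be demonstrated numerically: a Gaussian kernel-density surface paints substantial “heat” onto a nearby intersection whose true, known value is zero. A toy sketch with invented coordinates and an arbitrary bandwidth:

```python
# Tiny illustration of the kernel-density critique: a Gaussian KDE surface
# "smears" risk from a dangerous intersection onto a nearby intersection
# with zero recorded deaths. Coordinates and bandwidth are invented.

from math import exp

def kernel_density(x, y, incidents, bandwidth=100.0):
    """Unnormalized Gaussian kernel density at (x, y); distances in meters."""
    return sum(
        exp(-((x - ix) ** 2 + (y - iy) ** 2) / (2 * bandwidth ** 2))
        for ix, iy in incidents
    )

# Five recorded deaths at one intersection; a side street 80 m away has zero.
deaths = [(0.0, 0.0)] * 5
hot = kernel_density(0, 0, deaths)      # the actual problem spot
side = kernel_density(80, 0, deaths)    # zero recorded deaths here
# side > 0: the surface shows "heat" where the measured value is exactly zero.
```

With a 100 m bandwidth the side street registers most of the intensity of the genuinely dangerous intersection, despite a recorded value of zero, which is precisely how a Priority Area can swallow a safe block.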

If You’re Diffusing The Problem, Are You Diffusing the Impact of Policy?

To be fair, the report talks about Priority Intersections and Priority Corridors: all to the good, given the pattern of our data points.

But what concerns me are the Priority Areas because of the discussion above of how spatially specific are the conditions that endanger pedestrians.  Consider the item of the Action Plan that calls for “60 new speed bumps in Brooklyn annually”. Where will it be most difficult to install speed bumps: the busiest corridors? Along bus routes?  But the relatively diffuse “Priority Areas” open up the option of making sure we hit our quota of 60 bumps per year by looking at less contested spaces on sleepy side streets.  The plan will be executed, but will the impact be significant?

Representation Matters (Again)

As an outsider, I can clearly see the time and effort that went into the report: from collecting and analyzing the data to holding inter-agency meetings and gathering input from the public–all to their credit.  Often enough, objections to cartographic choices are brushed aside as fussy and pedantic: “my points are correctly located, my North arrow points up, so please shut up.”  And “heat maps” have never been easier to make–so why not create a compelling visual that will be seized on by the press and Twitter?

Because maps frame problems and drive policy solutions. The above map invites resources to be channeled into areas with relatively low pedestrian risk–muting the impact of precious public dollars.

Despite everyone’s good intentions.


—Brian Timoney