Geography · Economics · Visualization

Mapquest + Mapbox: A Win-Win With A Huge Unanswered Question

In a rare confluence of a languid blogging pace with the tempo of events, my last post on the Verizon purchase of Mapquest was quickly followed by news that Mapquest would be contracting out its map rendering to the high profile start-up Mapbox.  No real follow-up has come out of either camp, including silence from the otherwise prolific Mapbox blog.  But as nature abhors a vacuum, I’ve independently verified the broad strokes of the deal so will happily offer my speculations in the absence of mere facts.

Here’s what each side wins:

Mapquest gets to move fast on a more mobile-centric strategy on the back of the recently launched Mapbox Mobile SDK.  Ditto for being able to tap into Mapbox’s work with OpenGL, streaming vectors, and integrating cloud-based DevOps with map usage analytics. And, presumably, an overdue refresh of the Mapquest cartographic style.

But this isn’t a wholesale hollowing out of a brand: Mapquest is keeping its routing capability in house. Especially in the context of an economy increasingly reliant on on-demand logistics, routing is “hot” again: the ability to adjust routes on the fly to real-time conditions is now a competitive necessity for the enterprise.

And should you still be dismissive of Mapquest’s prospects, you can always browse their patent portfolio which makes for interesting reading.

In short: Verizon muscle + Mapbox sexy are a promising combination for a Mapquest resurgence.

Mapbox gets its wares in front of Mapquest’s 40 million unique monthly visitors and an important validation of its vision of being “the map layer for the Internet”.  Funding the next stage of growth won’t be a problem.  Working with TomTom data (the Mapquest provider in the US) will be helpful with potential clients who view the company’s reliance on OpenStreetMap with a degree of skepticism.


The Billion Dollar Question

Who gets access to Verizon’s mobile location data?

As I mentioned in my last post, “noisy” geotagged mobile data in aggregate becomes very valuable, both as behavioral and as spatial data.  Verizon already sells aggregated/anonymized customer data to 3rd parties, so giving subsidiaries and partners a taste would make sense.

With “ad tech” as the rationale for Verizon’s purchase of AOL, weaving customer location behavior into the mix would be a slam dunk for Mapquest.


But will Mapbox be able to get its hands on the Verizon trove?

Good question. But Mapbox CEO Eric Gundersen understands the value of a user data feedback loop:

“Nokia HERE is just doing it wrong,” he says of how the company’s mapping division, now up for sale, constructs its dataset. “They are spending over a half a billion dollars a year driving cars around and processing that data. That was how you made a map 10 years ago — like back when you had these little Nokia dumb phones. These guys don’t understand the idea of building for mobile and building tools for developers — if they did they would be able to get real time data streams back and have a better map.”

As a vocal proponent of OpenStreetMap, I find it tantalizing to think of the boost in quality to OpenStreetMap in the US should Mapbox gain access to Verizon data streams and be able to put derived data of higher quality back into the commons.  A couple of problems with that vision: a) Verizon could rightly see those derived products as a competitive advantage to be kept in-house, and b) the licensing issues around OpenStreetMap continue to be thorny.

For Mapbox and others heavily reliant on OpenStreetMap, their tech will only be as cool as the quality of the data behind it.  Consider Mapbox’s recently launched Bicycle Directions:

Now ponder the possibilities of enriching those directions with data streams of where people actually bike.

So, yeah, data feedback loops are critical because you don’t get a second chance after sending a user into an irrigation ditch.

*  *  *  *  *  *  *

Both Mapquest and Mapbox will immediately benefit from the announced collaboration.  But whether the much bigger prize of Verizon’s mobile location data stream will be seized, and by whom, is what’s worth keeping an eye on.


Brian Timoney is an information consultant in Denver, Colorado.


woman working out photo courtesy of Rikard Elofsson’s Flickr stream
elephant in the room photo courtesy of David Blackwell’s Flickr stream



The Massive Potential of Verizon Mapquest

Last week’s announcement of Verizon’s purchase of AOL (which includes Mapquest) was greeted mostly with attempts at humor, with references either to AOL trial offer CDs or the “You’ve Got Mail” movie.  The first dozen entries were mildly amusing before the Law of Diminishing Returns reared its head.

Most of the analysis of the deal focused on mobile ad tech.

I know nothing about ad tech.

But I do know about maps, and from where I sit the ingredients are in place to make a combined Verizon Mapquest the most compelling value proposition this side of the Google geo empire.

*   *   *   *   *   *   *   *

Ingredient #1:     Mobility Data

Critical mass is 100 million US subscribers constantly pinging cell towers and generating spatially referenced information.  And not just your location, but how you’re moving:  we know that movement by auto can be distinguished from pedestrian activity and even bicycle activity.  Sitting still in your office cube agonizing over lunch options can probably be inferred too.
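As a back-of-the-envelope illustration, distinguishing those travel modes can start with nothing fancier than the speed between successive pings. The thresholds and coordinates below are invented for the example, not calibrated against any real classifier:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def infer_mode(p1, p2, seconds):
    """Guess travel mode from the speed between two (lat, lon) pings.

    Cutoffs are illustrative only; real systems also use acceleration,
    dwell time, and route context."""
    speed_kmh = haversine_km(*p1, *p2) / (seconds / 3600.0)
    if speed_kmh < 1:
        return "stationary"   # cube-dweller agonizing over lunch options
    if speed_kmh < 7:
        return "pedestrian"
    if speed_kmh < 25:
        return "bicycle"
    return "auto"
```

Crude, but at Verizon-scale volumes even a crude signal aggregates into a rich behavioral picture.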

Real-time traffic, crowd, and parking analysis leap to mind.  But think about larger marketing issues. Customer segmentation based on zip code?


Our mobility patterns speak volumes about our consuming habits, and thin-slicing those habits (already being done by particularly data-savvy outfits) is going to be much more valuable when backed by Verizon-scale data streams than the coarse geographic aggregations of old.

Ingredient #2:     Advances in Map Rendering

Google turned the web mapping world upside down in 2005 when Google Maps gave us slippy maps—the world as millions of 256×256-pixel tiles that loaded in the background to give the user an amazingly responsive experience.
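For the curious, the tile scheme behind those slippy maps is simple arithmetic: at zoom level z the Web Mercator world is a 2^z by 2^z grid of 256×256 tiles. A minimal sketch of the standard lon/lat-to-tile conversion:

```python
from math import asinh, floor, pi, radians, tan

def lonlat_to_tile(lon, lat, zoom):
    """Convert a WGS84 lon/lat to slippy-map tile (x, y) at a zoom level.

    Standard Web Mercator tile addressing: x grows eastward from the
    antimeridian, y grows southward from the north edge of the projection."""
    n = 2 ** zoom
    x = floor((lon + 180.0) / 360.0 * n)
    y = floor((1.0 - asinh(tan(radians(lat))) / pi) / 2.0 * n)
    return x, y
```

At zoom 0 the whole world is one tile; every zoom level quadruples the tile count, which is why pre-rendering the planet was such an industrial undertaking.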

But everyone got the same set of tiles.

Since then, newer approaches such as streaming vectors plus WebGL display technology mean that we are no longer limited to a one-size-fits-all map experience where everyone is looking at the same set of pre-rendered tiles.  Instead, map features can be filtered, modified, and styled “on the fly” based on dynamic data.  We’re all familiar with traffic overlays–imagine every feature on the map as modifiable based on user behavior and intent.
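To make that concrete, here is a hypothetical style layer sketched as a Python dict, loosely modeled on the Mapbox GL style spec. The layer name, source-layer, and `congestion` field are all assumptions for illustration, not anything Mapbox or Mapquest actually ships:

```python
# A hypothetical data-driven vector-tile style layer: filter which road
# features render, and let a live per-feature attribute drive the line
# width, rather than baking one look into pre-rendered raster tiles.
personalized_layer = {
    "id": "commuter-roads",        # hypothetical layer name
    "source-layer": "roads",       # hypothetical vector-tile source layer
    "type": "line",
    "filter": ["==", "class", "motorway"],
    "paint": {
        # width scales with a per-feature "congestion" value in 0.0-1.0
        "line-width": ["+", 2, ["*", 6, ["get", "congestion"]]],
        "line-color": "#ff5500",
    },
}
```

The point isn’t the syntax; it’s that styling is now a runtime decision per user, per feature, instead of a batch job run once for everyone.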

   I'd like a real-time view of multi-modal mobility with a Tron-ish visual design, please.


If you don’t follow the mapping industry closely, this bit can’t be emphasized enough: the tech is in place to usher in an era of “hyper personalization of maps” where you can tailor visual design to communicate so much more than a generic wayfinding experience.

Ingredient #3:  Noisy Mobile Data in Sufficient Quantity Becomes Valuable Spatial Data

No one should confuse their phone with a survey-grade GPS.  But a fascinating complement to the explosion in data volumes has been the tools to aggregate and algorithmically smooth noisy data and transform it into useful spatial data.  At Verizon-scale quantities of data, you don’t need a tricked out vehicle to survey the new subdivision to figure out where the roads are and how many lanes they have.
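A toy sketch of that aggregate-and-smooth idea: snap each noisy ping to a coarse grid cell and average within the cell. Grid size and coordinates are illustrative, and real pipelines use proper map-matching rather than naive cell averaging:

```python
from collections import defaultdict
from statistics import mean

def smooth_traces(pings, cell_deg=0.001):
    """Aggregate noisy GPS pings: snap each (lat, lon) to a coarse grid
    cell, then average the pings landing in each cell.

    With enough volume, per-cell means converge toward the true geometry
    (e.g., a road centerline) even though any single ping may be tens of
    meters off."""
    cells = defaultdict(list)
    for lat, lon in pings:
        key = (round(lat / cell_deg), round(lon / cell_deg))
        cells[key].append((lat, lon))
    return [
        (mean(p[0] for p in pts), mean(p[1] for p in pts))
        for pts in cells.values()
    ]
```

Symmetric noise cancels in the mean; that cancellation is exactly what Verizon-scale data volumes buy you.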


   Is this the only way to create map data? Not necessarily.


This is especially important now, as the pending sale of Nokia HERE maps has highlighted the dependence on a small handful of base-map providers: Google, TomTom, Nokia HERE, and OpenStreetMap (Mapquest currently uses TomTom and OpenStreetMap data).  Sure, this data won’t be guiding self-driving cars (thankfully), but I can imagine real-time multi-modal mobility data being very useful for routing the emerging on-demand economy.


Ingredient #4:  The Boring and the Obvious

Sales and support. Existing relationships with large customers.  A phone number clients can call that will be answered by a breathing human being.

We all love cool tech.  The blocking and tackling of everyday sales relationships? Probably less so.  But having the sales infrastructure to go big?  Safe to say we can check that box.  What made the deal such a target of the Mocking Class was that AOL and Mapquest are so strongly associated with Web 1.0 (Mapquest is almost 50 years old, actually).  But you know who doesn’t mind so much dealing with recognizable entities that have been around for decades?  Other companies that have been around for decades and are not immediately stoked to cut huge checks to three-year-old companies missing vowels from their names.

If this blog has a unifying meta-theme, it’s that Technology is actually the easy part–it’s the Anthropology that’s really hard. Far be it from me to suggest that mere Bigness magically provides all the answers–but scale provides a sturdy safety net for the trial-and-error of determining which part of the cutting edge will work in the marketplace.

*   *   *   *   *   *   *   *

From the outside it’s not clear what Verizon’s intentions are for Mapquest given the scant mention in the press. Whatever the case, the pieces are in place for the original web mapping company to shape the next generation mapping experience.


Brian Timoney is an information consultant in Denver, Colorado.


Google map cars photo courtesy of Jaap Meijer’s Flickr stream





Hopeful Developments In Government Data Publishing. Yes, I Said “Hopeful”


We’ve been talking Open Data for years here at MapBrief, often with a good deal of exasperation at the publishing priorities of otherwise well-intentioned government authorities.

But today I bring only hope.


Open With Apps – A Promising Syndication Model

Back in March, Data.gov announced “open with apps,” whereby some datasets can now be visualized/analyzed directly using Plotly and/or CartoDB, with the promise of more services being added.

Why this is good:  it encourages a clean division of labor–government entities focus on data publishing while third party entities focus on visualization and analysis.

A credible, long-term commitment to open data requires resources.  And diverting resources to one-off visualization tools has proved to be an enriching experience only for whatever government contractor built the thing.

And mapping portals? Please, no more.

For a great read on why this single-minded focus on data is so important, see this recent Chris Whong post.


Cloud Economics = Competing Platforms

Instead of handing millions to contractors for suboptimal interfaces, let the many cloud-based visualization platforms compete (‘compete’, in like, the capitalism sense of the word) to accept open data feeds and try to convert free-tier users into faithful subscribers through great user experiences.

Worst case scenario: none of these 3rd party platforms find it economically viable over the long run and open data publishers provide a download button and call it ‘done’.


Leave Infrastructure to the Experts

Another hopeful development is the announcement that Amazon is hosting Landsat scenes on its S3 infrastructure.

For free.

Again, the economics of the cloud are such that the experts can provide services at very low cost.  Hence my plea a while back that the government focus on data, not ‘infrastructure’.  What we don’t need is a big contractor replicating the same service while soaking the taxpayer with a 15-20x mark-up.


*  *  *  *  *  *

The Open Data movement has already won over those persuadable by the idealism of transparent government, etc.  It’s been less successful articulating a nuts-and-bolts economic rationale.  Here’s hoping the jettisoning of wasteful interfaces in favor of 3rd party syndication will encourage a singular focus on the data and nothing but the data.


—Brian Timoney