Last week, while driving to a meeting, I had an epiphany. Anyone familiar with the state of Bay Area highways during the morning rush hour can relate. I had to be in Burlingame, on the water side of 101, at 9:30am. However, it was 8:30am before I could leave Sunnyvale, near 280 south of 85. At that hour, no matter which highway I took (280, 85, 92, 101), getting to Burlingame in an hour seemed impossible. Yet with real-time traffic updates on the GPS and timely decisions about getting off and back on the highways at the right exits, I made it to the cafe in Burlingame at 9:20am. Phew! Let me assure you, it was no small feat. Now to the epiphany.


Value Chain


Which companies are involved in delivering this efficiency to ordinary commuters like us? Not too long ago, radio stations had helicopters roaming the skies to give traffic updates to their listeners. And once we got inside our cars, we had no way of knowing if, just a few miles further down the road, there was an accident and a traffic jam.

Let’s understand the value chain at work here. The following categories of companies make this magic happen in the traffic and maps ecosystem. (Ref: How GPS Real-Time Traffic Works)


Traffic/Navigation Ecosystem Value Chain

What is enabling these companies? In the last few years, the ability to collect all types of data across the Internet and use it for analytics has exploded. With the Internet of Things wave, this will only grow exponentially. We long ago left behind the world of megabytes, gigabytes, and terabytes. Petabytes and exabytes are the new currencies of data.


Technology Platform Shifts


Two technology platform shifts have fueled this transition – Cloud Computing and Big Data. The Cloud is essentially a metaphor for the Internet, but broadly speaking it refers to a set of networked services, served up by virtual machines, that can be scaled up or down transparently to the application developer. Big data, in turn, refers to collections of data sets, both structured and unstructured, so large that traditional relational database and warehousing systems are incapable of handling them. Newer data management technologies had to evolve to handle such data sets. In come Hadoop, Hive, Map/Reduce, NoSQL, in-memory databases, etc.
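The Map/Reduce idea at the heart of Hadoop can be sketched in a few lines: a map step emits key/value pairs from raw records, and a reduce step aggregates each key's group independently, which is what lets a framework fan the work out across many machines. Here is a toy word-count sketch; the function names and data are illustrative, not any framework's actual API:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs from each raw record --
    # here, (word, 1) for every word seen
    for record in records:
        for word in record.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Shuffle: group values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    # Reduce: aggregate each key's group independently
    # (this independence is what makes the step parallelizable)
    return {key: sum(values) for key, values in groups.items()}

logs = ["error timeout", "error disk full", "ok"]
counts = reduce_phase(map_phase(logs))  # e.g. counts["error"] == 2
```

In a real cluster, the map and reduce phases run on different machines and the shuffle moves data between them over the network, but the programming contract is the same as in this sketch.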


Big Data Infrastructure Value Chain

There is another value chain at play at the infrastructure layer. Over time, layers in a value chain consolidate, collapse, or get vertically integrated for the sake of efficiency. Companies like IBM and EMC try to go across layers of the stack, from physical storage to analytics to applications. This is a natural evolution of the stack, driven by the ambitions of the players at each level to better control their own destiny. (There are probably more companies in each of these categories, and they may already be present in multiple layers of the stack, so I am happy to receive feedback, corrections, etc.)

Each of the application ecosystems above is already deep into figuring out how to take advantage of the avalanche of data that is available every single moment. Every GPS coordinate, every mouse click, every location check-in, every credit card swipe, every login, and every non-user-generated event like a cell phone connecting to the tower is in some database somewhere, waiting to be correlated with history of past behavior and current adjacencies to derive some meaningful insight that can predict something of value for somebody.

So if you are in a real-estate application ecosystem, big data in this context means trying to understand who buys and sells, from and to whom, what, how, when, why, and where. This has implications across the ecosystem value chain: realtors, real estate agents, home buyers, sellers, etc. Real estate companies are trying to predict which houses are likely to come onto the market in the next three months so they can manage their advertising efforts accordingly. (Ref: What big data means for the real-estate industry)


Big Data Product Management


If you are a product manager in one of these Big Data categories of companies, you’ve got a hell of a job keeping up with the evolving landscape. How do you go about it?

Every new technology shift and acronym that appears on the horizon is born in the mind of an engineer or an architect. They are the ones looking at the current platforms and stacks and thinking, “This could be done so much better in ten different ways.” This is true of protocols and formats like SIP, SOAP, XML, and H.264. It is true for middleware architectures, from CORBA to SOA. It is true for infrastructure like SDN. And it is true for methodologies like Agile.

For a product manager, the architects, VPs of Engineering, and senior engineering leaders are great enablers. These people can travel the breadth and depth of a new paradigm, demystify complex concepts, and have a good sense of what scenarios it can enable. So aside from the regular train of sprints, scrums, and product releases, it is very important to engage in this level of dialog with the engineering leaders about what a new shift means. As a product manager, I have found this invaluable. I have had the opportunity to work with some really sharp, deep-thinking folks, and frequently such conversations have been very illuminating. The four types of scenarios to brainstorm are:

➤ What our customers can do that they were not able to do before
➤ What we can do that we were not able to do before
➤ What questions we can now answer for our customers that we couldn’t before
➤ What questions our customers will ask of themselves and/or us that they couldn’t before

If your customer is not the end user of the application ecosystem, but one of the other category players in the stack, it is nevertheless helpful to understand the scenarios from the end user standpoint.


The Last Mile Problem


The real-time traffic situation is a great example where all the big data technologies have conspired to deliver end user value. As a driver, I am interested in only a small sliver of those exabytes of data: the traffic around my constantly changing position, just enough for me to make a real-time decision in a very short window. And the providers have figured out a way to deliver just what I need at the right moment, ensuring that I don’t have to drink from a firehose. In effect, the last mile problem was solved.
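The core of that "small sliver" idea is a proximity filter: out of everything collected, keep only what lies near the driver's current position. A minimal sketch, assuming hypothetical incident records with `lat`/`lon` fields (real providers use spatial indexes rather than a linear scan, but the filtering idea is the same):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in kilometers
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_incidents(incidents, my_lat, my_lon, radius_km=5.0):
    # The "last mile" filter: of all the data collected,
    # keep only the sliver around the driver's current position
    return [i for i in incidents
            if haversine_km(i["lat"], i["lon"], my_lat, my_lon) <= radius_km]
```

As the position changes, the filter is simply re-run, which is why the result always feels personal and current rather than like a firehose.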

In many application ecosystems, the end user is not yet a beneficiary of big data analytics. I am sure efforts are afoot to create more personalized experiences that foster stickiness, loyalty, etc. However, a lot of the scenarios I read about allow the aggregator in the application ecosystem to create better efficiencies for itself first.

The Product Manager in an application ecosystem must understand the value chain and the various players in the stack. If a category of players is missing, that is a gap: an area of innovation waiting to be filled, either by the players below or above the point in the stack where the gap exists.

To truly address the last mile problem and create benefits for the user, the client experience on the user’s side becomes pivotal. The embedded navigation system solved this for real-time traffic: it lets us traverse the map, get traffic updates further up or down the route, and make real-time decisions about where to turn.

On the enterprise side, I get the sense that we are not quite there yet. We see a plethora of dashboards that expose the result of complex analytics in fancy pie charts and graphs, with lights going off on a global heat map, etc. But are we helping our users answer the questions they care about? Is that personalized enough?

In the traffic scenario, I had to get answers to these questions in fairly short order:

➤ What is the road situation ahead?
➤ Where should I get off this road?
➤ Where should I get back on?

The interface has to serve this and quickly get out of the way.
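Distilled to code, answering those three questions is a small decision on top of the filtered data. A deliberately simplified sketch, assuming a hypothetical list of road segments ahead with `exit` and `kmh` fields (no real navigation system is this naive, but it shows how analytics collapses into three answers):

```python
def next_move(segments, jam_kmh=20):
    # segments: ordered road segments ahead of the driver;
    # each has the name of its exit and the current traffic speed.
    slow_idx = [i for i, s in enumerate(segments) if s["kmh"] < jam_kmh]
    if not slow_idx:
        # Question 1: the road ahead is clear -- no detour needed
        return {"situation": "clear", "exit_at": None, "reenter_at": None}
    first, last = slow_idx[0], slow_idx[-1]
    return {
        "situation": "jam ahead",
        # Question 2: get off at the last exit before the jam
        "exit_at": segments[max(first - 1, 0)]["exit"],
        # Question 3: get back on at the first exit past the jam, if any
        "reenter_at": segments[last + 1]["exit"] if last + 1 < len(segments) else None,
    }
```

The point is not the algorithm but the output shape: three fields, one per question, which is all the interface needs to render before getting out of the way.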

This is where a Product Manager must play an important role. What is the personalized experience the user cares about? What questions does she want answered? Does our client interface distill all our big data collection and analytics down to just those three or four or five key questions that will make a world of difference? What are those questions?

Whether the user is an IT administrator looking to protect her enterprise from the latest security threats or a marketing campaign manager who wants a better return on her spend, it is the Product Manager’s job to care about this last mile. Otherwise, we will keep inundating our users with fancy dashboards that only give them a way to admire the problem.

What do you think?