Show Me The Customers: A Former Analyst’s View Of The Big Data Space

January 5, 2016 Jeff Kelly

When I was an industry analyst, not too long ago, I spoke with a lot of Big Data vendors. I covered the entire Big Data stack, from infrastructure and hardware up to analytics and data visualization, and sat in on more product briefings than I can remember. The ones that stand out in my mind, however, invariably involved more than just talk of speeds and feeds, features and version enhancements. The common denominator of the memorable briefings was real, honest-to-goodness customer references.

Unfortunately, customer references from vendors were not as common as you might think. When I began covering Big Data as an industry analyst at Wikibon in the spring of 2011, the industry was in its infancy. It was already clear, however, that Big Data promised great opportunity. New vendors were forming at a frenetic pace, each attempting to grab its piece of the pie by commercializing this or that part of the stack. By the time Hadoop Summit 2014 rolled around—my last as an analyst—I counted no fewer than 80 vendors exhibiting on the show floor!

The level and pace of innovation in the Big Data space was like none I’d ever seen before, and it truly was an exciting time. Multi-billion dollar enterprise software vendors with their expensive proprietary stacks were suddenly feeling real pressure from small startups with new approaches to data management. All of a sudden open source and databases were sexy, and “vendor lock-in” was the equivalent of a four-letter word.

The flip side of this Cambrian explosion of Big Data startups, however, was that all these vendors were fighting for the same relatively small pool of customers. It was, and remains, the early days for Big Data, and there were only so many early adopters to go around. While many startups had interesting products built on promising technology, most of them lacked customers. This isn’t uncommon in a frothy, emerging market. But eventually, customer adoption is what separates the contenders from the pretenders.

As an analyst, I considered a vendor’s ability to provide solid customer references to be a key performance indicator. No matter how amazing a new technology or product might be, if nobody is willing to pay for it or run their business on it, there’s only potential. I found that working with Big Data practitioners—entrepreneurs, business leaders and technologists on the frontlines, applying Big Data to transform their businesses and industries—was what I enjoyed most about being an industry analyst. Hearing customer stories, learning from their successes and failures, and in turn advising other practitioners was fascinating and rewarding.

At Pivotal, Customers Lead The Way

Which brings me to today. It’s been a month since I joined Pivotal, one of those Big Data vendors that emerged during the past few years. When I covered Pivotal as an analyst, the company was one of the few that consistently delivered customer references. The Pivotal customers I spoke to during that time were doing some of the most exciting work with Big Data I had seen to date. When the opportunity to join the company presented itself, I jumped at it.

Since joining the company, I’ve had the chance to learn more about Pivotal’s Big Data customers, and I’ve already observed some pretty astounding results. These businesses span many industries—from heavy industry and manufacturing to education to banking—and have implemented numerous use cases. Here are just a few examples of the innovative ways Pivotal customers are putting Big Data to use and creating real value for themselves and their consumers:

BMW

The German car maker is one of the most respected brands in the auto industry. BMW drivers expect top performance and service, and delivering on that promise in 2015 requires Big Data analytics. Speaking at EMC World 2015, BMW’s Dirk Ruger said, “The primary goal at the moment is predictive maintenance, being able to detect defects at the earliest stage. We have to find the right correlation patterns for all our forward memories and incoming data to predict upcoming malfunctions and their consequences.” BMW uses Pivotal Big Data technologies to make sense of all the data streaming from its connected cars so it can catch potential problems before they occur, saving its customers time, money, and frustration.

Purdue University

With Pivotal’s Big Data technologies as the foundation, Purdue University is developing an analytics software platform that will help prospective students better identify their weaknesses and take steps to address them so they are in the best position possible to succeed when they step on campus. The platform will also analyze multiple sources of data to help Purdue University professors identify struggling students more quickly, so they can intervene before it’s too late. “Around something important like education, we like the idea of using even more data sources to help inform decisions and improve student success with higher education. We also like the idea of using data to help faculty teach better,” said Gerry McCartney, CIO and associate professor of IT at the university.

Time

Time is one of the most storied brands in media. When Linda Apsley took over as VP of Revenue and Data Engineering, she was happy to find that the company was doing a good job of historical reporting and data analysis. But she knew Time needed to become more data-driven in both its strategic and real-time decision making to compete in an extremely competitive and fragmented media landscape. “The piece we need to add in now are predictive capabilities,” said Apsley, speaking on theCUBE. “So much of what’s happening in the [media] market today is about the individual. We need to be able to target your interests, what it is you want to see.” With Pivotal’s help, Apsley and her colleagues are building out new predictive analytics capabilities and bringing together previously siloed data to better target content to individual consumers across Time’s portfolio of media properties.

WellCare

Perhaps no industry has undergone more change in recent years than healthcare. The Affordable Care Act brought health insurance to millions of previously uninsured Americans, increasing competition among providers. WellCare, a Tampa, Fla.-based managed care provider supporting over 2.8 million members, recognized it needed to move faster and be more data-driven to compete in this evolving market. WellCare partnered with Pivotal to modernize its data infrastructure, with significant results. “In a little over a year of actual implementation, we have gotten…about 73% faster on our ability to produce reports, to do the analytics required to respond to our state and federal partners as well as to effectively close our books, meet our monthly financial reporting obligations,” said James Clark, IT Director at WellCare. “Now, we’re able to really look at on a daily basis with fresh data, look at the way things are trending and be able to more quickly produce that result.”

CoreLogic

Data is the lifeblood of CoreLogic. The Irvine, Calif.-based company provides data-driven insights to its clients in the capital markets, insurance, oil and gas, and other industries through interactive applications. The company has absorbed numerous acquisitions over the years, resulting in siloed data sources and analytics capabilities. “What we needed to get better at was tying together this broad set of disparate data assets into a central platform where we could shape that data and allow either the applications to access it or a data scientist to operate on top of it very quickly and be able to reach across the complete breadth of that data,” said Rob Carpenter, SVP of technology at CoreLogic. The company turned to Pivotal’s Big Data technologies and Pivotal Cloud Foundry to form the foundation of its Big Data platform, providing data-as-a-service to its data scientists and application developers. This allows CoreLogic to quickly deliver fresh insights to its clients in the field, be it a realtor helping a homebuyer evaluate a property or an insurance underwriter making coverage decisions at an accident site.

These Results Are Only The Beginning Of A Conversation

These are only five examples, representing a fraction of the innovation happening among Pivotal’s Big Data customers. There are hundreds more stories to be told, from customers such as General Electric, Ford, SBI Securities, Argentine financial services company GIRE, China Railways Corporation, and many more. Documenting these stories and use cases is what I’ll be doing in my new role at Pivotal. My goal is to extract the knowledge and lessons learned from our Big Data customers and pass them on to you as your organization embarks on its own Big Data journey. I want to hear from readers, whether you’re currently a Pivotal customer or not, about what topics are important to you, as well as your successes and lessons learned.

Also, let me know what you think about what you read here—this is a conversation, not a lecture. We’ll be looking at a wide range of Big Data topics, from emerging tools and technologies to the application of Big Data across industries, as well as the implications of Big Data on privacy, policy making and society at large.

This is going to be a fun journey for us all. Until next time!

About the Author

Jeff Kelly

Jeff Kelly is a Principal Product Marketing Manager at Pivotal Software. He spends his time learning and writing about how leading enterprises are tapping the cloud, data and modern application development to transform how the world builds software. Prior to joining Pivotal, Jeff was the lead industry analyst covering Big Data analytics at Wikibon, an open source research and advisory firm. Before that, Jeff covered data warehousing, business analytics and other IT topics as a reporter and editor at TechTarget. He received his B.A. in American studies from Providence College and his M.A. in journalism from Northeastern University.
