Alpine Data Labs Brings Predictive Analytics to Where Data Lives

September 7, 2012 Paul M. Davis

For all the promise of predictive analytics, realizing that potential can be elusive. Moving data is a cumbersome process, and modeling is typically done with complex, standalone applications, forcing those tasked with developing predictive models to work with limited data at the edges of the business. It’s a process marked by what Steven Hillion, Chief Product Officer at Alpine Data Labs, describes as “a lot of disjoint activity, gaps in processing, and different teams.”

In a traditional analytics environment, he says, “first of all, you have to find the data. You have to go out into the world, figure out where it lives, and then you have to move it into your local environment. You get a sample or an extract and start your analysis, and then you find you don’t have enough data, so you go back for more, but that’s painful and slow, so instead you go with this one-shot model approach. And then to use your models, to do something useful, you have to take what you created on your desktop and someone has to convert them to run in production, often manually.”

The bottlenecks that arise can lead to analytics projects that take weeks or months, when businesses need predictive insight now.

A new partnership between Greenplum, a division of EMC, and Alpine Data Labs aims to address these bottlenecks by tightly integrating Alpine’s predictive analytics application suite with Greenplum’s platform, allowing analysts to build and test models where their data lives rather than the other way around.

“We try to take the processing to the data,” Hillion says. By doing this, “the obstacles of moving data, restricting yourself to samples, deploying models, converting code, and setting up a separate environment go away.” It’s an approach that he describes as being far more iterative and agile. “You can constantly iterate and refine your model,” Hillion says, “see how it works, refine it, deploy it, see what happens when new data comes in, refine it again.” All the while, “you’re learning from your results,” he explains.
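The "take the processing to the data" idea can be illustrated with a minimal sketch. Rather than extracting rows to a desktop tool, the heavy aggregation runs inside the database and only a handful of summary numbers come back. The example below is purely illustrative and not Alpine's actual implementation: SQLite stands in for an MPP database like Greenplum, and the `loans` table and its columns are hypothetical.

```python
import sqlite3

# Hypothetical in-database workflow: SQLite stands in for a warehouse
# like Greenplum; the table and column names are invented for the demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (balance REAL, loss REAL)")
conn.executemany("INSERT INTO loans VALUES (?, ?)",
                 [(1.0, 2.1), (2.0, 3.9), (3.0, 6.1), (4.0, 8.0)])

# One round trip returns five sufficient statistics for a simple
# least-squares fit (loss ~ a + b * balance), however large the table is.
n, sx, sy, sxx, sxy = conn.execute(
    "SELECT COUNT(*), SUM(balance), SUM(loss), "
    "SUM(balance*balance), SUM(balance*loss) FROM loans"
).fetchone()

b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # fitted slope
a = (sy - b * sx) / n                          # fitted intercept
print(round(a, 2), round(b, 2))  # → 0.05 1.99
```

Because only the aggregates cross the wire, the model can be refit against the full, current dataset on every iteration, which is the agility Hillion describes.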

Today’s announcement solidifies and extends the close partnership between Greenplum and Alpine Data Labs that has existed since the company’s inception. The Greenplum database was in fact Alpine’s first platform, a reflection of Greenplum’s early innovations in in-database analytics. The Alpine founders recognized that the advanced capabilities of the platform meant that many of Greenplum’s customers wanted to use the database “for much more than just B.I. reports and dashboards and so on,” Hillion says.

Hillion points to one of Alpine’s clients, Zions Bancorporation, a U.S. bank, as an example of the impact of such integration in a practical environment. “We took low-level data and came up with some models for how loans reacted to changing economic conditions,” he says. “We were able to ask, ‘how would a loan portfolio stand up if, say, unemployment was to go up again?’ It allowed the bank to build these models at a much greater level of accuracy than they had done before, and rebuild the models on a continual basis to meet the latest regulatory requirements and learn more about their loan portfolios.”

As many of Greenplum’s customers embraced Alpine’s solution, both companies worked closely together to ensure its smooth implementation with Greenplum’s products. The new partnership between Greenplum and Alpine Data Labs further reduces the barriers to entry.

“When you get Greenplum,” Hillion says, “Alpine will be already there. There’s now no excuse for not making the maximum value out of your data. I think that some people are still leery of these predictive projects, or just assume they’re going to take months, and we want to make them as easy as possible. You can get an analytics environment spun up quickly using the infrastructure and data you already have.”

The ultimate goal is to reduce friction and error, and to deliver the value of predictive analytics accessibly and quickly. “It’s all about not treating advanced analytics as this big scary thing requiring PhDs and a big cumbersome architecture,” Hillion says. Instead, the goal is to allow enterprises to “look at an existing business problem and get results this week.”
