Alpine Data Labs lowers the barrier to entry for enterprises that want to realize insights from predictive analytics. For companies lacking the necessary infrastructure or data science skills, Alpine Data Labs offers platforms and services that enable sophisticated modeling of their available data. Two of the major bottlenecks these companies face are moving data into a local environment and securing sufficient hardware to test models against terabytes of data. To address these challenges, Alpine Data Labs turned to Pivotal Analytics Workbench, a 1000-node Hadoop cluster offering free access in 90-day periods.
In an interview with Cindy Waxer at DataInformed, Steven Hillion, Chief Product Officer at Alpine Data Labs, explains that “For a small company like ours, it’s difficult to get your hands on a very large Hadoop cluster to test and see if our models can scale for very large datasets…Pivotal AWB has been great in helping us test out our software before it gets into production at our customer sites on their live data sets.”
The use cases for a sandbox Hadoop environment such as Pivotal Analytics Workbench are numerous. As Alpine Data Labs has demonstrated, the effectiveness and efficiency of models can be tested against "billions of rows of data and hundreds of columns" without straining limited existing resources. Small organizations that lack the infrastructure to power a large Hadoop cluster can still perform predictive analytics on their datasets. The service is also well suited to short or limited analytics engagements.
To learn more about how Alpine Data Labs is using Pivotal Analytics Workbench, read the full article at DataInformed.
About the Author
Paul M. Davis