Software development has gone through a radical shift over the last 15 years. First, agile methodologies completely upended traditional software development practices, allowing developers to write and release code much more frequently. But what good is that if you can’t operationalize all that great software? Not much. So around 2008, Pivotal’s own Andrew Clay Shafer spearheaded the DevOps movement, a new approach to the software development lifecycle that makes developing software and operating it a shared responsibility. The result is continuous software development, deployment, monitoring and improvement.
Unfortunately, the world of data analytics wasn’t included in the revolution. In many enterprises today, it takes weeks, even months to get analytics and data warehouse projects off the ground and into production. This raises the question: Does data need its own DevOps moment? In this episode of Pivotal Insights, host Jeff Kelly speaks with Elisabeth Hendrickson, head of R&D for Pivotal’s data portfolio, about how agile and DevOps might be applied to data analytics, what it would mean from a people and process perspective, and how data analytics technologies would need to evolve to support it.
Visit http://pivotal.io/podcasts for show notes and other episodes.
- Download the episode and check us out on SoundCloud, subscribe to the feed directly, or subscribe on iTunes to have new episodes automatically downloaded for you.
- Twitter: @jeffreyfkelly and @testobsessed
- Feedback: firstname.lastname@example.org
- 5 Observations From Adopting Agile at Pivotal Greenplum
- The History of DevOps, Part 1
- When It Comes To Big Data, Cloud And Agility Go Hand-in-Hand
- Transforming Your Company into a Data Science-Driven Enterprise