FaaS.local - The Benefits of On-Premises FaaS

March 14, 2018 Dan Baskette

Before I delve into the benefits of running FaaS on premises, I need to acknowledge something: this might be the first blog post in the history of serverless and Function-as-a-Service (FaaS) that does not belabor the obvious point that these architectures do indeed use servers. An in-depth discussion of whether or not Jar Jar Binks was a Sith Lord would undoubtedly be a more productive use of time. Unfortunately, that's not the topic at hand. (Answer, btw: of course he was.)

FaaS has seen abundant coverage in the tech press. This programming model is well on its way to becoming a useful developer abstraction. But what about companies that haven't gone all in on cloud computing yet? (Despite what you might have heard, there are still a large number of enterprise shops that just haven't made the plunge.) Are the cloud-provider FaaS implementations purely a cost-reduction play, or are there useful application patterns this technology can serve regardless of where it's hosted? There seems to be some agreement in the industry (minus one particular company) that there are indeed benefits to this paradigm that make it valuable within the four walls of your own data center.

Developer Efficiency and Scaling

One of the unique value props for FaaS in the public cloud is the pay-per-use model. Utility-style pricing may not carry over to the on-premises enterprise data center, but there are still cost savings to be gained from an on-premises deployment. When developers can leverage a FaaS implementation that provides embedded support for event streaming platforms, such as RabbitMQ or Kafka, they no longer have to mess with creating and maintaining connections. This simplifies the development process and can make developers more productive. Sure, this is more of a soft-cost benefit. But it's a benefit nonetheless.

In addition, the automatic 0-n scaling of functions reduces the operational overhead of these environments. How? By enabling more efficient workload consolidation onto the same cluster in your data center. We've moved from physical servers to virtual machines, then from virtual machines to containers. Now, functions scale those containers up and down according to demand. No more pre-provisioning!

Project riff and the forthcoming commercially supported Pivotal Function Service (PFS) provide this abstraction atop Kubernetes. riff takes a function the developer has written, builds a container to run it, and then scales the function as load increases.

riff uses a Kubernetes pod sidecar to connect to the event broker, sending and receiving the events processed by the function container. In this sidecar pattern, a second container in the same pod enhances the capabilities of the primary container. This frees function containers from connecting directly to message brokers; they only need to worry about how to process events. This reduces the code required to provide the needed functionality and, in turn, delivers higher developer productivity.
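To make that division of labor concrete, here is a minimal sketch of the kind of function a developer hands to riff, written as a plain java.util.function.Function (the shape riff's Java support is built around). The class name and payload type are illustrative assumptions; the point is that there is no broker, connection, or scaling code anywhere in it.

```java
import java.util.function.Function;

// All of the event-processing logic lives here. Connecting to Kafka or
// RabbitMQ, acknowledging messages, and scaling from zero are handled by
// the platform and the pod sidecar, not by this code.
public class Uppercase implements Function<String, String> {

    @Override
    public String apply(String event) {
        // The developer's only concern: turn an incoming event into a result.
        return event.toUpperCase();
    }
}
```

Everything in this class is business logic; the broker connections, the sidecar, and the 0-n scaling described above all stay outside the developer's code.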

Improved Quality: Less Code, Fewer Bugs

I met with a customer recently. He said, "We should be writing as little code as possible. Less code means fewer bugs." This statement is why platforms like Pivotal Cloud Foundry have become so popular. These tools abstract away the complexity of application packaging and deployment. A modern platform can connect your code, using service brokers, to a range of add-on capabilities. The same thinking applies to FaaS; this model further abstracts away complexity when processing events.

FaaS enables the developer to focus on the event processing logic. Engineers don’t have to worry about integrating with all kinds of systems to obtain a single event or an event stream.  

FaaS For Your On-Premises Data Flows

Consider the example of a data flow. Every enterprise has some number of data flows they use to get data from a source to a "sink" (the persistence action, not the stainless steel one in your kitchen). These data flows have a variety of use cases, from ETL to IoT. But they all have one thing in common: they process, munge, and filter the data as it flows. At Pivotal, we created Spring Cloud Data Flow (SCDF) for this use case. SCDF is a toolkit for building data integration and real-time data processing pipelines.

Today, development teams use SCDF and Spring Boot apps to process data pipelines. (And it works!) It's easy to imagine how a FaaS-based architecture would do these jobs more efficiently. A much simpler programming model and 0-n scaling could make pipelines less rigid and more performant.
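For contrast, here is a rough sketch of what a single step in such a pipeline looks like today as a Spring Cloud Stream processor, the kind of Spring Boot app that SCDF deploys. The class and method names are illustrative assumptions; the business logic is one line, but it ships inside a full Boot application bound to the broker.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

// One "processor" step in a data pipeline: a complete Spring Boot app
// bound to an input and an output destination on the message broker.
@SpringBootApplication
@EnableBinding(Processor.class)
public class TransformProcessorApplication {

    // The same one-line transform as the riff sketch above, now wrapped
    // in binding annotations and a full Boot runtime.
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String transform(String payload) {
        return payload.toUpperCase();
    }

    public static void main(String[] args) {
        SpringApplication.run(TransformProcessorApplication.class, args);
    }
}
```

Under a FaaS model, the binding, packaging, and scaling concerns move into the platform, and the pipeline step collapses back down to the bare function.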

Perhaps this would signal the end of old-style batch ETL processing and enable a much more agile approach. As Big Data practitioners know, once a huge volume of data lands, it's hard to pick up and move it. If you generate large amounts of log and metrics data (and you store them on-prem), it makes a lot of sense to use FaaS to improve data processing times.

Avoiding Cloud Lock-In

Finally, one obvious benefit to using an on-premises FaaS implementation is avoiding the infamous cloud lock-in. With something like AWS Lambda (or Azure Functions, or Google Cloud Functions), it's not necessarily the FaaS platform itself that causes lock-in. It's more about the surrounding application services that are required to make it work. Often, you need to commit to the larger cloud provider's ecosystem to get the true value of the functions service. When deploying a cloud-agnostic FaaS, the event delivery and long-term storage options are more flexible, and can align with what is already in use within the enterprise.

Despite the lack of a charge-per-use model, there are still efficiency benefits to be gained from on-premises FaaS. In fact, if the only benefit of cloud-based FaaS were cost avoidance, this new paradigm would not have seen the surge of interest that it has. While I haven't covered all the benefits of deploying FaaS within your own data center (there are many more), I have tried to focus on a few concrete examples that exist in today's enterprise environments.

Know of some benefits to on-premises FaaS that I missed? Add them in the comments below!

 

About the Author

Dan Baskette

Dan is Director of Technical Marketing at Pivotal with over 20 years of experience in various pre-sales and engineering roles at Sun Microsystems, EMC Corporation, and Pivotal Software. In addition to his technical marketing duties, Dan is frequently called upon to roll up his sleeves for various "Will this work?" projects. Dan is an avid collector of Marvel Comics gear, and you can usually find him wearing a Marvel shirt. In his spare time, Dan enjoys playing tennis and hiking in the Smoky Mountains.
