PaaS vs FaaS? Which should I run my microservices on?
We all know that microservices are distributed processes that must be independently releasable, deployable and scalable. At first glance, both Platform-as-a-Service (PaaS) and Function-as-a-Service (FaaS), a.k.a. Serverless, seem capable of delivering on that description. Both cloud computing models are also able to support incredibly short lead times during software development, which promotes innovation and continuous experimentation.
However, when we delve deeper into their technical details, we soon realise they are not always suited to the same use cases.
PaaS
Platform-as-a-Service is the cloud model where you provide your source code and the platform will package, release, provision, deploy, run, monitor and scale out/in your microservices. The best examples I can think of are Cloud Foundry, Heroku and Google App Engine.
Your app will always have at least one instance running on PaaS. This comes in handy where persistent connections are required for implementing push notifications via SSE (Server-Sent Events), WebSockets or RSocket. There are many other benefits as well, for example promptly processing incoming requests, keeping reference data in memory (a.k.a. an in-process data cache), implementing the circuit breaker pattern for handling partial failures, or leveraging connection pooling for throttling workload and reducing response time.
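To make that concrete, here is a minimal TypeScript/Node.js sketch of the kind of long-lived process a PaaS keeps running for you: it holds open SSE connections and an in-process cache, neither of which would survive a function that is disposed of after each invocation. The route, port and cache contents are illustrative, not taken from any particular platform.

```typescript
// A minimal sketch (hypothetical service, not from any particular PaaS):
// an always-on process can hold long-lived SSE connections and an
// in-process reference-data cache across requests.
import { createServer, ServerResponse } from "node:http";

const referenceData = new Map<string, string>(); // in-process cache, survives between requests
const subscribers = new Set<ServerResponse>();   // currently open SSE connections

createServer((req, res) => {
  if (req.url === "/events") {
    // Server-Sent Events: keep the connection open so we can push updates later
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });
    subscribers.add(res);
    req.on("close", () => subscribers.delete(res));
  } else {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ cachedEntries: referenceData.size }));
  }
}).listen(8080);

// Push a notification to every connected client; this only works because the
// process (and its open connections) outlives any single request.
setInterval(() => {
  const event = `data: ${JSON.stringify({ at: new Date().toISOString() })}\n\n`;
  subscribers.forEach((res) => res.write(event));
}, 10_000);
```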
FaaS
Function-as-a-Service implies a computing model where your code is packaged by the platform and run on demand in response to some configurable event (e.g. an HTTP request, message arrival or file upload), for a limited amount of time, and may be disposed of at any time afterwards. Good representatives here are AWS Lambda, Azure Functions and Google Cloud Functions.
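As a rough sketch of what this looks like in practice, the snippet below shows a minimal AWS Lambda handler in TypeScript, assuming an API Gateway HTTP trigger and the @types/aws-lambda type definitions; the response body is made up for illustration.

```typescript
// A minimal sketch of the FaaS model on AWS Lambda (TypeScript), assuming an
// API Gateway HTTP trigger and the @types/aws-lambda type definitions.
// The platform packages this handler and invokes it only when the configured
// event arrives; nothing stays running in between invocations.
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Each invocation receives the event payload and must finish within the
  // configured timeout; the execution environment may be recycled afterwards.
  const name = event.queryStringParameters?.name ?? "world";
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}` }),
  };
};
```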
We can assemble our apps out of multiple functions, but each one needs to be configured and deployed individually. That's why FaaS is sometimes called Nanoservices. This computing model has some interesting consequences:
Consider the chart below, which compares lines of code for projects implemented with the Serverless framework (Lambda + API Gateway) against a project implemented in pure Node.js. For each non-trivial route (piece of functionality) added to a software system, the number of lines of configuration code needed to maintain the project grows at a steep linear rate when using a serverless architecture. Someone has to maintain all of that... In short, for a short term gain, serverless architectures mortgage the future.
source: https://twintechinnovations.wordpress.com/author/twintechinnovations
source: https://www.bbva.com/en/economics-of-serverless/
N.B. The above graphics compare Lambdas with heavyweight VMs, not with PaaS, which had been using containers long before Kubernetes came into existence. Taking Cloud Foundry as a reference, I have seen customers running 20+ different microservices per VM in highly regulated and very stringent production environments. That means the previous breakeven would happen at 5 requests per second on average for platforms underpinned by m4.4xlarge VMs.
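A back-of-the-envelope formula shows why packing density moves the breakeven point; the symbols below are mine, not figures taken from the articles above.

```latex
% Symbols are mine, not figures from the articles above:
%   C_{vm}      : hourly cost of one VM
%   N           : number of microservices packed onto that VM
%   c_{\lambda} : all-in FaaS cost per request (request fee + GB-seconds)
%   r^{*}       : breakeven request rate per microservice (requests/second)
\[
  \underbrace{r^{*} \times 3600 \times c_{\lambda}}_{\text{hourly FaaS cost per service}}
  \;=\;
  \underbrace{\frac{C_{vm}}{N}}_{\text{per-service share of the VM}}
  \quad\Longrightarrow\quad
  r^{*} \;=\; \frac{C_{vm}}{3600\, N\, c_{\lambda}}
\]
```

With 20+ services sharing a VM, the per-service share of the VM cost shrinks by the same factor, which is what pulls the breakeven down to the low single-digit requests per second mentioned above.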
The AWS Lambda function also needs almost double the amount of time to write data to and read data from S3.
Developer experience
I have been witnessing colleagues advocating for FaaS, and some companies moving wholesale into it, as a means to avoid the painful process of building and maintaining a large number of container images and orchestrating them across various environments.
I couldn't agree more with the much-needed idea of abstracting away from developers the burden of managing infrastructure. However, we have seen that both PaaS and FaaS are able to handle the undifferentiated heavy lifting on behalf of developers, including packaging, deploying and auto-scaling applications as well as managing security, routing and log aggregation.
There is no need to adopt FaaS just to avoid the complexity caused by running containers at scale.
If your goal is solely to improve developer experience, you may well find that PaaS fits your needs with lower complexity and in a less intrusive way than FaaS. I believe this idea is behind the growing adoption of the Digital Platform model.
A digital platform is a foundation of self-service APIs, tools, services, knowledge and support which are arranged as a compelling internal product. Autonomous delivery teams can make use of the platform to deliver product features at a higher pace, with reduced coordination.
Wrapping up
Now that the hype around Serverless seems to be winding down (one can check here and here), I reckon that each model has its pros and cons and both are here to stay. As always, there seems to be no one-size-fits-all solution for the challenges faced when moving our workloads to the cloud. A hybrid approach might help us reap the best outcomes.
My stance is currently as follows:
I would advise taking into consideration your own requirements and circumstances before jumping on any bandwagon, whatever it is, perhaps even running some experiments, which is one of the greatest benefits offered by both of these cloud computing models.