How to Deploy Microservices Using Serverless Architecture?
Abid Anjum
Senior Full Stack Developer Architect | Java Spring Boot Microservices | Angular & React | Mobile Apps Engineer Android iOS Flutter | ASP.NET Core C# | BI Expert | GIS Expertise
Monoliths vs. Microservices
Whereas monolithic applications are built and deployed as one holistic unit, microservice-based applications consist of multiple small services that can be scaled and managed independently of one another. For example, an e-commerce application may include different microservices for the product catalogue, checkout workflow, and shipping process, and each microservice may use its own language, database, and libraries. This design paradigm increases an application’s resilience, as an error in one service won’t necessarily affect others. Microservices can be hosted on containers, virtual machines, on-premises servers, or serverless functions.
Defining Microservices and Serverless Architecture
Microservice architecture involves splitting up monolithic applications into a set of independent, loosely coupled services that can be independently deployed, owned and managed by small teams, and organized around business capabilities. The microservices architecture allows the frequent, rapid and reliable delivery of complex applications.
By assigning each microservice a specific task, a business gains agility and can change or update a microservice as needed. Individual services can also be removed or replaced without negatively impacting the overall workflow of the application.
When an application relies on a single service instance, a failure of that instance has a catastrophic effect on the application as a whole. And with the growing use of cloud services and platforms, the traditional dependency on physical infrastructure is no longer ideal.
With the advent of orchestration tools, developers can provision microservices without having to understand the underlying infrastructure. These orchestration systems also let teams describe resource usage through metadata. But that doesn’t address every issue related to microservices.
Eliminating these problems requires complete abstraction from the physical infrastructure, and that’s where serverless architecture comes in.
Serverless (Function-as-a-Service) is a cloud-native model that allows developers to build and deploy applications without managing servers. Serverless doesn’t mean there are no servers; there are, but they are abstracted away from application development. In serverless, the cloud provider handles the provisioning, scaling and maintenance of the infrastructure. All developers have to do is package their code for deployment. Applications execute parts of their code on demand, with functions invoked by event triggers.
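To make this concrete, here is a minimal sketch of what such a function can look like on AWS Lambda in Java. It assumes the aws-lambda-java-core library, and the class name and input shape are purely illustrative:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

import java.util.Map;

// Minimal sketch of a serverless function: the platform deserializes the
// incoming event into a Map, calls handleRequest, and serializes the return
// value. The code contains no server, port or scaling logic of its own.
public class GreetingHandler implements RequestHandler<Map<String, String>, String> {

    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        String name = event.getOrDefault("name", "world"); // illustrative input field
        context.getLogger().log("Invoked with name=" + name);
        return "Hello, " + name + "!";
    }
}
```

Everything around this handler, such as provisioning, scaling and patching the machines it runs on, is the cloud provider’s responsibility.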
The Serverless Approach and Its Benefits for Microservices
The serverless concept refers to the automation of all services on a platform. It offers not only highly responsive resource scaling but also event-driven interactions. Using serverless architecture, microservices are deployed in the cloud environment offered by cloud service providers, which use virtual machines or containers to isolate the services. Events such as an API request or a file upload trigger the code, and when the assigned task is completed, the function goes idle until it is triggered again.
Development teams can run multiple instances of similar services across different data centers with serverless architecture. A serverless service is independent of the storage, the network or the CPU.
Serverless Deployment Provider
There are different serverless deployment environments, such as AWS Lambda, Azure Functions, and Google Cloud Functions, that developers can use for their microservices applications. Although each of them offers similar functionality, AWS Lambda provides the richest feature set, which we will discuss shortly.
Use Cases of Serverless Microservices
Serverless microservices inherit the advantages of serverless architecture, including improved cost-efficiency and reduced operational overhead. Their real strength, however, lies in combining managed services with serverless functions.
This approach lets you easily integrate message queues, databases and API management tools with your functions. Once you have combined resources and functions in one microservice, they can also serve as the basis of other microservices.
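As a sketch of this kind of composition, the handler below stores an incoming order in a hypothetical DynamoDB table named orders. It assumes the AWS SDK for Java v2 and aws-lambda-java-core; the table and field names are illustrative:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

import java.util.Map;

// The function itself is stateless; the managed database holds the state.
public class CreateOrderHandler implements RequestHandler<Map<String, String>, String> {

    // Created outside handleRequest so a warm function instance can reuse it.
    private final DynamoDbClient dynamoDb = DynamoDbClient.create();

    @Override
    public String handleRequest(Map<String, String> order, Context context) {
        PutItemRequest request = PutItemRequest.builder()
                .tableName("orders") // hypothetical table name
                .item(Map.of(
                        "orderId", AttributeValue.builder().s(order.get("orderId")).build(),
                        "product", AttributeValue.builder().s(order.get("product")).build()))
                .build();
        dynamoDb.putItem(request);
        return "order " + order.get("orderId") + " stored";
    }
}
```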
Serverless microservices are a good fit for complex, modern applications that need to be easily managed and scaled. If you can divide your application into small, independent services that map to event-driven, short-running tasks, serverless microservices are a strong candidate.
However, if your application receives continuous load or runs long-running tasks, it may be better suited to a monolithic architecture.
Serverless Microservices Using AWS Lambda
AWS Lambda integrates securely with API Gateway. The ability to make asynchronous calls from API Gateway to Lambda allows the creation of serverless microservice-based applications.
Before deploying microservices on AWS Lambda, you package the Java, Node.js or Python code of each service in a ZIP file and upload it to AWS Lambda. You then specify the name of the handler function along with resource limits such as the amount of memory available to the function.
When an event occurs, AWS Lambda finds an idle function instance (launching a new one if necessary) and invokes the handler function. It runs only as many instances as are needed to handle the current load.
Under the hood, Lambda uses containers to isolate each function instance and runs those containers on EC2 instances.
There are several ways to invoke a Lambda function. The first is in response to an event generated by another AWS service, for example (an S3-triggered handler is sketched after this list):
The creation of an object in an S3 bucket
The creation, update or deletion of an item in a DynamoDB table
A new record arriving on a Kinesis stream
An email received through the Simple Email Service (SES)
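As a sketch of the first of these event sources, the handler below reacts to new objects in an S3 bucket. It assumes the aws-lambda-java-events library; the processing step is left as a placeholder:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

// Invoked by S3 whenever an object is created in the configured bucket.
public class ObjectCreatedHandler implements RequestHandler<S3Event, Void> {

    @Override
    public Void handleRequest(S3Event event, Context context) {
        event.getRecords().forEach(record -> {
            String bucket = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey();
            context.getLogger().log("New object: s3://" + bucket + "/" + key);
            // ...process the object here, e.g. resize an image or index a document
        });
        return null;
    }
}
```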
You can also configure Amazon API Gateway to route HTTP requests to your Lambda function. API Gateway converts each HTTP request into an event object, invokes the Lambda function, and creates an HTTP response from the function’s result.
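A minimal sketch of such an HTTP-facing function, assuming the proxy integration event types from aws-lambda-java-events (the route, path parameter and response body are illustrative):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent;

import java.util.Map;

// API Gateway turns the HTTP request into the request event, and turns the
// returned response event back into an HTTP response.
public class GetProductHandler
        implements RequestHandler<APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent> {

    @Override
    public APIGatewayProxyResponseEvent handleRequest(APIGatewayProxyRequestEvent request,
                                                      Context context) {
        Map<String, String> pathParams = request.getPathParameters();
        String productId = pathParams == null ? "unknown" : pathParams.getOrDefault("id", "unknown");

        return new APIGatewayProxyResponseEvent()
                .withStatusCode(200)
                .withHeaders(Map.of("Content-Type", "application/json"))
                .withBody("{\"productId\":\"" + productId + "\"}");
    }
}
```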
Another way to invoke a Lambda function is through the AWS Lambda web service API. Your application passes a JSON object to the function and receives the function’s result, also as JSON, from the web service.
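A sketch of this direct invocation path using the AWS SDK for Java v2; the function name create-order and the JSON payload are hypothetical:

```java
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.lambda.LambdaClient;
import software.amazon.awssdk.services.lambda.model.InvokeRequest;
import software.amazon.awssdk.services.lambda.model.InvokeResponse;

// Calls a deployed Lambda function directly, passing and receiving JSON.
public class LambdaCaller {

    public static void main(String[] args) {
        try (LambdaClient lambda = LambdaClient.create()) {
            InvokeRequest request = InvokeRequest.builder()
                    .functionName("create-order") // hypothetical function name
                    .payload(SdkBytes.fromUtf8String("{\"orderId\":\"42\",\"product\":\"book\"}"))
                    .build();

            InvokeResponse response = lambda.invoke(request);
            System.out.println(response.payload().asUtf8String());
        }
    }
}
```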
The last way to invoke a Lambda function is through a cron-like mechanism, where you tell AWS to invoke the function periodically on a schedule you define.
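A handler wired to such a schedule might look like the sketch below, assuming the ScheduledEvent type from aws-lambda-java-events; the periodic work itself is a placeholder:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.ScheduledEvent;

// Invoked on the schedule you configure (a rate or cron expression).
public class CleanupHandler implements RequestHandler<ScheduledEvent, Void> {

    @Override
    public Void handleRequest(ScheduledEvent event, Context context) {
        context.getLogger().log("Scheduled run at " + event.getTime());
        // ...do the periodic work here, e.g. purge expired sessions
        return null;
    }
}
```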
Charges are defined per function invocation: you pay based on execution duration and the amount of memory allocated to the function, with duration traditionally billed in 100 ms increments.
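As a rough, illustrative calculation only (the rates are assumptions made for this sketch, so check current AWS pricing for your region), a function allocated 512 MB that runs for 300 ms per invocation consumes 0.15 GB-seconds per call:

```java
// Back-of-the-envelope Lambda cost sketch with illustrative, assumed rates.
public class LambdaCostEstimate {

    public static void main(String[] args) {
        long invocations = 1_000_000L;   // assumed monthly invocations
        double durationSeconds = 0.3;    // assumed average duration per invocation
        double memoryGb = 0.5;           // assumed 512 MB of allocated memory

        double gbSeconds = invocations * durationSeconds * memoryGb; // 150,000 GB-s
        double computeCost = gbSeconds * 0.0000166667;               // assumed $ per GB-second
        double requestCost = (invocations / 1_000_000.0) * 0.20;     // assumed $ per 1M requests

        System.out.printf("GB-seconds: %.0f, estimated cost: $%.2f%n",
                gbSeconds, computeCost + requestCost);
    }
}
```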
Challenges of Serverless Microservices
Some of the challenges of serverless deployment of microservices are:
Significant limitations and constraints
A serverless environment usually imposes more constraints on microservice deployment than container-based or VM-based infrastructure. For instance, AWS Lambda supports only a limited set of languages and is only suitable for deploying stateless applications that run in response to a request. It does not let you deploy long-running, stateful applications such as a message broker or a database.
Applications Often Don’t Start Up Quickly
If your application takes a long time to start, serverless deployment may not be an ideal choice: function instances are launched on demand, so slow startup translates directly into request latency (cold starts).
Input Resources are Limited
AWS Lambda only responds to requests from a limited set of input sources. It is not intended to run services that, for example, subscribe to a message broker such as RabbitMQ or Apache ActiveMQ.
Risk of high latency
The time taken to provision or initialize a function instance can result in high latency. Moreover, the serverless environment only reacts to load increases; capacity cannot be pre-provisioned, which often means higher application latency during a sudden, large spike in load.
Monitoring Serverless Microservices
To monitor serverless microservices, there are numerous tools available on the market, including:
Riemann uses a stream processing language to aggregate application and host events, send email notifications for exceptions, and track the application’s latency distribution.
Datadog offers a single pane of glass for monitoring microservice dependencies. With its serverless monitoring, you can analyze your functions and the resources they invoke.
LightStep is another tool for monitoring serverless microservices. It allows development teams to identify and resolve regressions regardless of the application’s scale and complexity.