Serverlessing like a Pro!
Cover image created using Microsoft Designer

If you are starting off your cloud journey or have already embarked on one, you must have come across the term “Serverless”!

But what does it mean? It sounds like no servers are involved, but in reality there are; the cloud providers have simply encapsulated those bits so that developers can focus on their code and don’t have to worry about the underlying infrastructure!

You basically write focused chunks of code called functions, which are event-driven: they are called, or invoked, when an event happens, such as an API call or a data update.

So let’s explore creating serverless functions in all three major cloud platforms — GCP, Azure and AWS!

We will create HTTP-triggered serverless functions in all 3 platforms, in 3 different frameworks. An HTTP trigger means the function is invoked when we call it via its URL, i.e. by making an HTTP request.
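To make “invoke via its URL” concrete, here is a small self-contained Python sketch. Nothing here is provider-specific: the handler, port and greeting are stand-ins invented for illustration. It starts a throwaway local HTTP server and calls it the same way you would later call a deployed function:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from urllib.request import urlopen

# A stand-in "serverless" handler: greets by the ?name=... query parameter.
class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        name = query.get("name", ["There"])[0]
        payload = f"Hello {name}!".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to port 0 so the OS picks a free port, then serve in the background.
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# "Invoking" the function is just an HTTP request to its URL.
url = f"http://127.0.0.1:{server.server_address[1]}/?name=Lovee"
resp = urlopen(url).read().decode()
print(resp)  # Hello Lovee!
server.shutdown()
```

The real functions we build below behave the same way from the caller’s side: a URL in, a greeting out.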

We will create these functions locally, run them and deploy them via CLI. Here’s the GitHub repository with all the demos we will create!


GCP Cloud Function

Let’s start with creating a Cloud Function in GCP! Before you do that, you obviously need access to the GCP console, the Google Cloud CLI installed, and the Cloud Build API, Artifact Registry API and Cloud Functions API enabled in the console. You can follow their documentation to complete all the steps.

Instead of cloning the repo in the documentation, we will initialise our new project.

Go to your desired directory and create a folder called serverless-demos. Within that create another folder called gcp-demo. Next go to that folder and do:

npm init        

Keep all the settings as default, and you have now initialised your Node.js project. Open it in your preferred editor and create an index.js file, as that is your entry point.

We want to start with a very simple scenario: a function that takes your name as input and returns a string concatenated with that name. First install the Functions Framework dependency with npm install @google-cloud/functions-framework. Here’s the code for the same:

const functions = require('@google-cloud/functions-framework');

// Register an HTTP function with the Functions Framework that will be executed
// when you make an HTTP request to the deployed function's endpoint.
functions.http('helloYou', (req, res) => {
    // Extract the name from the request body or query parameter.
    let name = req.body.name || req.query.name || 'There';
    res.send('Hello ' + name + '!');
});        
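For readers who prefer pseudocode, the fallback order in helloYou (request body first, then query string, then a default) can be restated as a tiny Python sketch; hello_you here is a hypothetical helper for illustration, not part of the Functions Framework:

```python
# Mirrors `req.body.name || req.query.name || 'There'` from the JS function:
# request body first, then query string, then the default "There".
def hello_you(body: dict, query: dict) -> str:
    name = body.get("name") or query.get("name") or "There"
    return f"Hello {name}!"

print(hello_you({"name": "Lovee"}, {"name": "You"}))  # Hello Lovee!
print(hello_you({}, {"name": "You"}))                 # Hello You!
print(hello_you({}, {}))                              # Hello There!
```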

Now to run the function locally, you need to add a start script in your package.json:

"scripts": {
  "start":"functions-framework --target=helloYou --signature-type=http",
  "test": "echo \"Error: no test specified\" && exit 1"
}        
Pro Tip: For hot-reloading, i.e. automatically restarting the Node application when changes are made, you can add another script (this assumes nodemon is installed, e.g. via npm install --save-dev nodemon):
"dev": "nodemon --exec 'functions-framework --target=helloYou --signature-type=http'",

Now when you run,

npm start        

This will run the function locally on port 8080, and when calling it with your name, you will get the desired response looking like this:

Google Cloud Function running locally

You can run it in dev mode, make changes and play around more! When you are ready to deploy, make sure you have initialised the gcloud CLI.

You can then use the gcloud functions deploy command to deploy your cloud function:

gcloud functions deploy helloYou \
--gen2 \  
--region=australia-southeast2 \
--runtime=nodejs18 \
--trigger-http        

Once deployed you will get a URL that you can use to invoke your function :) You can also see your new function in the GCP console with all the details and logging!!

Done, too easy!


Azure Function App

While developing locally via terminal is fun, using code editor extensions can help you accelerate your development and hence for the sake of this example, we will use VS Code and the Azure Functions Extension.

Again, assuming you already have access to Azure and an active subscription, you’ll also need the Azure Functions Core Tools, the C# extension and the .NET CLI installed, since we will be deploying a function app in C#. You can find the prerequisites and instructions here. Don’t forget to install the Azure Functions Extension.

Once you have done that, all you need to do is create another folder in your serverless-demos directory, called azure-demo.

Next, using the Command Palette (Ctrl/Command+Shift+P), search for Azure Functions: Create Function... and select it. This will help you create your function step by step: specify the azure-demo folder, C# as the language, a .NET runtime (I have .NET 6.0) and the HTTP trigger template. Name your function helloYou for consistency. You can provide the default namespace or anything you like, and allow anonymous access.

This will create a new Azure Functions project in C#. The HTTP-triggered function is all ready. You can update the response message to replicate the behaviour of the GCP one, i.e. show “Hello World” when no name is available in either the request body or query parameters, else show “Hello whatever_name_you_provide”.

Your helloYou.cs might look something like this:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace hello.you
{
    public static class helloYou
    {
        [FunctionName("helloYou")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger function processed a request.");

            string name = req.Query["name"];

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);
            name = name ?? data?.name;

            string responseMessage = string.IsNullOrEmpty(name)
                ? "Hello World"
                : $"Hello {name}";

            return new OkObjectResult(responseMessage);
        }
    }
}        
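One subtle difference from the GCP version: here the query string is checked first and the body is the fallback. A small Python restatement makes the precedence easy to see (hello_you is a hypothetical helper for illustration, not part of the Azure SDK):

```python
# Mirrors the C# function: query string takes precedence, then the body,
# then the "Hello World" default.
def hello_you(query, body):
    name = query.get("name") or (body or {}).get("name")
    return f"Hello {name}" if name else "Hello World"

print(hello_you({"name": "You"}, {"name": "Lovee"}))  # Hello You
print(hello_you({}, {"name": "Lovee"}))               # Hello Lovee
print(hello_you({}, None))                            # Hello World
```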

You can run the function by simply going to Run in VS Code and it will run locally.

Pro Tip: If you are considering future development projects with Azure, you may be better off with Visual Studio, as it has better support for C# and Azure development and also offers hot reload.

But if you wish to hot reload your changes, you can try to add the following to your .csproj file that got created with the function.

<Project>
...
  <Target Name="RunFunctions">
    <Exec Command="func start" />
  </Target>
...
</Project>        

And then run the following command in the terminal:

dotnet watch msbuild /t:RunFunctions        

It takes a moment, but it reloads the function as you make changes. Here’s what the output looks like:

Azure Function App running locally

Now that we have finalised it, let’s deploy it to Azure using the Azure CLI commands. Make sure you have installed the Azure CLI.

Now, it is a bit different in the Azure world when it comes to deploying the function you created locally. You first create a resource group, which is like a logical container to group related resources together. Next, every function app needs a storage account to maintain its state and files, so you create one of those as well. So let’s do these:

First login:

az login        

Next, create a resource group

az group create --name StartWithAzure --location australiasoutheast        

You can list the locations and pick the one closest to you using:

az account list-locations        

Next, create a storage account. The name must be globally unique, so you may use your name or anything else to make it unique.

az storage account create --name loveefunctionstorage --location australiasoutheast --resource-group StartWithAzure --sku Standard_LRS --allow-blob-public-access false        

Next, create a Function App in Azure, on which we will deploy the function we created earlier (function app names must also be globally unique, so adjust helloYou if the name is taken):

az functionapp create --resource-group StartWithAzure --consumption-plan-location australiasoutheast --runtime dotnet --functions-version 4 --name helloYou --storage-account loveefunctionstorage        

This also creates Application Insights for you to be able to view logs and metrics.

Finally, let’s deploy the function:

func azure functionapp publish helloYou        

Once deployed you’ll get a URL to invoke the function and you can also view the resources you created and the function in Azure Portal :)


AWS Lambda Function

Developing and deploying AWS Lambda Functions shouldn’t be too different than the others, but you do need the following before you start:

  • Obviously an active AWS account.
  • AWS CLI and SAM CLI installed and configured
  • Docker Desktop, used to run and test functions locally

Also, we will be creating the Lambda in Python, so make sure you have Python installed as well.

You need to be logged in to your AWS account in the CLI:

aws configure        

Back in our serverless-demos repo, let’s open a terminal and run:

sam init        

This will take you through a wizard, so select the following:

  • AWS Quick Start Templates
  • Select Hello World Example
  • A Python runtime
  • No X-ray Tracing
  • Yes for CloudWatch Insights
  • Set the project name as aws-demo

This will create the sample project with a HelloWorldFunction. It will also create a template.yaml, which is basically the infrastructure-as-code template for your function and the resources needed to deploy it.

The function code is in app.py, within the hello-world/ directory. The function simply returns the string “hello world” upon invocation. Let’s change it so that it returns “Hello your_name” if a name is available in either the request body or query parameters, and “Hello World” otherwise.

Replace everything from the return statement onwards with the following:

name = event["queryStringParameters"]["name"] if event["queryStringParameters"] is not None and "name" in event["queryStringParameters"] else None
if name is None:
    body = json.loads(event["body"]) if event["body"] is not None else None
    name = body["name"] if body is not None and "name" in body else "World"
return {
    "statusCode": 200,
    "body": json.dumps({
        "message": "Hello " + name
    }),
}        
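To sanity-check the precedence above (query string first, then body, then “World”), the same logic can be pulled into a small helper; extract_name is a hypothetical name for illustration, not part of the SAM template:

```python
import json

# Same extraction logic as the Lambda handler, isolated so the precedence
# is easy to verify: query string first, then the JSON body, then "World".
def extract_name(event: dict) -> str:
    qs = event.get("queryStringParameters")
    name = qs["name"] if qs is not None and "name" in qs else None
    if name is None:
        body = json.loads(event["body"]) if event.get("body") is not None else None
        name = body["name"] if body is not None and "name" in body else "World"
    return name

print(extract_name({"queryStringParameters": {"name": "You"}, "body": '{"name": "Lovee"}'}))  # You
print(extract_name({"queryStringParameters": None, "body": '{"name": "Lovee"}'}))             # Lovee
print(extract_name({"queryStringParameters": None, "body": None}))                            # World
```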

Now there are two ways to invoke this function. Let’s check out the first one.

You can invoke the function once, providing an event defined in event.json in the events/ folder. You will already have event.json populated with a sample input to your function, and you can update it to pass in the right parameters. Or you can simply create a new myevent.json with the following (note that when both are provided, the query parameter wins, per our code above):

{
    "body": "{\"name\": \"Lovee\"}",
    "queryStringParameters": {
        "name": "You"
    }
}        

cd into the aws-demo directory and then call:

sam local invoke "HelloWorldFunction" -e events/myevent.json

This will invoke your function once and give you output in the terminal like the following:

Invoking Lambda Function once locally with myevent.json

Pro tip: Predefined events are good when you have a trigger other than HTTP. For HTTP-triggered functions, I prefer running them locally with hot-reload so that I can call them via Postman. Let’s try that now.

For hot-reloading and calling the function as many times as you like, you can use the following command:

sam local start-api        

This will give you a url to invoke your function. Just call it with a name in either request body or query params and it should work. You can make changes and it will automatically reload. Here’s the output:

Lambda Function running locally with name in request_body

Now that we are happy with our lambda, it is time to deploy it! You can run:

sam validate        

which validates the template.yaml and then run:

sam deploy --guided        

This will take you through a series of questions wherein you provide the stack name, region, etc. It will also create an IAM role, and we will allow the function to be invoked without authentication. Once ready, it will save this configuration to a .toml file and then create a changeset with all the required resources to deploy.

If you get any errors with your AWS account, you can reconfigure it and rerun the deployment with the following command, reusing the config file you created earlier:

sam deploy --config-file samconfig.toml        

Once deployed you’ll get the URL to invoke the lambda function and you can also view the resources you created in the AWS console. :)


All done! Kudos on creating and deploying serverless functions on three different platforms, in three different runtimes and three different languages!

Just an FYI, you will find the VS Code extensions for all 3 cloud platforms pretty handy. They can help accelerate your development, so definitely give them a go!

And don’t forget to check out the GitHub repository with all the code.

