Build A Serverless Order Processing E-commerce Microservice

Hello there! Welcome to The Serverless Spotlight!

In this week's Serverless Challenge Edition, I’ll guide you through building a serverless microservice that can process orders for an e-commerce application at scale.

There are quite a few pieces involved, but I’ll make each one simple to understand and reproduce yourself.

So strap in and let’s get started!

Overview

The general overview of this microservice looks like this:

When a customer places an order through a Lambda function URL, the request invokes a dispatcher Lambda function, which sends a message with the order details to an SQS queue.

Asynchronously, another Lambda function is triggered whenever a new message arrives in the queue. It processes the order details, writes them to a DynamoDB table, and stores an order receipt as a file in Amazon S3.

The function then sends a notification to an SNS topic, which triggers an email alert for a new order.

Let’s get started building this out through the AWS console.

Implementation

DynamoDB

The first thing we need to create is a DynamoDB table. Our DynamoDB table will handle order reads and writes at scale.

Log into your AWS account and navigate to DynamoDB.

Create a new table and use the following configurations:

  • Name the table “orders”
  • Set the partition key as “orderID” (we don’t need a sort key)
  • Use provisioned capacity if you know your traffic patterns in advance, or on-demand if you don’t

Creating a new DynamoDB table

Create the table.

That’s all for DynamoDB.
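If you prefer scripting your setup, here’s a minimal sketch of the same table definition using the AWS SDK for JavaScript v3. The console steps above are all you need; this is just an equivalent, and it assumes on-demand billing:

import { DynamoDBClient, CreateTableCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient();

// Same table as the console steps: "orders" with an "orderID" partition key,
// and on-demand (pay-per-request) billing so we don't size capacity upfront
await client.send(
  new CreateTableCommand({
    TableName: "orders",
    AttributeDefinitions: [{ AttributeName: "orderID", AttributeType: "S" }],
    KeySchema: [{ AttributeName: "orderID", KeyType: "HASH" }],
    BillingMode: "PAY_PER_REQUEST",
  })
);

You can run this locally (for example, node create-table.mjs) with your AWS credentials configured.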

SQS

Let’s now create an SQS queue to handle order request queuing.

SQS lets our Lambda code process orders asynchronously, pulling order jobs from the queue at its own pace, which makes the microservice far more scalable and resilient to traffic spikes.

Navigate to the SQS console in AWS.

Create a Queue.

Use the following configurations:

  • Use standard for the queue type
  • Name the queue “OrderProcessingQueue”
  • Use the rest of the default configurations (keep settings like the 4-day message retention and 256 KB maximum message size)

Creating a queue in SQS

Create the Queue.
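As with the table, the same queue can be created with a short script. A minimal sketch (the retention attribute simply restates the console default):

import { SQSClient, CreateQueueCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient();

// Standard queue with the console's default 4-day retention (in seconds)
const { QueueUrl } = await sqs.send(
  new CreateQueueCommand({
    QueueName: "OrderProcessingQueue",
    Attributes: { MessageRetentionPeriod: "345600" },
  })
);

console.log("Queue URL:", QueueUrl); // save this for the Lambda env var later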

SNS

We’ll use SNS to send an email notification every time an order is created.

A Lambda function - which we will create below - will publish to an SNS topic whenever an order is received, triggering the email.

Navigate to the SNS console.

Here, you can create a new SNS Topic.

  • Use the Standard type.
  • Name it NewOrderTopic.

Creating a topic in SNS

You can now create the topic.

Scrolling down a little bit on the page of the new topic you just created, click on the Create subscription button.

On the Create subscription page, in the Details section, you will see the Topic ARN is pre-filled. For the Protocol, select Email from the dropdown so that an email is sent whenever the topic is published to.

For the Endpoint, enter your email address in the input.

Create subscription in SNS

As the popup notification indicates, you must confirm the subscription by email after you create it.

Create the subscription, then wait for the email from AWS Notifications and confirm that you want to subscribe to the topic.

When you click the confirmation link, a browser tab will open with a message that the subscription has been confirmed.
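For completeness, here’s a minimal sketch of the same topic and subscription created with the SDK (replace the placeholder address with your own; the confirmation email still applies):

import { SNSClient, CreateTopicCommand, SubscribeCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient();

// Create the standard topic and capture its ARN
const { TopicArn } = await sns.send(new CreateTopicCommand({ Name: "NewOrderTopic" }));

// Subscribe an email endpoint; AWS sends a confirmation link that
// must be clicked before any notifications are delivered
await sns.send(
  new SubscribeCommand({
    TopicArn,
    Protocol: "email",
    Endpoint: "you@example.com", // placeholder address
  })
);

console.log("Topic ARN:", TopicArn); // save this for the Lambda env var later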

Let's now head over to Lambda to write the server code for processing orders.

Lambda Function For Order Dispatching

In the Lambda console, create a new function with the following configurations:

  • Select Author from Scratch
  • Name the function OrderDispatcher
  • Choose the Node.js 20.x runtime option
  • Under Permissions, create a new role in IAM with access permissions to SQS and CloudWatch, then choose Use an existing role and select it (to do this, follow this quick and easy guide); a sketch of such a policy is shown below

Create new Lambda function

Create the function.
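For reference, the dispatcher’s role only needs to send to the queue and write logs. A minimal sketch of such a policy (the queue ARN is a placeholder; scope it to your own account and region):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:*:*:OrderProcessingQueue"
    },
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "*"
    }
  ]
}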

Scroll down a little to find the Code section. In the text editor, copy and paste the following code:

import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient();

export const handler = async (event) => {
  // The function URL delivers the order payload as a JSON string in event.body
  const { orderID, customerEmail, orderDetails } = JSON.parse(event.body);

  // Queue the order for asynchronous processing
  const sqsCommand = new SendMessageCommand({
    QueueUrl: process.env.ORDER_QUEUE_URL,
    MessageBody: JSON.stringify({ orderID, customerEmail, orderDetails }),
  });

  await sqs.send(sqsCommand);

  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Order received and processing started!" }),
  };
};

The Lambda function code imports the AWS SQS client library.

From the event variable, we extract the orderID, customerEmail, and orderDetails and send them as a message to the SQS queue.

With that done, we need to set the environment variable we referenced in the function code.

Scroll down a little and you’ll find the Configuration tab. In its left side menu, select the Environment variables section.

Click on Edit.

Add the environment variable for the SQS queue URL:

Key: ORDER_QUEUE_URL
Value: the SQS queue URL (found in the SQS console, in the details of the queue we created earlier)
Adding env vars to Lambda function

Save the changes and return to the Lambda function page.

We can now deploy the code by clicking on the Deploy button at the top of the text editor.

One note on the module format: because the code uses ES module syntax (import/export), keep the default index.mjs file name that the Node.js 20.x runtime gives you; no renaming is needed.
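The overview mentioned accepting orders through a Lambda function URL. If you enable one for OrderDispatcher (under Configuration, Function URL), a client could place an order like this; a sketch, with a placeholder URL and auth type NONE assumed:

// Hypothetical function URL; copy the real one from the Lambda console
const FUNCTION_URL = "https://abc123.lambda-url.us-east-1.on.aws/";

const response = await fetch(FUNCTION_URL, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    orderID: "order-101",
    customerEmail: "customer@example.com",
    orderDetails: { item: "Apple Airpods Pro 3", total: "3.99" },
  }),
});

console.log(await response.json()); // { message: "Order received and processing started!" }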

We also need an SQS trigger so that new queue messages automatically invoke order processing.

Note that this trigger belongs on the OrderProcessor function, which we will create in the next section (OrderDispatcher only sends to the queue). Once you have created OrderProcessor below, return here: scroll to the top of its function page and click the Add trigger button.

Add trigger to OrderProcessor Lambda function

On the Add Trigger page, for the source dropdown, find and select SQS from the list.

For the SQS queue, select the OrderProcessingQueue we created earlier.

Add an SQS queue trigger to Lambda function

Click Add to save the trigger.

Lambda Function For Order Processing

We now need to create the function that will handle the main order creation and record the details.

Create a new Lambda function with the following configurations:

  • Choose the Author from scratch option
  • Name the function OrderProcessor
  • Select the Node.js 20.x runtime
  • Under permissions, create a new IAM role with permissions to DynamoDB, SNS, S3 and CloudWatch using this guide; since the SQS trigger polls the queue on the function’s behalf, the role also needs SQS read permissions (a sketch of such a policy follows below)

Create the function.
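A minimal sketch of the processor role’s policy (resource ARNs are placeholders; scope them to your own account, region, and bucket name):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "dynamodb:PutItem",
      "Resource": "arn:aws:dynamodb:*:*:table/orders"
    },
    {
      "Effect": "Allow",
      "Action": "sns:Publish",
      "Resource": "arn:aws:sns:*:*:NewOrderTopic"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    },
    {
      "Effect": "Allow",
      "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"],
      "Resource": "arn:aws:sqs:*:*:OrderProcessingQueue"
    },
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "*"
    }
  ]
}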

Scroll down to the Code section and copy and paste the following code.

import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const dynamoDb = new DynamoDBClient();
const sns = new SNSClient();
const s3 = new S3Client();

export const handler = async (event) => {
  const TopicArn = process.env.NEW_ORDER_TOPIC_ARN;
  const s3Bucket = process.env.S3_BUCKET_NAME;

  // An SQS-triggered invocation can deliver a batch of messages
  for (const record of event.Records) {
    const { orderID, customerEmail, orderDetails } = JSON.parse(record.body);

    // Save order to DynamoDB
    const putItemCommand = new PutItemCommand({
      TableName: "orders",
      Item: {
        orderID: { S: orderID },
        orderDetails: { S: JSON.stringify(orderDetails) },
        customerEmail: { S: customerEmail },
      },
    });
    await dynamoDb.send(putItemCommand);

    // Publish to SNS to trigger the email alert
    const snsCommand = new PublishCommand({
      TopicArn,
      Message: `New order received: ${orderID}`,
      Subject: "New Order Notification",
    });
    await sns.send(snsCommand);

    // Store order receipt in S3
    const receipt = {
      orderID,
      customerEmail,
      orderDetails,
      timestamp: new Date().toISOString(),
    };

    const putObjectCommand = new PutObjectCommand({
      Bucket: s3Bucket,
      Key: `${orderID}.json`,
      Body: JSON.stringify(receipt),
      ContentType: "application/json",
    });
    await s3.send(putObjectCommand);
  }

  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Orders processed successfully!" }),
  };
};

The code above imports the client libraries for DynamoDB, SNS and S3.

The handler then loops over the SQS records in the event and extracts the order variables from each message body.

With these order detail variables we:

  • Write an order item to our DynamoDB table.
  • Publish a notification to the SNS topic, which emails the subscribed address a new order alert.
  • Write a receipt file to S3.

Finally, we return a success message that the order was processed successfully.
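For reference, the event that Lambda hands to this function looks roughly like this (a trimmed sketch; real SQS events carry more attributes per record):

{
  "Records": [
    {
      "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d",
      "body": "{\"orderID\":\"order-101\",\"customerEmail\":\"customer@example.com\",\"orderDetails\":{\"item\":\"Apple Airpods Pro 3\",\"total\":\"3.99\"}}",
      "eventSource": "aws:sqs"
    }
  ]
}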

Like with the first Lambda function, we need to set the environment variables we defined above.

The two environment variables are:

  • NEW_ORDER_TOPIC_ARN (the topic ARN, found in the SNS console)
  • S3_BUCKET_NAME (create a new bucket in S3, give it a unique name and use that same name as the value here)

env vars for Lambda function

Save the changes and return to the Lambda function page.

That's all for our setup!

Testing

To test our microservice, we could build a frontend that calls the function URL, or we can invoke OrderDispatcher directly from the Lambda console.

For the sake of brevity, we’ll test from the console.

Above the text editor, click on Test. A popup will open where you can define your test event JSON in the editor, like so:

Lambda function test

Here’s the test JSON:

{
    "orderID": "order-101", 
    "customerEmail": "[email protected]",
    "orderDetails": {
        "item": "Apple Airpods Pro 3",
        "total": "3.99"
    }
}        

We only need to make a slight modification to our OrderDispatcher Lambda function to test it (you can revert the changes after testing).

Change the destructuring line at the top of the handler like so (console test events arrive as a plain object, not wrapped in a request body):

// change "JSON.parse(event.body)" to just "event" to test from the console
export const handler = async (event) => {
  const { orderID, customerEmail, orderDetails } = event;
  // ...rest of code unchanged

Save the test and then run it.

You should see the success message “Order received and processing started!”.

Shortly after, you should receive an email from AWS with the new order notification.

To make sure the order was saved to DynamoDB and the receipt was saved to S3, you can check your DynamoDB table and S3 bucket individually (or view the CloudWatch logs for the Lambda functions if you’re familiar with that).
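If you’d rather verify programmatically, here’s a quick sketch that reads the test order back from DynamoDB (run locally with your AWS credentials configured):

import { DynamoDBClient, GetItemCommand } from "@aws-sdk/client-dynamodb";

const dynamoDb = new DynamoDBClient();

// Fetch the test order by its partition key
const { Item } = await dynamoDb.send(
  new GetItemCommand({
    TableName: "orders",
    Key: { orderID: { S: "order-101" } },
  })
);

console.log(Item); // undefined means the order never reached the table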

If that didn’t work properly, you can debug by viewing the CloudWatch logs in the Monitor section of either Lambda function.

If that worked properly, congrats! You just created a scalable, powerful and performant serverless order processing microservice!

If you need further help to debug and make this work, or if you have ideas on how to improve this microservice, don’t hesitate to comment below.

Conclusion

In this blog post, we explored how to build a scalable serverless microservice for processing e-commerce orders using AWS services.

We set up essential components including DynamoDB for order storage, SQS for managing order queues, SNS for sending email notifications, and Lambda functions to handle order processing and dispatching.

By following these steps, you can easily implement a robust serverless architecture to manage high-volume order processing efficiently.


If you enjoyed this post, please consider subscribing and sharing the newsletter with your network: https://www.dhirubhai.net/newsletters/7181386750357893121/


My name is Uriel Bitton and I hope you learned something in this edition of The Serverless Spotlight.

You can share the article with your network to help others learn as well.

If you enjoyed this you can subscribe to my newsletter on Medium to get my latest articles on serverless and cloud computing by email.

My blog website is coming soon, so stay tuned for that.

I hope to see you in next week's edition!

Uriel
