Build A Serverless Order Processing E-commerce Microservice
Uriel Bitton
AWS Cloud Consultant | The DynamoDB guy | AWS Certified | I help you supercharge your DynamoDB database
Hello there! Welcome to The Serverless Spotlight!
In this week's Serverless Challenge Edition, I'll guide you through building a serverless microservice that can process orders for an e-commerce application at scale.
There are quite a few pieces involved, but I'll make it simple to understand and reproduce yourself.
So strap in and let’s get started!
Overview
The general overview of this microservice looks like this:
When a customer places an order, the request arrives via a Lambda function URL at a dispatcher Lambda function, which sends a message with the order details to an SQS queue.
Asynchronously, a second Lambda function is triggered whenever a new message arrives in the queue. It processes the message containing the order details, writes them to a DynamoDB table, and saves an order receipt as a file in Amazon S3.
The function then sends a notification to an SNS topic, which triggers an email alert for a new order.
Let’s get started building this out through the AWS console.
Implementation
DynamoDB
The first thing we need to create is a DynamoDB table. Our DynamoDB table will handle writes and reads of orders at a large scale.
Log into your AWS account and navigate to DynamoDB.
Create a new table and use the following configurations:
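If you prefer scripting the setup over clicking through the console, the same configuration can be sketched as the input you'd pass to the DynamoDB CreateTableCommand. The settings below are assumptions that match the function code later in this article: a table named orders, a string partition key orderID, and on-demand capacity; adjust them to your own configuration.

```javascript
// Sketch of the table definition, expressed as a CreateTableCommand input
// (assumed settings: table name "orders", string partition key "orderID",
// on-demand billing).
const createTableParams = {
  TableName: "orders",
  AttributeDefinitions: [
    { AttributeName: "orderID", AttributeType: "S" }, // S = string
  ],
  KeySchema: [
    { AttributeName: "orderID", KeyType: "HASH" }, // partition key
  ],
  BillingMode: "PAY_PER_REQUEST", // on-demand: no capacity planning needed
};

console.log(JSON.stringify(createTableParams, null, 2));
```

With the AWS SDK v3 you would pass this object to `new CreateTableCommand(createTableParams)` and send it with a `DynamoDBClient`; in the console, the same choices map to the table name, partition key, and capacity mode fields.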
Create the table.
That’s all for DynamoDB.
SQS
Let’s now create an SQS queue to handle order requests queuing.
SQS allows our Lambda function code to process orders asynchronously by pulling order jobs from the queue, making our microservice far more scalable.
Navigate to the SQS console in AWS.
Create a Queue.
Use the following configurations:
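As a scripted reference, here's a sketch of the equivalent CreateQueueCommand input. The values are assumptions: the queue name matches the one referenced later in this article, and the visibility timeout follows AWS's guidance of at least six times the Lambda function timeout so in-flight messages aren't redelivered while a function is still processing them.

```javascript
// Sketch of the queue settings as a CreateQueueCommand input (assumed values).
const createQueueParams = {
  QueueName: "OrderProcessingQueue",
  Attributes: {
    VisibilityTimeout: "180",         // seconds; ~6x a 30s Lambda timeout
    MessageRetentionPeriod: "345600", // 4 days, the SQS default
  },
};

console.log(JSON.stringify(createQueueParams, null, 2));
```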
Create the Queue.
SNS
We'll use SNS to send an email notification every time an order is created.
A Lambda function, which we will create below, will be responsible for publishing to this topic when an order is received.
Navigate to the SNS console.
Here, you can create a new SNS Topic.
You can now create the topic.
Scrolling down a little bit on the page of the new topic you just created, click on the Create subscription button.
On the create subscription page, in the details section you will see the Topic ARN is pre-filled. For the Protocol select Email from the dropdown to be able to send emails when the SNS topic is triggered.
For the Endpoint, enter your email address in the input.
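For reference, the topic and subscription settings map to the following SDK inputs. Everything here is a placeholder sketch: the topic name, the ARN, and the email address are assumed values you'd replace with your own.

```javascript
// Sketch of the SNS inputs: one for CreateTopicCommand, one for
// SubscribeCommand. All values are placeholders.
const createTopicParams = {
  Name: "NewOrderTopic", // assumed topic name; any name works
};

const subscribeParams = {
  TopicArn: "arn:aws:sns:us-east-1:123456789012:NewOrderTopic", // placeholder ARN
  Protocol: "email",           // deliver notifications by email
  Endpoint: "you@example.com", // the address that must confirm the subscription
};

console.log(createTopicParams.Name, subscribeParams.Protocol);
```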
As the console notes when you create it, you must confirm the subscription by email.
Create the subscription, then look for an email from AWS Notifications and confirm that you want to subscribe to the topic.
When you confirm the subscription, a browser tab will open with a message that the subscription has been confirmed.
Let's now head over to Lambda to write the server code for processing orders.
Lambda Function For Order Dispatching
In the Lambda console, create a new function with the following configurations:
Create the function.
Scroll down a little to find the Code section. In the text editor, copy and paste the following code:
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient();

export const handler = async (event) => {
  const { orderID, customerEmail, orderDetails } = JSON.parse(event.body);

  // Send the order details to the SQS queue
  const sqsCommand = new SendMessageCommand({
    QueueUrl: process.env.ORDER_QUEUE_URL,
    MessageBody: JSON.stringify({ orderID, customerEmail, orderDetails }),
  });
  await sqs.send(sqsCommand);

  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Order received and processing started!" }),
  };
};
The Lambda function code imports the AWS SQS client library.
From the event variable, we extract the orderID, customerEmail, and orderDetails and send these to an SQS message queue.
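For context, here's a sketch of how a caller (a browser frontend or any HTTP client) would POST an order to the dispatcher's function URL. The URL and the order data below are placeholders, not values from this article's setup.

```javascript
// Sketch of a client calling the dispatcher's function URL.
// FUNCTION_URL is a placeholder -- copy yours from the Lambda console.
const FUNCTION_URL = "https://abc123.lambda-url.us-east-1.on.aws/";

const order = {
  orderID: "order-101",
  customerEmail: "customer@example.com", // placeholder address
  orderDetails: { item: "Apple Airpods Pro 3", total: "3.99" },
};

async function placeOrder() {
  const response = await fetch(FUNCTION_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(order), // becomes event.body in the handler
  });
  return response.json();
}

// The handler runs JSON.parse(event.body), so the round trip preserves the order:
const roundTripped = JSON.parse(JSON.stringify(order));
console.log(roundTripped.orderID);
```

This is also why the test section later swaps `JSON.parse(event.body)` for `event`: the console test feature passes your JSON directly as the event, without the HTTP body wrapper a function URL adds.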
With that done, we need to set the environment variable we referenced in the function code.
Scroll down a little to the Configuration tab, then in the left side menu select Environment variables.
Click on Edit.
Add the environment variable for the SQS queue:
Key: ORDER_QUEUE_URL
Value: the URL of the SQS queue we created earlier (found in the queue's details in the SQS console)
Save the changes and return to the Lambda function page.
We can now deploy the code by clicking on the Deploy button at the top of the text editor.
Since the function uses ES module syntax (import and export), keep the default index.mjs file extension; no rename is needed.
We also need an SQS trigger, but note that it belongs on the order processing function we create in the next section, not on this dispatcher: the dispatcher only sends messages to the queue, while the processing function consumes them.
The SQS trigger will invoke the processing Lambda function with batches of queue messages as events.
Once you've created the processing function below, scroll to the top of its function page, find the Add trigger button, and click on it.
On the Add trigger page, in the source dropdown, find and select SQS from the list.
For the SQS queue, choose the OrderProcessingQueue we created earlier.
Click Add to save the trigger.
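When SQS invokes a Lambda function, the order fields don't arrive at the top level: they're wrapped in an event with a Records array, and each record carries the original message as a JSON string in its body. A trimmed sketch of that shape, using placeholder order data, and the unwrapping loop the processing code uses:

```javascript
// Sketch of the event shape an SQS-triggered Lambda receives (trimmed to
// the fields used here; a real event carries more metadata per record).
const sampleEvent = {
  Records: [
    {
      messageId: "11111111-2222-3333-4444-555555555555", // placeholder
      body: JSON.stringify({
        orderID: "order-101",
        customerEmail: "customer@example.com",
        orderDetails: { item: "Apple Airpods Pro 3", total: "3.99" },
      }),
    },
  ],
};

// This mirrors the loop in the processing function below: each record's
// body is parsed back into the order fields the dispatcher sent.
for (const record of sampleEvent.Records) {
  const { orderID, customerEmail, orderDetails } = JSON.parse(record.body);
  console.log(orderID, customerEmail, orderDetails.item);
}
```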
Lambda Function For Order Processing
We now need to create the function that will handle the main order creation and record the details.
Create a new Lambda function with the following configurations:
Create the function.
Scroll down to the Code section and copy and paste the following code.
import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const dynamoDb = new DynamoDBClient();
const sns = new SNSClient();
const s3 = new S3Client();

export const handler = async (event) => {
  const TopicArn = process.env.NEW_ORDER_TOPIC_ARN;
  const s3Bucket = process.env.S3_BUCKET_NAME;

  for (const record of event.Records) {
    const { orderID, customerEmail, orderDetails } = JSON.parse(record.body);

    // Save order to DynamoDB
    const putItemCommand = new PutItemCommand({
      TableName: "orders",
      Item: {
        orderID: { S: orderID },
        orderDetails: { S: JSON.stringify(orderDetails) },
        customerEmail: { S: customerEmail },
      },
    });
    await dynamoDb.send(putItemCommand);

    // Publish to SNS
    const snsCommand = new PublishCommand({
      TopicArn,
      Message: `New order received: ${orderID}`,
      Subject: "New Order Notification",
    });
    await sns.send(snsCommand);

    // Store order receipt in S3
    const receipt = {
      orderID,
      customerEmail,
      orderDetails,
      timestamp: new Date().toISOString(),
    };
    const putObjectCommand = new PutObjectCommand({
      Bucket: s3Bucket,
      Key: `${orderID}.json`,
      Body: JSON.stringify(receipt),
      ContentType: "application/json",
    });
    await s3.send(putObjectCommand);
  }

  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Orders processed successfully!" }),
  };
};
The code above imports the client libraries for DynamoDB, SNS and S3.
The function receives the SQS event and extracts the order variables from each message.
With these order detail variables, we write the order to the DynamoDB table, publish a new-order notification to the SNS topic, and store an order receipt as a JSON file in S3.
Finally, we return a success message that the order was processed successfully.
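A side note on the DynamoDB write: the low-level client expects every attribute wrapped in a type descriptor ({ S: ... } for strings), which is why the handler builds the Item the way it does. Here's a small pure sketch of that marshalling, using placeholder order data; the marshall helper from @aws-sdk/util-dynamodb can do this for you automatically.

```javascript
// Pure sketch of the marshalling the handler performs: a plain order
// object becomes a DynamoDB low-level Item, where every attribute is
// wrapped in a type descriptor ("S" = string).
function toDynamoItem({ orderID, customerEmail, orderDetails }) {
  return {
    orderID: { S: orderID },
    orderDetails: { S: JSON.stringify(orderDetails) }, // nested object stored as a JSON string
    customerEmail: { S: customerEmail },
  };
}

const item = toDynamoItem({
  orderID: "order-101",
  customerEmail: "customer@example.com", // placeholder address
  orderDetails: { item: "Apple Airpods Pro 3", total: "3.99" },
});
console.log(item.orderID.S); // "order-101"
```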
Like with the first Lambda function, we need to set the environment variables we defined above.
The two environment variables should be:
Key: NEW_ORDER_TOPIC_ARN, Value: the ARN of the SNS topic we created earlier (found in the SNS console)
Key: S3_BUCKET_NAME, Value: the name of the S3 bucket to store order receipts in (create a bucket in the S3 console if you don't already have one)
Save the changes and return to the Lambda function page.
That's all for our setup!
Testing
To test our microservice, we can either build a frontend or use the Lambda console's built-in test feature.
For the sake of brevity, we'll test directly in the Lambda console.
Above the text editor, click on Test. A popup will open and you can define your test JSON in the JSON editor below, like so:
Here’s the test JSON:
{
"orderID": "order-101",
"customerEmail": "[email protected]",
"orderDetails": {
"item": "Apple Airpods Pro 3",
"total": "3.99"
}
}
We only need to make a slight modification to our OrderDispatcher Lambda function to test it (you can revert the change after testing).
Change the first line of the handler body like so:
// change "JSON.parse(event.body)" to just "event" to test
const { orderID, customerEmail, orderDetails } = event;
// ...rest of code unchanged
Save the test and then run it.
You should see the success message "Order received and processing started!".
You should have received an email from AWS with the order details.
To make sure the order was saved to DynamoDB and the receipt was saved to S3, check your DynamoDB table and S3 bucket individually (or view the CloudWatch logs for the Lambda functions if you're familiar with that).
If something didn't work, you can debug the Lambda functions by viewing the CloudWatch logs in the Monitor section of either function.
If everything worked, congrats! You just created a scalable, powerful, and performant serverless order processing microservice!
If you need further help to debug and make this work, or if you have ideas on how to improve this microservice, don’t hesitate to comment below.
Conclusion
In this blog post, we explored how to build a scalable serverless microservice for processing e-commerce orders using AWS services.
We set up essential components including DynamoDB for order storage, SQS for managing order queues, SNS for sending email notifications, and Lambda functions to handle order processing and dispatching.
By following these steps, you can easily implement a robust serverless architecture to manage high-volume order processing efficiently.
If you enjoyed this post, please consider subscribing and sharing the newsletter with your network: https://www.dhirubhai.net/newsletters/7181386750357893121/
My name is Uriel Bitton and I hope you learned something in this edition of The Serverless Spotlight.
You can share the article with your network to help others learn as well.
If you enjoyed this, you can subscribe to my newsletter on Medium to get my latest articles on serverless and cloud computing by email.
My blog website is coming soon, so stay tuned for that.
I hope to see you in next week's edition!
Uriel