Mastering Bulk Data Handling in DynamoDB with BatchWriteItem
Uriel Bitton
AWS Cloud Engineer | The DynamoDB guy | AWS Certified & AWS Community Builder | I help you build scalable DynamoDB databases
Welcome to the 17th edition of Excelling With DynamoDB!
In this week’s issue we'll learn how to write data in bulk using DynamoDB's API to achieve more efficient and optimized writes to our database.
What Is BatchWriteItem?
An essential but lesser-known feature for managing data in DynamoDB is the BatchWriteItem operation.
BatchWriteItem allows you to write or delete multiple items at scale with a single request to DynamoDB.
This is particularly useful when working with large datasets, where batching writes and deletes cuts down on network round trips and latency.
How BatchWriteItem Works
With BatchWriteItem, you can also write or delete items across multiple tables with a single API call.
A single call to BatchWriteItem can process up to 16 MB of data and include up to 25 put or delete operations. [1]
The batch operation is atomic for each individual item, but not across the entire batch.
This means that if one write or delete fails, the remaining operations are still processed; any requests DynamoDB could not process (for example, because of throttling) are returned in the response's UnprocessedItems map so you can retry them.
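To make the multi-table point concrete, here is a minimal sketch of the RequestItems map a single request can carry (the table names and item attributes here are just placeholders for illustration):

// Illustrative RequestItems shape (DocumentClient style): one request can
// target several tables and mix put and delete operations.
const requestItems = {
  products: [
    { PutRequest: { Item: { productID: "100", name: "Example product" } } },
  ],
  orders: [
    { DeleteRequest: { Key: { orderID: "9001" } } },
  ],
};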
Benefits Of BatchWriteItem
There are several key benefits of using BatchWriteItem:
- Lower latency: up to 25 writes or deletes travel in a single request instead of 25 separate calls.
- Better throughput: bulk loads and cleanups finish faster when operations are batched.
- Lower request overhead: fewer network round trips for the same amount of data.
- Simpler code: one call replaces a loop of individual PutItem or DeleteItem requests.
Working With BatchWriteItem
Let's take a look at a typical example of using BatchWriteItem to write multiple items to our database.
I'll be using the AWS SDK for JavaScript v3 in Node.js to write the code.
Assuming you have created a table in DynamoDB with a partition key of "productID", we can write the following code to support batch writing and deleting of items in this table.
Let's start by importing and initializing the DynamoDB client library.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { BatchWriteCommand, DynamoDBDocumentClient } from "@aws-sdk/lib-dynamodb";
// Create DynamoDB client
const client = new DynamoDBClient({ region: "us-east-1" });
const ddbDocClient = DynamoDBDocumentClient.from(client);
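If you haven't created the table yet, a minimal sketch like the one below can set it up (the "products" table name and on-demand billing are assumptions for this example; skip it if your table already exists).

// Optional: create the example table with a "productID" partition key.
import { CreateTableCommand } from "@aws-sdk/client-dynamodb";

const createProductsTable = async () => {
  await client.send(
    new CreateTableCommand({
      TableName: "products",
      AttributeDefinitions: [{ AttributeName: "productID", AttributeType: "S" }],
      KeySchema: [{ AttributeName: "productID", KeyType: "HASH" }],
      BillingMode: "PAY_PER_REQUEST",
    })
  );
};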
Let's now define the items to write. We create two Put requests: one item with a productID of "101" and one with a productID of "102".
Optionally, we can also define items to delete in the same batch by specifying the productID of each item we wish to delete.
const putItemsBatch = [
  { PutRequest: { Item: { productID: "101", name: "Apple Airpods Pro 3" } } },
  { PutRequest: { Item: { productID: "102", name: "Apple Macbook M3 Pro" } } },
];

// delete requests can also be done here optionally:
const deleteItems = [
  { DeleteRequest: { Key: { productID: "301" } } },
  { DeleteRequest: { Key: { productID: "302" } } },
];
We then define the params for our command. Inside RequestItems, the table name is the key and the putItemsBatch array is the value.
const tableName = "products";
const params = {
  RequestItems: {
    [tableName]: putItemsBatch,
  },
};
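Note that these params only include the put requests. If you also want the deleteItems from above processed in the same call, you can combine both arrays under the same table name, as long as the total stays within the 25-operation limit. A minimal sketch:

// Optional: mix puts and deletes for the same table in one batch (25 operations max).
const combinedParams = {
  RequestItems: {
    [tableName]: [...putItemsBatch, ...deleteItems],
  },
};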
Finally, we can send the BatchWriteCommand with the params we defined to DynamoDB.
const batchWriteItems = async () => {
  try {
    const result = await ddbDocClient.send(new BatchWriteCommand(params));
    console.log("Batch write successful:", result);
  } catch (err) {
    console.error("Batch write failed:", err);
  }
};
Run the code and you should see the items being written to your DynamoDB table.
I've written this code in Lambda to easily test it against my DynamoDB table. I recommend you do the same.
You can follow this guide on creating and running AWS Lambda functions as well as running tests.
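One more thing worth handling in real code: as mentioned earlier, DynamoDB can return some requests unprocessed in the response's UnprocessedItems map (for example, when the table is throttled). Here is a minimal retry sketch; the helper name and backoff values are illustrative, not part of the SDK.

// Resend anything DynamoDB returns as unprocessed, backing off between attempts.
const batchWriteWithRetry = async (requestItems, maxAttempts = 3) => {
  let remaining = requestItems;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { UnprocessedItems } = await ddbDocClient.send(
      new BatchWriteCommand({ RequestItems: remaining })
    );
    remaining = UnprocessedItems ?? {};
    if (Object.keys(remaining).length === 0) return;
    // Simple exponential backoff before retrying the leftover requests
    await new Promise((resolve) => setTimeout(resolve, 100 * 2 ** attempt));
  }
  console.warn("Some items were not processed after retries:", remaining);
};

You would call it with the same RequestItems map we built above, for example: await batchWriteWithRetry(params.RequestItems);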
Conclusion
DynamoDB’s BatchWriteItem API provides a powerful method to optimize your database interactions, particularly when dealing with large datasets.
By batching multiple write or delete operations into a single request, BatchWriteItem significantly reduces latency, optimizes throughput, lowers costs, and simplifies your code.
As demonstrated, setting up and using BatchWriteItem in a Node.js environment is straightforward and can be easily tested using AWS Lambda.
If you enjoyed this post, please consider subscribing and sharing the newsletter with your network: https://www.dhirubhai.net/newsletters/7181386750357893121/
My name is Uriel Bitton and I hope you learned something in this edition of Excelling With DynamoDB.
You can share the article with your network to help others learn as well.
If you enjoyed this, you can subscribe to my newsletter on Medium to get my latest articles on DynamoDB and serverless by email.
I hope to see you in next week's edition!
References