Node.js Dev series (6 of 6)
In this post, the final one of the series, we’re going to dive into building a file management service using AWS S3.
If you want to check out the full GitHub repository for this series, you can find it here: GitHub Repo
As I mentioned last time, there are always more topics to explore, such as inter-service communication, serverless technologies, or authentication and authorization. I could even cover TypeScript, GraphQL, or various frameworks. But writing these posts takes time, so I’d only continue if I have free time and it’s truly useful for others.
What is AWS S3?
Amazon Simple Storage Service (S3) is a scalable object storage service that allows you to store and retrieve data from anywhere on the web. It’s widely used for file storage and backups, hosting static website assets, serving user-uploaded content, and building data lakes.
For the final post, I wanted to do something a little different. This will introduce you to a new type of cloud service while also showcasing a powerful tool for AWS local development: LocalStack.
Let's start working on the code.
Setting Up the Documents Service
First, go to the services directory and create a new folder called documents. As before, we can copy the contents of a previously developed service for convenience.
However, unlike before, we won’t need a models folder, because the connection configuration for AWS S3 is simple enough to be handled in a single line of code. That said, I did create a models folder in the other services, even though nothing in this project really qualifies as a "model".
As I mentioned in a previous post, in a larger project, handling data access and conversion within route files would be a poor design choice. Ideally, we would follow a clean architecture with properly structured service layers. For the sake of simplicity, however, I’m keeping the familiar models and routes folder structure, even though it isn’t strictly the right design pattern.
Updating package.json
Now, let’s update the package.json file:
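As a rough sketch of what the file might look like (the exact versions and service name are my assumptions, not necessarily what’s in the repo), the key change is adding the aws-sdk and multer dependencies alongside express:

```json
{
  "name": "documents",
  "version": "1.0.0",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "aws-sdk": "^2.1550.0",
    "express": "^4.18.2",
    "multer": "^1.4.5-lts.1"
  }
}
```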
Here’s what’s new:
Setting Up config.js for AWS S3 Communication
Update config.js with the following:
Here's what each property represents:
What is LocalStack?
LocalStack is a fully functional local AWS cloud stack that allows developers to spin up and interact with AWS services on their local machine, mimicking AWS cloud services like S3, DynamoDB, Lambda, and more. It's an open-source tool that simplifies the development and testing of cloud applications without the need for an active internet connection or the expense of using live AWS services.
Why Use LocalStack?
Updating app.js
Now, let’s configure our main Express app. If you copied it from a previous service (like clients), simply update all mentions of client to document.
Implementing Document Management Endpoints
Now that we have set up our documents service, it's time to configure the routes. Navigate to the routes subfolder and rename the existing file to documents.js to reflect the new service name. Open the file, and let’s start with the code that initializes Express, AWS S3, and Multer for handling file uploads:
Explanation of New Components
This setup prepares us to define upload, retrieve, and delete operations for document management.
Now that we've set up AWS S3 and configured Multer for file uploads, we can define the API endpoints to handle uploading, retrieving, and deleting documents.
1. Upload a Document
Explanation:
2. Retrieve a List of All Documents
Explanation:
3. Download a Specific Document
Explanation:
4. Delete a Document
Explanation:
With these endpoints in place, we now have a fully functional file management microservice that leverages AWS S3 for cloud storage. You might be wondering why there’s no update endpoint this time. The reason is simple: the correct approach to updating a file is to delete the existing one and upload a new version in its place. This ensures consistency and avoids potential issues with partial updates or file corruption.
Setting up some Fixtures
With the code ready, it's time to set up some fixtures to help us with testing. We'll create sample data and an initialization script to ensure our local S3 storage is correctly configured before running our service.
Step 1: Create the Necessary Folders
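The exact layout is up to you; here’s one assumption-laden sketch (the folder names are mine, not necessarily the repo’s):

```shell
# Inside the documents service, a fixtures folder holds the sample
# data and the LocalStack initialization script
mkdir -p services/documents/fixtures
```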
Step 2: Add a Sample Document
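Any small file will do; for example (the filename is an assumption, and the mkdir is repeated so the snippet works on its own):

```shell
# Create the folder if needed, then drop in a small text file
# that the init script will pre-load into the bucket
mkdir -p services/documents/fixtures
echo "This is a sample document for testing." > services/documents/fixtures/sample.txt
```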
Step 3: Create the Initialization Script
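LocalStack runs any `*.sh` script it finds in `/etc/localstack/init/ready.d` once its services are ready, and ships with the `awslocal` CLI wrapper for talking to itself. A sketch of the script (bucket name and paths are assumptions), created here via a heredoc so the step is reproducible:

```shell
# Write the LocalStack init script that creates the bucket and
# seeds it with the sample file (paths/names are assumptions)
mkdir -p services/documents/fixtures
cat > services/documents/fixtures/init-s3.sh <<'EOF'
#!/bin/bash
# Runs inside the LocalStack container once S3 is ready
awslocal s3 mb s3://documents
awslocal s3 cp /fixtures/sample.txt s3://documents/sample.txt
EOF
chmod +x services/documents/fixtures/init-s3.sh
```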
By setting up these fixtures, we ensure that:
✅ The S3 bucket is available for our service.
✅ There's already a sample document in the bucket for testing.
✅ Our API can interact with S3 storage right away.
Note: While I believe I've configured this correctly in the repository, if you're encountering issues with the bash script on a local Windows environment, it might be due to your file using Windows line endings. This can cause various problems in Docker, as the container runs in a Unix environment. To avoid this, ensure that your file uses Unix line endings. In Visual Studio Code, you can easily check this in the bottom-right corner of the window. It should display "LF" (Line Feed) rather than "CRLF" (Carriage Return + Line Feed). If it says "CRLF," you can change it by clicking on that label and selecting "LF."
Setting Up Docker Compose
Before testing our service, we need to configure Docker Compose to orchestrate our environment. This includes:
LocalStack mocks AWS services so we can develop and test locally without needing a real AWS account. Below is the docker-compose.yml configuration:
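A hedged sketch of the LocalStack service definition (the mount paths and file names are my assumptions):

```yaml
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"   # the single edge port every AWS API call goes through
    environment:
      - SERVICES=s3   # only start S3 to keep startup fast
    volumes:
      # init hook: LocalStack runs this script once S3 is ready
      - ./services/documents/fixtures/init-s3.sh:/etc/localstack/init/ready.d/init-s3.sh
      # sample file the init script copies into the bucket
      - ./services/documents/fixtures/sample.txt:/fixtures/sample.txt
```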
Explanation
Next, we define the document service, which runs our API:
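Something along these lines (the host port and environment variable names are assumptions):

```yaml
  documents:
    build: ./services/documents
    ports:
      - "3003:3000"   # host port is an assumption; pick whatever is free
    environment:
      - S3_ENDPOINT=http://localstack:4566
      - S3_BUCKET=documents
    depends_on:
      - localstack    # make sure LocalStack is up before the API starts
```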
Explanation
Well, there's nothing particularly new here. After building three Node.js services, the configuration in this one remains quite similar.
So, after adding this to docker-compose.yml, we can start everything with:
```shell
docker-compose up --build
```
Postman
Now, we are ready to test the document service in Postman by exercising the four endpoints we created: uploading, listing, downloading, and deleting documents.
A note on uploading a document using Postman: Since this is a new action, I thought it would be helpful to add an extra note on configuring Postman.
First, you'll need to set a custom header. In the Headers section, add a key called Content-Type with the value multipart/form-data.
Then, in the Body section, select form-data. Add a key called file, set the type to file, and choose a file from your computer for the value.
With this setup, we have a fully functional document storage microservice running locally using AWS S3 via LocalStack.
Final notes
And with that, we’ve reached the end of this series! Writing these posts has been a great experience, but I have to admit that it took a lot of time and effort. Since many concepts kept repeating, I found myself speeding through the last few posts, but I still tried to cover everything as thoroughly as possible. Here and there I added a bit of extra context to spice things up, and I think it worked out reasonably well.
At the end of the day, my goal was to provide practical, real-world insights into building microservices with Node.js, Docker, and AWS—and I hope you found it valuable.
Now, it’s time for me to get back to coding! I can’t stay in writing mode forever.
Again, if you want to check out the full GitHub repository for this series, you can find it here: GitHub Repo
If you have any questions, comments, or feedback, feel free to reach out! I can’t promise an instant response, but I’ll do my best to help.
Thanks for following along, and happy coding!