Node.js Dev series (4 of 6)

In this post, we will focus on creating a microservice that interacts with a NoSQL database. Since we've already built a service that connects to MySQL, developing a NoSQL-based service is a logical next step.

If you want to check out the full GitHub repository for this series, you can find it here: GitHub Repo.


Optimizing Our Docker Setup: Reusing the Dockerfile for Multiple Services

Currently, we have a Dockerfile in the products service folder. If we keep this approach, each Node.js service will have its own Dockerfile. However, since all our Node.js services share the same build process, we can reuse a single Dockerfile instead of maintaining separate copies for each service.

1. Move the existing Dockerfile

  • Navigate to the services folder.
  • Move the Dockerfile from services/products to services/.

2. Modify the Dockerfile to Accept a Build Argument

  • The existing COPY package.json . line assumes the Dockerfile is inside each service folder, which is no longer the case.
  • Instead, we will pass the service name as a build argument to dynamically specify the correct directory:

ARG SERVICE
COPY ./$SERVICE/package.json .        

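Putting it together, the shared Dockerfile might look roughly like this sketch (the base image and install steps are assumptions, since the rest of the original Dockerfile isn't shown in this post):

```dockerfile
# services/Dockerfile — shared by all Node.js services (assumed layout)
FROM node:18

# Which service to build; passed in from docker-compose.yml
ARG SERVICE

WORKDIR /app

# Install dependencies for the selected service
COPY ./$SERVICE/package.json .
RUN npm install

# In development the source code is mounted as a volume,
# so only the dependencies need to be baked into the image
```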

3. Update docker-compose.yml to Pass the Service Argument

  • Modify the products service definition to specify which service should be built:

products:
  build:
    context: ./services
    args:
      - SERVICE=products        


That configuration should allow us to reuse the Dockerfile later. Let's get coding!


Setting Up the MongoDB Microservice

MongoDB is a popular, open-source NoSQL database that is designed for handling large volumes of unstructured or semi-structured data. Unlike traditional relational databases that store data in tables and rows, MongoDB uses a flexible, document-oriented approach. It stores data in BSON (Binary JSON) format, which is similar to JSON, allowing for rich, complex data structures like arrays and nested objects.

Since MongoDB is really popular, we’re going to create a new microservice that interacts with it. We already have a working structure in the products service, so I'm going to show you how easily we can copy and modify most of that code to have the new service running in no time.

Inside the services directory, create a new folder called services/suppliers.

  • Copy all files from services/products into services/suppliers.

Modify package.json

Open services/suppliers/package.json and update the following:


Key Changes:

  • Updated name → "suppliers-service"
  • Replaced MySQL with MongoDB → "mongodb": "^4.1.4"
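Based on those changes, the relevant parts of package.json might look something like this (every entry other than the name and the mongodb dependency is illustrative, since the original file isn't reproduced here):

```json
{
  "name": "suppliers-service",
  "version": "1.0.0",
  "main": "app.js",
  "dependencies": {
    "express": "^4.18.2",
    "mongodb": "^4.1.4"
  }
}
```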

Now that we've set up the package.json, let's update the service to use suppliers instead of products.

Update app.js

In services/suppliers/app.js:


  • Replace every instance of "product" with "supplier".
  • Update the route imports:

const supplierRoutes = require("./routes/suppliers");
app.use("/suppliers", supplierRoutes);        


  • Rename the Routes File: Go to the routes subfolder and rename products.js → suppliers.js

This ensures requests to /suppliers are routed correctly.
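For reference, a minimal sketch of the resulting app.js (the Express setup is assumed to mirror the products service, which isn't reproduced here):

```javascript
// services/suppliers/app.js — sketch of the updated entry point
const express = require("express");
const supplierRoutes = require("./routes/suppliers");

const app = express();
app.use(express.json());

// Mount the supplier routes under /suppliers
app.use("/suppliers", supplierRoutes);

const port = process.env.PORT || 5003;
app.listen(port, () => console.log(`Suppliers service listening on port ${port}`));
```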

Update config.js

Modify services/suppliers/config.js to store MongoDB connection details:


We remove the MySQL connection details and specify the MongoDB connection URL and database name instead.

Modify models/connection.js

Remove all existing code and replace it with:
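A sketch along those lines (the config key names are assumptions; the driver calls are standard MongoClient usage):

```javascript
// models/connection.js — connects with MongoDB's native driver.
// The config field names (mongoUrl, dbName) are assumptions.
const { MongoClient } = require("mongodb");
const { mongoUrl, dbName } = require("../config");

let db;

const client = new MongoClient(mongoUrl);
client
  .connect()
  .then(() => {
    db = client.db(dbName);
    console.log("Connected to MongoDB");
  })
  .catch((err) => console.error("MongoDB connection failed:", err));

// Other modules call getDb() to obtain the shared database handle
const getDb = () => db;

module.exports = { getDb };
```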

  • Uses MongoDB's native driver to establish a connection.
  • Exports getDb() to access the database in other parts of the service.

Updating routes/suppliers.js to Handle MongoDB CRUD Operations

Now that we’ve set up MongoDB, the final step is to update routes/suppliers.js to work with it instead of MySQL.

1. Remove MySQL-Specific Code

  • Delete the require statement related to MySQL.
  • Remove the mapValidEntriesToString function (not needed for MongoDB).

2. Add MongoDB-Specific Imports

Replace the removed code with:

const { ObjectID } = require("mongodb");
const { getDb } = require("../models/connection");        

Why these imports?

  • ObjectID: MongoDB identifies documents with 12-byte ObjectIDs (usually represented as 24-character hex strings) instead of numeric auto-increment IDs.
  • getDb(): Retrieves the database connection from connection.js.

3. Implement the "Create Supplier" Endpoint

Since MongoDB uses collections and documents instead of tables and records, the logic is different.

Replace the existing creation logic with:
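Assuming routes/suppliers.js exposes an Express router (as in the products service), the creation logic could look roughly like this sketch; the field names come from the fixtures used later in this post:

```javascript
// POST /suppliers — create a new supplier document
router.post("/", (req, res) => {
  const { name, contact } = req.body;
  const db = getDb();
  db.collection("suppliers").insertOne({ name, contact }, (err, result) => {
    if (err) {
      res.status(500).json({ error: "Failed to create supplier" });
      return;
    }
    // result contains acknowledged and the auto-generated insertedId
    res.status(201).json(result);
  });
});
```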


Key Differences from MySQL:

  • Instead of INSERT INTO suppliers (...) VALUES (...), we use insertOne().
  • The collection name "suppliers" refers to a collection, not a table. As mentioned before, MongoDB is document-based: instead of tables with foreign-key relationships, it uses collections of documents. A collection is roughly analogous to a table in a relational database, but instead of rows it contains documents. Relationships are often handled through embedding (storing related data within a single document) rather than the joins used in relational databases.
  • The response includes the auto-generated _id of the new document, which the driver returns as insertedId.

4. Implementing the Read Endpoints for Suppliers

Now that we’ve added supplier creation, let’s implement the read endpoints to retrieve supplier data from MongoDB.

4.1. Fetch All Suppliers

This endpoint retrieves all suppliers from the MongoDB collection.
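A sketch of the handler (its shape is assumed to match the creation endpoint; error messages are illustrative):

```javascript
// GET /suppliers — list all suppliers
router.get("/", (req, res) => {
  const db = getDb();
  db.collection("suppliers")
    .find()
    .toArray((err, suppliers) => {
      if (err) {
        res.status(500).json({ error: "Failed to fetch suppliers" });
        return;
      }
      res.json(suppliers);
    });
});
```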

How it works:

  • Calls .find() to get all documents in the "suppliers" collection.
  • Converts the result to an array using .toArray().
  • If successful, responds with the list of suppliers.
  • If an error occurs, returns a 500 Internal Server Error.

4.2. Fetch a Single Supplier by ID

This endpoint retrieves a specific supplier using the unique _id.
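A sketch of the handler (error messages are illustrative):

```javascript
// GET /suppliers/:id — fetch one supplier by its ObjectID
router.get("/:id", (req, res) => {
  const { id } = req.params;
  const db = getDb();
  db.collection("suppliers").findOne({ _id: new ObjectID(id) }, (err, supplier) => {
    if (err) {
      res.status(500).json({ error: "Failed to fetch supplier" });
      return;
    }
    if (!supplier) {
      res.status(404).json({ error: "Supplier not found" });
      return;
    }
    res.json(supplier);
  });
});
```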

How it works:

  • Extracts the id from request parameters.
  • Uses findOne({ _id: new ObjectID(id) }) to locate the document.
  • If the supplier doesn’t exist, returns 404 Not Found.
  • If there's an error, returns 500 Internal Server Error.

MongoDB uses ObjectIDs, so we must convert id into an ObjectID before querying. If the id is not a valid ObjectID string, the ObjectID constructor will throw an error.

5. Updating a Supplier by ID

This endpoint allows updating a supplier's details using their unique _id.
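A sketch of the update handler following those steps; it relies on the removeNullUndefined helper explained below:

```javascript
// PUT /suppliers/:id — update only the fields that were provided
router.put("/:id", (req, res) => {
  const { id } = req.params;
  const { name, contact } = req.body;
  const db = getDb();
  db.collection("suppliers").updateOne(
    { _id: new ObjectID(id) },
    { $set: removeNullUndefined({ name, contact }) },
    (err, result) => {
      if (err) {
        res.status(500).json({ error: "Failed to update supplier" });
        return;
      }
      // result includes matchedCount and modifiedCount
      res.json(result);
    }
  );
});
```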

How it Works:

  1. Extracts id from the request parameters.
  2. Extracts the fields to update (name and contact) from the request body.
  3. Calls updateOne() on the "suppliers" collection, using: { _id: new ObjectID(id) } → finds the supplier by its _id, and { $set: removeNullUndefined({ name, contact }) } → updates only the fields that were actually provided.
  4. If an error occurs, returns 500 Internal Server Error.
  5. If successful, responds with the MongoDB update result object.



Why removeNullUndefined()?

This helper function ensures that only provided fields are updated. Without it, empty values might overwrite existing data as null or undefined. If a field is missing from the request, it won’t be modified. So let’s add removeNullUndefined helper function after the require declarations:

const removeNullUndefined = (obj) =>
  Object.fromEntries(Object.entries(obj).filter(([_, v]) => v != null));        

How It Works:

1. Object.entries(obj) Converts the object into an array of key-value pairs.

Example:

{ name: "Alice", contact: null }

turns into

[["name", "Alice"], ["contact", null]]

2. .filter(([_, v]) => v != null)

Keeps only the key-value pairs where the value is not null or undefined.

v != null is a shorthand for v !== null && v !== undefined.

Example result:

[["name", "Alice"]]

3. Object.fromEntries(...)

Converts the filtered array back into an object.

Example output:

{ name: "Alice" }

This shows how the helper function removes all null or undefined properties from the object.
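To sanity-check the helper, you can run it on a small object:

```javascript
const removeNullUndefined = (obj) =>
  Object.fromEntries(Object.entries(obj).filter(([_, v]) => v != null));

console.log(removeNullUndefined({ name: "Alice", contact: null }));
// → { name: 'Alice' }
```

Note that falsy-but-valid values such as 0 or "" are kept, because v != null only filters out null and undefined.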


6. Deleting a Supplier by ID


Step-by-Step Explanation

1. Extracting the ID from the request parameters

const { id } = req.params;        

2. Converting the ID to MongoDB's ObjectID format

{ _id: new ObjectID(id) }        

3. Executing the deleteOne query

db.collection("suppliers").deleteOne(
  { _id: new ObjectID(id) },
  (err, result) => { ... }
);

4. Deletes only one document that matches the given _id.

5. Handling Errors

if (err) {
  res.status(500).json({ error: "Failed to delete supplier" });
  return;
}

6. If an error occurs (e.g., database issues), it returns HTTP 500 with an error message.

7. Returning a 204 No Content Response

res.status(204).send();        

204 No Content means the request was successful, but there's no response body. This is the correct status code for a successful DELETE operation.
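Putting the steps above together, the whole handler might look like this sketch:

```javascript
// DELETE /suppliers/:id — remove one supplier by its ObjectID
router.delete("/:id", (req, res) => {
  const { id } = req.params;
  const db = getDb();
  db.collection("suppliers").deleteOne({ _id: new ObjectID(id) }, (err, result) => {
    if (err) {
      res.status(500).json({ error: "Failed to delete supplier" });
      return;
    }
    res.status(204).send();
  });
});
```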



You might be thinking: This is a hoax! You just copy-pasted a bunch of code! And you're absolutely right—I did. A significant part of a developer’s work involves copy-pasting. Whether you write every line from scratch or copy and modify existing code, what truly matters is delivering solutions quickly and efficiently while maintaining quality.

There’s no shame in copy-pasting—as long as you know what you’re doing. The key is having the judgment to use the right code in the right context. If you blindly copy and paste random pieces of code without understanding them, you’re setting yourself up for major problems. It’s crucial to grasp the logic behind what you’re implementing, the nuances between different platforms, and the philosophies behind various technologies.

For instance, databases serve different purposes:

  • SQL relational databases excel at handling complex queries with joins and are highly flexible, making them great for both transactions and analytics.
  • NoSQL databases like MongoDB are optimized for fast read/write operations and scalability, making them ideal for high-traffic applications.
  • Data warehouses are designed specifically for analytical processing, optimized for running massive queries over vast amounts of data.

When you understand why technologies work the way they do, you can leverage any piece of code effectively. That’s why reading documentation, understanding best practices, and learning from resources like this tutorial are invaluable.

With that in mind—get comfortable with copy-pasting, because we’ll be doing plenty of it to avoid reinventing the wheel and focus on solving real problems efficiently.


Set Up MongoDB Fixtures

Before testing this, we need to set up database fixtures for MongoDB, just like we did for MySQL.

  1. Navigate to the setup folder
  2. Create a new subfolder named mongodb
  3. Inside mongodb, create a file called mongo-init.js
  4. Add the following code to initialize the database:

db.suppliers.insertMany([
  { name: "first 1", contact: "[email protected]" },
  { name: "second 2", contact: "627276543" },
]);

What This Script Does

  • Inserts two initial supplier records into a new suppliers collection.
  • Automatically creates the collection if it doesn’t already exist.
  • Ensures that the database has some preloaded data for testing.



Adding Services to docker-compose.yml

Now we’re at the final step before testing: adding MongoDB and our Node.js suppliers service to the docker-compose.yml file.

Add the MongoDB Service

What This Does:

  • Creates a MongoDB container using the official mongo image.
  • Defines environment variables to set up authentication (admin:password).
  • Exposes port 27017, making it accessible locally.
  • Mounts our initialization script (mongo-init.js) to automatically preload data on startup.
  • Connects to the local-network so other services (like our Node.js backend) can communicate with it.
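Based on that description, the mongodb entry in docker-compose.yml could look like this sketch (the image tag, database name, and file paths are assumptions):

```yaml
mongodb:
  image: mongo
  environment:
    - MONGO_INITDB_ROOT_USERNAME=admin
    - MONGO_INITDB_ROOT_PASSWORD=password
    # Database the init script runs against (assumed name)
    - MONGO_INITDB_DATABASE=suppliers
  ports:
    - "27017:27017"
  volumes:
    # The official mongo image runs scripts from this folder on first startup
    - ./setup/mongodb/mongo-init.js:/docker-entrypoint-initdb.d/mongo-init.js:ro
  networks:
    - local-network
```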

Add the suppliers Node.js Service

Now we need to add the suppliers service to docker-compose.yml, ensuring it's properly configured to interact with MongoDB and supports hot-reloading for development.

New Settings:

  • build.context: ./services → Builds from the shared Dockerfile in the services folder.
  • args: - SERVICE=suppliers → Informs the build process that we are working on the suppliers service. We did this to reuse the Dockerfile in all services.

Settings that we can also find in products:

  • expose: - "5003" → Exposes port 5003 internally within the Docker network.
  • ports: - "5003:5003" → Maps port 5003 from the container to 5003 on the host machine for external access.
  • environment:
    ◦ NODE_ENV=development → Runs the service in development mode.
    ◦ PORT=5003 → Ensures the app listens on port 5003.
  • depends_on: - mongodb → Ensures that MongoDB starts before this service.
  • volumes:
    ◦ "./services/suppliers:/app:cached" → Mounts the local project directory into the container for live code updates.
    ◦ "/app/node_modules" → Ensures dependencies are properly managed inside the container.
  • command: "supervisor -w ./ ./app.js" → Uses Node Supervisor to automatically restart the app when files change, enabling hot-reloading in development.
  • networks: - local-network → Ensures communication with other containers like mongodb.
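Assembled from the settings above, the suppliers entry in docker-compose.yml looks like:

```yaml
suppliers:
  build:
    context: ./services
    args:
      - SERVICE=suppliers
  expose:
    - "5003"
  ports:
    - "5003:5003"
  environment:
    - NODE_ENV=development
    - PORT=5003
  depends_on:
    - mongodb
  volumes:
    - "./services/suppliers:/app:cached"
    - "/app/node_modules"
  command: "supervisor -w ./ ./app.js"
  networks:
    - local-network
```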

Now, we can build and start everything.


Testing the Suppliers API with Postman

Now that our MongoDB-based suppliers service is running, it's time to test the API using Postman. Basically, you can duplicate the products endpoints, making sure to modify the necessary parameters.

  1. Duplicate the existing products API requests.
  2. Rename all occurrences of "products" to "suppliers".
  3. Update the port from 5001 → 5003.
  4. Modify request bodies to reflect MongoDB's format: Use ObjectIDs instead of numeric IDs. Ensure fields match supplier data (e.g., name, contact).

Example API Requests

1. Create a Supplier

{
  "name": "Tech Supplies Co.",
  "contact": "[email protected]"
}

  • Expected Response (201 Created):

{
  "acknowledged": true,
  "insertedId": "65a9f12e8b5e1a6e4d2c9b1f"
}

2. Get All Suppliers

[
  {
    "_id": "65a9f12e8b5e1a6e4d2c9b1f",
    "name": "Tech Supplies Co.",
    "contact": "[email protected]"
  }
]

3. Get a Supplier by ID

{
  "_id": "65a9f12e8b5e1a6e4d2c9b1f",
  "name": "Tech Supplies Co.",
  "contact": "[email protected]"
}

4. Update a Supplier

{
  "name": "Updated Tech Supplies Co.",
  "contact": "[email protected]"
}

  • Expected Response (200 OK):

{
  "matchedCount": 1,
  "modifiedCount": 1,
  "acknowledged": true
}

5. Delete a Supplier

Expected Response: 204 No Content

What’s Next?

Next time, we’ll explore Amazon DynamoDB, a NoSQL database which also introduces us to AWS services. It’s a bit different from MongoDB, so I thought it would be worthwhile to explore in more detail. If you have any comments or suggestions please let me know.
