Node.js Dev series (3 of 6)
Last time, we set up some endpoints with mock responses. Now, we’re going to take the next step by actually connecting to a MySQL database and returning real data.
Before that, we are going to make some key improvements to our Docker environment to enhance our development workflow. Currently, any code change in our Node.js app requires a manual restart of the container, which is frustrating and slows down development.
To address this problem, we’ll introduce Node Supervisor, which automatically restarts the Node.js app when changes are detected and offers additional debugging support.
If you want to check out the full GitHub repository for this series, you can find it here: GitHub Repo.
Let’s get started!
Dockerfile and docker-compose.yml configuration
Open the services/products/Dockerfile, and let’s make some improvements to enhance our development workflow.
Install Development Tool – After the RUN npm install line, add:
RUN npm install supervisor -g
This ensures supervisor is installed globally, independent of what's in package.json.
Final Dockerfile:
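The complete file might look roughly like this. The base image tag, working directory, and exposed port are assumptions carried over from a typical Node.js setup; adjust them to match your project:

```dockerfile
# Sketch of services/products/Dockerfile; base image and port are assumptions.
FROM node:18

WORKDIR /usr/src/app

# Copy dependency manifests first to take advantage of Docker layer caching
COPY package*.json ./
RUN npm install

# Install supervisor globally for automatic restarts during development
RUN npm install supervisor -g

# Copy the application source
COPY . .

EXPOSE 3000

CMD ["node", "app.js"]
```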
Next, open your docker-compose.yml file and locate the products service. Add the following line under the service definition (ensuring proper indentation):
command: "supervisor -w ./ ./app.js"
Why This Change?
- supervisor automatically restarts the Node.js server whenever it detects file changes, so you won’t have to rebuild or restart the container manually.
- -w ./ tells supervisor to watch the entire project directory for changes.
- ./app.js specifies the entry point of your application.
Now, rebuild your Docker image and restart the container to apply these changes! From now on, you can modify the Node.js code without needing to restart your container.
Connecting to MySQL
Now that we have our Docker environment ready for instant updates, we can replace our mock responses with a real implementation that connects to MySQL.
Step 1: Create the Configuration File
Navigate to services/products and create a new file called config.js.
This file exports a JavaScript object containing the configuration details for connecting to MySQL.
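A minimal sketch of what config.js might contain. The host name, credentials, and port here are assumptions based on a typical docker-compose setup; adjust them to match your own environment:

```javascript
// Sketch of services/products/config.js. The host name, credentials,
// and port are assumptions; adjust them to match your environment.
const config = {
  host: "mysql",       // matches the MySQL service name in docker-compose.yml
  port: 3306,
  user: "root",
  password: "password",
  database: "products",
};

module.exports = config;
```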
Step 2: Set Up the Database Connection
Next, inside services/products, create a new folder called models. Inside this folder, create a file called connection.js.
In this file:
- We import the MySQL library and use the credentials from config.js to establish a database connection.
- If an error occurs during the connection, the app will throw an error.
- If the connection is successful, it logs "Connected to MySQL database".
- Finally, we export the connection so it can be used elsewhere in the application.
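Putting those points together, connection.js might look like this (a sketch, assuming the mysql npm package is listed in package.json):

```javascript
// Sketch of services/products/models/connection.js,
// assuming the "mysql" npm package is installed.
const mysql = require("mysql");
const config = require("../config");

// Establish a connection using the credentials from config.js
const connection = mysql.createConnection({
  host: config.host,
  user: config.user,
  password: config.password,
  database: config.database,
});

connection.connect((err) => {
  if (err) throw err; // fail fast if the database is unreachable
  console.log("Connected to MySQL database");
});

module.exports = connection;
```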
Updating Endpoints to Use MySQL
Now, navigate to services/products/routes/products.js and add the following require statement at the top:
const connection = require("../models/connection");
This will import the MySQL connection we created earlier, allowing us to execute queries directly from our routes.
Step 1: Updating the Create Product Endpoint
Now, let's update our product creation endpoint to insert new products into the database:
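A sketch of what that handler might look like, assuming the Express router already set up in routes/products.js from the previous post:

```javascript
// Sketch of the create-product route; assumes `router` and `connection`
// are already defined in routes/products.js.
router.post("/", (req, res) => {
  const { name, price, description } = req.body;
  console.log(req.body); // log the request body for debugging

  const query =
    "INSERT INTO products (name, price, description) VALUES (?, ?, ?)";

  // Parameterized query: the ? placeholders prevent SQL injection
  connection.query(query, [name, price, description], (err, results) => {
    if (err) throw err;
    res.status(201).json({ id: results.insertId, name, price, description });
  });
});
```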
Explanation:
- Extracts name, price, and description from the request body.
- Logs the request body for debugging.
- Uses a parameterized query (? placeholders) to prevent SQL injection and insert the new product into the database.
- Returns a 201 Created status with the newly inserted product’s details.
Note: If you have some coding experience, you might notice that something feels out of place—and you'd be right. Typically, updating the database directly within a routes file isn't considered best practice. A well-structured application would include additional layers, such as a models layer, to handle database interactions and data transformations. However, since the focus of this series is on learning Node.js fundamentals, we are prioritizing code simplicity over strictly adhering to software design patterns.
Let's continue!
Step 2: Updating the Get All Products Endpoint
Now, let's update the endpoint for retrieving all products:
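A sketch of the handler, again assuming the router and connection from earlier in the file:

```javascript
// Sketch of the get-all-products route.
router.get("/", (req, res) => {
  console.log("getting all products");

  connection.query("SELECT * FROM products", (err, results) => {
    if (err) throw err;
    res.json(results); // respond with the list of products as JSON
  });
});
```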
Explanation:
- Logs "getting all products" to indicate the request is being processed.
- Executes a simple SELECT * FROM products query to fetch all records.
- If there's an error, it throws it; otherwise, it responds with the list of products in JSON format.
Testing Your Read and Creation Endpoints
Now that we have our MySQL connection set up, how about we go ahead and test the read (GET /products) and creation (POST /products) endpoints?
You’ll likely encounter some errors. Why? Even though our MySQL container is running successfully, the products database and table do not exist there yet.
Currently, when we start our containers, we are spinning up two services:
- A Node.js server
- A MySQL database container
However, the MySQL container starts empty—it’s just a database server without any preloaded data or schema. To fix this, we need to ensure our database and tables are created on startup by adding an initialization script (also known as database fixtures). We are going to fix this!
Step 1: Creating the Database Schema File
To ensure that the database and the products table are created every time MySQL starts, follow these steps:
1. Go to the root of your project and create the following folder structure: setup/mysql.
2. Inside setup/mysql, create a file named 000-schema.sql.
3. Open 000-schema.sql and insert the following code:
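A schema file matching the description below might look like this; the column types and sizes are assumptions:

```sql
-- Sketch of setup/mysql/000-schema.sql; column types are assumptions.
CREATE DATABASE IF NOT EXISTS products;
USE products;

DROP TABLE IF EXISTS products;

CREATE TABLE products (
  id INT AUTO_INCREMENT PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  price DECIMAL(10, 2) NOT NULL,
  description TEXT
);
```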
What This Does:
- Creates the products database if it doesn’t already exist.
- Ensures that we’re using the correct database (USE products) before running further queries.
- Deletes any existing products table to prevent conflicts.
- Creates a new products table with fields for id, name, price, and description.
At this point, we could also insert some sample data, but for now, we’ll keep it simple and just create the database and table.
Step 2: Telling Docker to Execute the SQL Script on Startup
Now, we need to configure Docker Compose to automatically run this SQL script when the MySQL container starts.
1. Open docker-compose.yml.
2. Locate the MySQL service section.
3. Add the following lines (make sure they’re correctly indented):
volumes:
- "./setup/mysql:/docker-entrypoint-initdb.d:delegated"
What This Does:
- It creates a volume that links our setup/mysql folder from our project to the MySQL container’s /docker-entrypoint-initdb.d directory.
- MySQL automatically executes any .sql files found in /docker-entrypoint-initdb.d during startup.
- This ensures that every time the database container is created, our products database and table will be set up automatically.
Final docker-compose.yml:
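For reference, the compose file might now look roughly like this. The image versions, ports, credentials, and the source-code volume mount are assumptions; keep your own values where they differ:

```yaml
# Sketch of docker-compose.yml; image versions, ports, and credentials
# are assumptions.
version: "3.8"

services:
  products:
    build: ./services/products
    command: "supervisor -w ./ ./app.js"
    ports:
      - "3000:3000"
    volumes:
      # Mount the source so supervisor sees file changes from the host
      - "./services/products:/usr/src/app:delegated"
    depends_on:
      - mysql

  mysql:
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: password
    volumes:
      # MySQL runs any .sql files in this directory on first startup
      - "./setup/mysql:/docker-entrypoint-initdb.d:delegated"
```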
Now, restart your containers with:
docker-compose down
docker-compose up --build -d
This rebuilds your containers with the new configuration, reinitializes the MySQL database so the schema is set up, and runs everything in detached mode. Once that’s done, try testing your endpoints again.
It should work fine now, so let’s continue refining our API by implementing the rest of the endpoints.
Get Product by ID (GET /products/:id)
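A sketch of the handler, assuming the router and connection from earlier in the file:

```javascript
// Sketch of the get-product-by-id route.
router.get("/:id", (req, res) => {
  const { id } = req.params;

  connection.query(
    "SELECT * FROM products WHERE id = ?",
    [id],
    (err, results) => {
      if (err) throw err;
      if (results.length === 0) {
        // No matching product: respond with 404 Not Found
        return res.status(404).json({ error: "Product not found" });
      }
      res.json(results[0]);
    }
  );
});
```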
How It Works:
- Extracts the id from the request parameters (req.params).
- Queries the database for a product with the given ID.
- If a matching product is found, it returns the product as JSON.
- If no product is found, it returns a 404 Not Found error.
You can test this endpoint using Postman.
Updating a Product (PUT /products/:id)
Updating a product is slightly more complex because we only want to update the fields that are actually provided in the request.
The Basic Approach (and its Problems)
A simple way to update a product would be:
const { name, price, description } = req.body;
const query = `UPDATE products SET name = '${name}', price=${price}, description='${description}' WHERE id = ${id}`;
- Requires all fields to be present: if any field is missing from the request, its column would be overwritten with undefined or NULL. This could cause unintended data loss or errors if the column doesn't allow NULL values.
- SQL injection risks: this example concatenates raw values directly into the query string, which is exactly what parameterized queries are meant to prevent.
A Smarter Approach: Processing Only Provided Fields
To solve these issues, we create a function that dynamically maps only the fields that exist in req.body:
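One possible implementation, matching the description below. The function name comes from the article; the details are assumptions, and it returns the SET portion of an UPDATE statement built only from the fields present in the body:

```javascript
// Hypothetical implementation of the helper described in the article.
// Builds the "SET" portion of an UPDATE statement from only the fields
// that are actually present in the request body.
function mapValidEntriesToString(body) {
  return Object.entries(body)
    .filter(([, value]) => value !== undefined && value !== null)
    .map(([key, value]) =>
      // Enclose string values in quotes; leave numbers as-is
      typeof value === "string" ? `${key} = '${value}'` : `${key} = ${value}`
    )
    .join(", ");
}
```

For example, mapValidEntriesToString({ name: "Lamp", price: 9.99 }) produces name = 'Lamp', price = 9.99, so a request that only sends price leaves the other columns untouched.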
How This Works:
- It loops through req.body, extracting only the fields that are present.
- It formats them into key = value pairs, ensuring that string values are enclosed in quotes.
- It returns a valid SQL update statement that only includes the provided fields.
Now, let’s implement this in our update product endpoint:
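A sketch of the update handler, assuming the mapValidEntriesToString helper shown earlier along with the router and connection:

```javascript
// Sketch of the update-product route, assuming the
// mapValidEntriesToString helper described above.
router.put("/:id", (req, res) => {
  const { id } = req.params;
  console.log(`updating product ${id}`); // log the operation for debugging

  // Build a dynamic query that only touches the fields in the request
  const query = `UPDATE products SET ${mapValidEntriesToString(
    req.body
  )} WHERE id = ${id}`;

  connection.query(query, (err) => {
    if (err) throw err;
    res.json({ id, ...req.body }); // echo back the updated details
  });
});
```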
What This Does:
- Extracts id from the request parameters.
- Logs the update operation for debugging.
- Uses mapValidEntriesToString(req.body) to generate a dynamic update query that only updates the fields present in the request.
- Executes the query using connection.query().
- Returns the updated product details in the response.
Note: Is this really the best approach? Of course not. There are better ways to handle this, such as using an ORM like Sequelize to abstract raw SQL handling, or implementing input validation and sanitization with libraries like Joi. These methods enhance security, maintainability, and scalability. However, for the sake of simplicity and learning, we're focusing on a more straightforward approach here. Feel free to research these tools and improve your own code.
Updating the Delete Endpoint
The final step in our CRUD operations is implementing the Delete Product by ID endpoint.
Code Implementation:
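A sketch of the handler, assuming the router and connection from earlier in the file:

```javascript
// Sketch of the delete-product route.
router.delete("/:id", (req, res) => {
  const { id } = req.params;

  connection.query("DELETE FROM products WHERE id = ?", [id], (err) => {
    if (err) throw err;
    res.sendStatus(204); // No Content: standard response for deletions
  });
});
```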
How This Works:
- Extracts the id from the request parameters (req.params).
- Executes a DELETE query to remove the product from the database.
- If the query runs successfully, it returns HTTP 204 (No Content), which is the standard response for successful deletions.
Testing the Delete Endpoint:
You can test this endpoint using Postman or curl:
- Send a DELETE request to /products/:id, replacing :id with an actual product ID.
- If successful, the response will be empty (204 No Content).
- Try retrieving the same product again—it should now return a 404 Not Found.
First Microservice Completed!
With this, we now have a fully functional Node.js microservice that:
- Supports CRUD operations (Create, Read, Update, Delete).
- Uses MySQL for data storage.
- Runs inside a Dockerized environment with live updates.
What’s Next?
In the next post, we’ll explore NoSQL databases, specifically MongoDB, and implement a similar microservice that interacts with a document-based data model. Stay tuned!