Backend for frontends (BFF) in Node

#nodejs #apirest #javascript

Check these points when you create a new backend for frontends (BFF), middleend or API with Node.js to avoid problems in the future

Listen to events in the current process

It’s important to listen to the events of the current Node.js process to react when errors occur. The process object gives information about, and control over, the running process

unhandledRejection: an event emitted when a Promise is rejected and there is no handler attached to it

const process = require("node:process")
const express = require("express");
const app = express();

const PORT = process.env.PORT || 3000;

function simulateUnhandledRejection() {
  if (Math.random() < 0.5) {
    Promise.reject("generic error");
  }
  return Promise.resolve();
}

app.get("/service", (_, res) => {
  simulateUnhandledRejection();
  return res.status(200).json({ msg: "hello" });
});

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});

process.on("unhandledRejection", (reason, promise) => {
  console.error('The current process has "unhandledRejection"', {
    reason,
    promise,
  });
  process.exit(1); // optional -> kill the process
});        

In this case the route does not handle the rejection, so the event fires and logs the error; ship these logs to a tool like Datadog or New Relic to see the unhandled errors and fix them

uncaughtException: an event emitted when a JavaScript exception isn’t handled by any try / catch or handler. By default Node prints the stack trace and kills the process with process.exit(1); for more visibility it’s convenient to log the error first and then kill the process

const process = require("node:process")
const express = require("express");
const app = express();

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});

process.on("uncaughtException", (error, origin) => {
  console.error('The current process has "uncaughtException"', {
    error,
    origin,
  });
  process.exit(1);
});

variable_not_defined;        

In this case the app exits when it evaluates variable_not_defined

Signal Events: signals received by the Node.js process, for example from Kubernetes when it wants to kill a pod that isn’t responding correctly

const process = require("node:process")
const express = require("express");
const app = express();

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});

/**
 * Example to simulate
 *
 * 1. Start the server
 * 2. Use the command `lsof -i :3000` to identify the PID of the process
 * 3. Use the command `kill PID` to send SIGTERM to the App
 */
process.on("SIGTERM", () => {
  console.error('App has received "SIGTERM" event, closing the server');
  process.exit(1);
});

/**
 * Example to simulate
 *
 * 1. Start the server
 * 2. Press Ctrl + C in the terminal to send SIGINT
 */
process.on("SIGINT", () => {
  console.error('App received "SIGINT" event, closing the server');
  process.exit(1);
});

Listener Express: when starting an Express server it is good practice to listen for the errors emitted when the listen method is called

const express = require("express")
const app = express();

const PORT = process.env.PORT || 3000;

app
  .listen(PORT, () => {
    console.info(`App running in port ${PORT}`);
  })
  .on("error", (error) => {
    if (error.code === "EADDRINUSE") {
      console.error(
        `App can't listen on port ${PORT}, it is already in use by another process`
      );
    } else {
      console.error("App can't start", error);
    }
  });        

Validate the incoming request

It’s important to validate the incoming request to avoid errors in our services, such as unsanitized data or an invalid payload. For example, libraries like zod allow you to validate the schema of the payload and retrieve a list of invalid fields

const express = require("express")
const { z } = require("zod");
const app = express();

const PORT = process.env.PORT || 3000;

// These middlewares handle JSON and url-encoded payloads in the body
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

// Create the schema to validate
const ValidateRequest = z.object({
  location: z.object({
    latitude: z.number({
      required_error: "latitude is required",
      invalid_type_error: "latitude must be a number",
    }),
    longitude: z.number({
      required_error: "longitude is required",
      invalid_type_error: "longitude must be a number",
    }),
  }),
  id: z.number({
    required_error: "id is required",
    invalid_type_error: "id must be a number",
  }),
});

// Validate the current request before handling the data
function validateRequest(req, res, next) {
  try {
    const { location, id } = req.body;
    ValidateRequest.parse({ location, id });
    return next();
  } catch (err) {
    const { issues } = err;
    return res.status(400).json({ msg: "Invalid request", issues });
  }
}

app.post("/my-service", validateRequest, (_, res) => {
  return res.json({ msg: "payload is correct!!" });
});

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});        

In this example the request is validated with the zod library before the handler runs: if the payload is correct it passes through, if not the client receives a bad request (400) with the invalid fields

Middleware for 404

The application must have a basic middleware to handle non-existent routes and send the client a 404; by default Express sends an HTML page with the message “Cannot {METHOD} /{endpoint}”

const express = require("express")
const app = express();

const PORT = process.env.PORT || 3000;

app.get("/my-service", (_, res) => res.json({ msg: "hello" }));

// Add the middleware with a message for the user
// (registered after all existing routes, it runs when no route matches)
app.use((_, res) => {
  return res.status(404).json({
    message: "API route not found, check the swagger docs for more info",
  });
});

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});

The middleware must go after all existing routes, since it works as a fallback when no route matches

Middleware for errors

The application must have an error middleware to handle exceptions across the whole App; by default Express sends an HTML page with the stack trace, which risks leaking sensitive information to the client

const express = require("express")
const app = express();

const PORT = process.env.PORT || 3000;

/**
 * Create the errors to send to the client
 */
class AppError extends Error {
  constructor(tag, message, status = 500, cause) {
    super(message);
    this.tag = tag;
    this.status = status;
    this.cause = cause;
  }
}

// Simulate a generic error, variable not defined
app.get("/my-service", (_, res) => {
  my_variable;
  res.json({ msg: "hello" });
});

// Simulate an application error
app.get("/my-service-2", (_, __, next) => {
  return next(new AppError('MyService2', 'Our engineers are working to fix', 503, "The data is not ready"));
});

// Example error middleware: log the error and send a friendly message to the client
app.use((error, _, res, next) => {
  if (error) {
    const { tag = "TAG", status = 500, message = "", cause = "" } = error;
    console.error(
      `[${tag}] ${status} - ${message} - [Cause] ${JSON.stringify(cause)}`
    );
    return res
      .status(status)
      .json({ message: message || "Internal Server Error" });
  }
  return next();
});

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});

Middleware for auth

If the application has routes that need information about the user, or protected routes, the simple way is to create a middleware to handle authentication

  • globally: Validate all the routes

app.use('/v1', authMiddleware, api);        

  • by route: Validate only the route

const express = require("express");
const app = express();

const PORT = process.env.PORT || 3000;

// Example auth middleware simulating whether a token (e.g. a JWT) is valid
function authMiddleware(req, res, next) {
  if (Math.random() < 0.5) {
    req.user = {}; // add extra metadata to the request
    return next();
  }
  return res.status(401).json({ msg: "invalid token" });
}

// Attach the middleware auth for this endpoint only
app.get("/my-service", authMiddleware, (_, res) => res.json({ msg: "hello" }));

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});        

Middleware for response headers

By default the application sends some response headers to the client; a middleware can add common headers globally, and each route can add additional headers like ‘Cache-Control’

const express = require("express");
const app = express();

const PORT = process.env.PORT || 3000;

// Example of a basic response-headers middleware
app.use((_, res, next) => {
  res.set("Access-Control-Allow-Origin", "*");
  res.set(
    "Access-Control-Allow-Methods",
    "GET, POST, PUT, DELETE, PATCH, OPTIONS"
  );
  res.set(
    "Access-Control-Allow-Headers",
    "X-Requested-With, content-type, Authorization"
  );
  res.set("Referrer-Policy", "strict-origin-when-cross-origin");
  next();
});

app.get("/my-service", (_, res) => res.json({ msg: "hello" }));

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});        

Documentation with OpenAPI

The app must have documentation so consumers know what services it exposes, including the request and response of each service

If the application is a REST API, the OpenAPI convention is followed

Example documentation with swagger-ui-express

const express = require("express");
const { serveFiles, setup } = require("swagger-ui-express");
const swaggerDocument = require("./openapi.json");

const app = express();

const PORT = process.env.PORT || 3000;

const configOpenAPI = {
  swaggerOptions: {
    url: "/api-docs/swagger.json",
  },
};

// Attach the Swagger documentation
app.get("/api-docs/swagger.json", (_, res) => res.json(swaggerDocument));
app.use(
  "/api-docs",
  serveFiles(null, configOpenAPI),
  setup(null, configOpenAPI)
);

app.get("/my-service", (_, res) => res.json({ msg: "hello" }));

app.listen(PORT, () => {
  console.info(`App running in port ${PORT}`);
});        

JSON config for OpenAPI (openapi.json)

{
  "openapi": "3.0.3",
  "info": {
    "title": "My Api",
    "description": "Api documentation for the bff",
    "termsOfService": "https://swagger.io/terms",
    "version": "0.0.1"
  },
  "servers": [
    {
      "url": "https://localhost:3000"
    }
  ],
  "tags": [
    {
      "description": "bff operations",
      "name": "bff hello"
    }
  ],
  "paths": {
    "/my-service": {
      "get": {
        "summary": "get info from my-service",
        "description": "get the hello message",
        "responses": {
          "200": {
            "description": "Successful"
          }
        },
        "tags": ["bff hello"]
      }
    }
  }
}        
[Image: Swagger API specs]

Scaling

When scaling a Node.js application there are two common approaches

  • horizontal scaling
  • vertical scaling

The most recommended and simplest option for Node.js is to scale horizontally, for example with Kubernetes: as traffic increases, new instances (pods) are created.

Node can also scale on a single machine using the “cluster” module, which creates multiple Node processes to distribute the load. One process is the primary and the rest are workers; when any worker dies, the primary forks a new one to replace it

const express = require("express");
const cluster = require("cluster");
const os = require("os");

const PORT = process.env.PORT || 3001;

// If the current is master, create the fork for all the CPUs
if (cluster.isMaster) {
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on("exit", (worker) => {
    console.error(`Worker ${worker.process.pid} has exited, creating another...`);
    cluster.fork();
  });
} else {
  const app = express();

  // Endpoint for health check API
  app.get("/health", (_, res) => {
    res.status(200).json({ message: "ok", process: process.pid });
  });

  app.listen(PORT, () => {
    console.info(`App listen in ${PORT} - process ${process.pid}`);
  });
}        

The pm2 tool performs the same action automatically: https://pm2.keymetrics.io/

Perform load tests

It’s important to run performance tests to check whether the app has problems handling many concurrent users, memory leaks or handler errors

There are many tools for this step; the following example uses k6

import http from "k6/http";
import { check } from "k6";
import { Rate, Trend } from "k6/metrics";

// Metrics
const myWaitingTime = new Trend("waitingTime");
const myFailRate = new Rate("errorRate");

// URL to run the test, change for the production env
const URL = "http://localhost:3001";

// Configuration for k6
export const options = {
  // Discard the response for better performance
  discardResponseBodies: true,
  // Define the scenarios for the test
  scenarios: {
    getMyService: {
      executor: "ramping-arrival-rate",
      exec: "getMyService",
      preAllocatedVUs: 10,
      maxVUs: 20,
      timeUnit: "1m",
      startRate: 0,
      stages: [
        { duration: "1m", target: 50 },
        { duration: "1m", target: 100 },
        { duration: "1m", target: 300 },
        { duration: "1m", target: 100 },
        { duration: "1m", target: 50 },
        { duration: "1m", target: 0 },
      ],
    },
  },
};

// Scenario to get my service
export function getMyService() {
  const response = http.get(`${URL}/my-service`);
  metrics(response);
}

// Metrics to track in the dashboard after the test
export function metrics(response) {
  myWaitingTime.add(response.timings.waiting);
  myFailRate.add(response.status !== 200);

  check(response, { "status was 200": (r) => r.status == 200 });
}

Example with k6: a basic script that simulates a ramping arrival rate, increasing the number of users, for example an ecommerce during a Black Friday event

  1. Call the service with 50 users for 1 minute
  2. Call the service with 100 users for 1 minute
  3. Call the service with 300 users for 1 minute
  4. Call the service with 100 users for 1 minute
  5. Call the service with 50 users for 1 minute
  6. Call the service with 0 users for 1 minute

Profiling

Know the basic tools to be able to profile when a production problem occurs, whether memory or event loop related

Use the flags provided by Node to find simple errors when starting the app

  • --trace-warnings: print the stack trace showing where each warning comes from
  • --trace-deprecation: print the stack trace for uses of deprecated features
  • --trace-sync-io: print a warning with the stack trace whenever synchronous I/O is detected, so it can be fixed
  • --unhandled-rejections=strict: apply “strict” mode, raising unhandled promise rejections as uncaught exceptions
  • --trace_gc: trace the garbage collector in the current process

NODE_ENV=production node --trace-warnings server.js

Use a tool like https://clinicjs.org/doctor/ to analyze any problem

If you have Datadog, you can also enable its continuous profiler for Node.js

Know how to debug

Use the VS Code debugger if console.log is not enough

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "API debugger",
      "type": "node-terminal",
      "request": "launch",
      "command": "npm run dev"
    }
  ]
}        

HttpClient

Create a robust httpclient wrapper that correctly handles

  • Retries: retry the request after a certain delay
  • Timeouts: avoid socket hang ups; for example, the axios default timeout is 0 (no timeout)
  • Circuit Breaker Pattern: avoid calling a microservice while it is failing, with a fallback service
  • KeepAlive Connection: avoid creating a new connection for every future request

Example of an httpClient wrapper using these libraries (axios, axios-retry, agentkeepalive, opossum)

import axios, { AxiosInstance, CreateAxiosDefaults } from "axios";
import axiosRetry, { exponentialDelay } from "axios-retry";
import Agent from "agentkeepalive";
import CircuitBreaker, { Options } from "opossum";

const TIMEOUT_DEFAULT = 2500;
const MAX_SOCKETS = 100;
const MAX_FREE_SOCKET = 10;
const TIME_OUT_KEEP_ALIVE = 60000;
const FREE_SOCKET_TIMEOUT = 30000;
const RESET_TIMEOUT_CIRCUIT_BREAKER = 15000;
const ERROR_THRESHOLD_PERCENT_CIRCUIT_BREAKER = 25;
const RETRIES_DEFAULT = 0;

const logger = console as any; // replace with your logger

/**
 * Create the configuration for agent http
 * @returns {Agent}
 */
function createHttpAgent() {
  return new Agent({
    maxSockets: MAX_SOCKETS,
    maxFreeSockets: MAX_FREE_SOCKET,
    timeout: TIME_OUT_KEEP_ALIVE,
    freeSocketTimeout: FREE_SOCKET_TIMEOUT,
  });
}

/**
 * Create the configuration for agent https
 * @returns {Agent}
 */
function createHttpsAgent() {
  return new Agent.HttpsAgent({
    maxSockets: MAX_SOCKETS,
    maxFreeSockets: MAX_FREE_SOCKET,
    timeout: TIME_OUT_KEEP_ALIVE,
    freeSocketTimeout: FREE_SOCKET_TIMEOUT,
  });
}

/**
 * Create an instance with axios
 * @param {CreateAxiosDefaults} config
 * @returns {AxiosInstance}
 */
function createInstanceHttpClient(
  config?: CreateAxiosDefaults & { retry: number }
): AxiosInstance {
  const { retry, ...axiosConfig } = config || {};
  const httpClient = axios.create({ timeout: TIMEOUT_DEFAULT, ...axiosConfig });

  httpClient.defaults.httpAgent = createHttpAgent();
  httpClient.defaults.httpsAgent = createHttpsAgent();

  axiosRetry(httpClient, {
    retries: retry ?? RETRIES_DEFAULT,
    retryDelay: exponentialDelay,
  });

  return httpClient;
}

/**
 * Create an instance for the circuit breaker
 * @param {Promise} handler
 * @param {Options} options
 * @returns CircuitBreaker
 */
function createCircuitBreaker(handler, options: Options = {}) {
  const circuitBreaker = new CircuitBreaker(handler, {
    resetTimeout: RESET_TIMEOUT_CIRCUIT_BREAKER,
    errorThresholdPercentage: ERROR_THRESHOLD_PERCENT_CIRCUIT_BREAKER,
    errorFilter: (err) => err.statusCode === 401 || err.statusCode === 403,
    ...options,
  });

  circuitBreaker.on("halfOpen", () => {
    logger.warn(
      `The Circuit Breaker is "HALF_OPEN" for "${circuitBreaker.name}"`
    );
  });

  circuitBreaker.on("open", () => {
    logger.warn(`The Circuit Breaker is "OPEN" for "${circuitBreaker.name}"`);
  });

  circuitBreaker.on("close", () => {
    logger.info(`The Circuit Breaker is "CLOSE" for "${circuitBreaker.name}"`);
  });

  return circuitBreaker;
}

export { createInstanceHttpClient, createCircuitBreaker };        

API Versioning

Use versioning in the main routes of the endpoints so future updates that introduce breaking changes don’t break existing clients

Versioning can be applied in various ways

  • In the URL by query param api/my-service?v=1
  • In the PATH api/{version}/my-service
  • Via header “x-api-version:1”

To handle multiple versions in the code, split them by folder (v1, v2) for easy access, and apply the adapter pattern in mappers and DTOs to keep backwards compatibility between versions
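As an illustration, here is a minimal sketch of the adapter idea; the `location` shape and the `lat`/`lng` field names are hypothetical, the point is that the v2 handler keeps the new shape while an adapter maps it back to the legacy v1 shape so old clients keep working:

```javascript
// Hypothetical shapes:
//   v2 response -> { id, location: { latitude, longitude } }
//   v1 response -> { id, lat, lng } (legacy clients expect this)

// Adapter: map the current v2 data back to the legacy v1 shape
function toV1(itemV2) {
  return {
    id: itemV2.id,
    lat: itemV2.location.latitude,
    lng: itemV2.location.longitude,
  };
}

// Wiring sketch with Express routers split by folder:
// const v1Router = require("./v1/router"); // responds with toV1(data)
// const v2Router = require("./v2/router"); // responds with data as-is
// app.use("/api/v1", v1Router);
// app.use("/api/v2", v2Router);
```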

Cache

Keep in mind cache policies for your services, either using an in-memory cache or a database

Apply the ‘Cache-Control’ header to responses so the client avoids fetching the same result within a short period of time
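A minimal in-memory TTL cache sketch; note the assumption of a single instance, since with several pods each process holds its own copy and a shared store like Redis is usually the better fit:

```javascript
// Minimal in-memory TTL cache (sketch). Entries expire after ttlMs and are
// evicted lazily on read.
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map();
  }

  set(key, value) {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazy eviction on read
      return undefined;
    }
    return entry.value;
  }
}

// Usage sketch in a route handler: check the cache before calling the
// downstream service, and set the response header as well, e.g.
// res.set("Cache-Control", "public, max-age=60");
```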

Testing /?Mocks

Test the e2e flow of the services, validating incoming requests, middlewares and errors; a powerful tool for Node.js is supertest

Mock any middleend or external service that isn’t finished yet, using tools like https://mockoon.com

Communication between services using pub / sub

Use, for example, SNS from Amazon to subscribe to external events from other apps and react in the current App, like changes in prices, items or loyalty

Parallel Request

Try to use parallel requests to improve performance whenever the calls don’t depend on each other

To limit the number of concurrent requests, use a library like p-map
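As a dependency-free sketch of what p-map does, a mapper with a concurrency limit can look like this (results keep the input order):

```javascript
// Run `mapper` over `items` with at most `concurrency` promises in flight,
// similar in spirit to p-map
async function mapWithLimit(items, mapper, concurrency) {
  const results = new Array(items.length);
  let next = 0;

  // Each worker pulls the next unclaimed index until the input is exhausted
  async function worker() {
    while (next < items.length) {
      const index = next++;
      results[index] = await mapper(items[index], index);
    }
  }

  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    worker
  );
  await Promise.all(workers);
  return results;
}

// Usage sketch (getUser/getOrders/httpClient are hypothetical):
// const [user, orders] = await Promise.all([getUser(id), getOrders(id)]);
// const pages = await mapWithLimit(urls, (url) => httpClient.get(url), 5);
```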

EventEmitter for custom events

Use the EventEmitter class from Node.js to build and handle custom events

const EventEmitter = require("events");

/**
 * Example using EventEmitter to notify custom events
 * The class extends from EventEmitter to use the methods
 * emit: Emit the custom event with the payload
 * on: Handler to listen for a custom event, similar to an event listener in the browser
 */
class Reviews extends EventEmitter {
  constructor() {
    super();
    this.reviews = [];
    this.events = {
      REVIEW_IS_ANSWERED: "review_is_answered",
      REVIEW_IS_REMOVED: "review_is_removed",
    };
  }

  add(review) {
    setTimeout(() => {
      this.reviews.push(review);
      this.emit(this.events.REVIEW_IS_ANSWERED, review);
    }, 2000);
  }

  remove(id) {
    setTimeout(() => {
      this.reviews = this.reviews.filter((reviews) => reviews.id !== id);
      this.emit(this.events.REVIEW_IS_REMOVED, id);
    }, 2500);
  }
}

// Create the reviews
const reviews = new Reviews();

// Add two reviews
reviews.add({ id: 1, message: "The food is correct" });
reviews.add({ id: 2, message: "The food is bad" });

// Listen for the custom event when the review is answered
reviews.on(reviews.events.REVIEW_IS_ANSWERED, (newReview) => {
  console.log("the review was answered", newReview);
});        

Promisify legacy code with callbacks

Use the promisify helper from Node to convert callback-style APIs to promises and avoid callback hell

const { promisify } = require("util");

// Example legacy callback function with schema callback(err, response)
const legacyFunctionAPI = (name, callback) => {
  setTimeout(() => {
    if (Math.random() < 0.5) {
      callback(null, { id: 2, name });
    } else {
      callback(new Error("The API has failed"), null);
    }
  }, 1000);
};

// Convert the legacy callback to a Promise
const promiseLegacyFunctionAPI = promisify(legacyFunctionAPI);

// Example using the legacy
legacyFunctionAPI("victoria", (err, response) => {
  if (err) {
    console.log("Error with legacy callback", err);
  } else {
    console.log("Response with legacy callback", response);
  }
});

// Example using the promisify helper
(async () => {
  try {
    const response = await promiseLegacyFunctionAPI("victoria");
    console.log("Response with promise", response);
  } catch (err) {
    console.log("Error with promise", err);
  }
})();        

Review your Dockerfile

Review the size of your image and use multi-stage builds with separate layers to avoid copying all the content into the final image

Example Dockerfile for a TypeScript App

FROM node:16.13.0-alpine AS base

FROM base AS build
WORKDIR /usr/src/app
COPY . .
# Install all dependencies (the TS build needs devDependencies),
# build, then drop devDependencies before the final stage
RUN npm ci
RUN npm run build
RUN npm prune --production

FROM base AS publish
COPY --from=build /usr/src/app/package.json /usr/src/app/package.json
COPY --from=build /usr/src/app/dist /usr/src/app/dist
COPY --from=build /usr/src/app/node_modules /usr/src/app/node_modules
WORKDIR /usr/src/app
EXPOSE 80
ENV NODE_ENV=production
CMD ["node", "dist/index.js"]


# Build the image -> docker build -t example-node-js --no-cache .
# Run the image -> docker run --env PORT=80 --env CUSTOM_ENV=example -p 80:80 example-node-js        

Use CHANGELOG file

Expose a CHANGELOG.md or HISTORY.md so external users know what changed between versions.

It’s possible to automate this using tools like auto-changelog or release-please

Apply Compression

It is recommended to use compression for requests and responses to lower the bandwidth; it’s possible to handle it in the application or via nginx or another API gateway.

Audit packages

Periodically review updates of the packages used to avoid vulnerability problems; it’s possible to run this check in CI pipelines or in git pre-commit hooks

For example, `npm audit` or `yarn audit`

Version of Node LTS

Stay up to date with the releases of the chosen Node version

Force the correct version for development using nvm and an .nvmrc file

v16.14.0        

Force the engine in your package.json

"engines": {
    "node": ">=16.14.0 <18.0.0"
}

Logger

Track your logs in the application using an external library like winston, pino or bunyan, and connect it with an external provider like New Relic or Datadog
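A minimal sketch of what these libraries do: emit one JSON object per line so a provider like Datadog or New Relic can parse the fields (pino, for example, produces JSON lines like this by default):

```javascript
// Minimal structured logger: one JSON object per line, with a timestamp,
// level and message plus any extra metadata
function formatLog(level, message, meta = {}) {
  return JSON.stringify({
    timestamp: new Date().toISOString(),
    level,
    message,
    ...meta,
  });
}

const logger = {
  info: (msg, meta) => console.log(formatLog("info", msg, meta)),
  error: (msg, meta) => console.error(formatLog("error", msg, meta)),
};

logger.info("App running", { port: 3000 });
logger.error("Request failed", { route: "/my-service", status: 500 });
```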

Stay updated with news

Follow community blogs and repositories to keep up with new changes

