ODK Postgres Database Migration and Backup: A Simple Node.js Script

A simple node.js migration script can help you migrate your ODK Postgres database from one database server to another.

Keywords: Open Data Kit, ODK, Database, Postgres, Migration, Data backup

Introduction

Nowadays, most cloud database providers offer a 7-day backup and restore service. However, these services typically restore the backup into a new database instance rather than into the running ODK Postgres instance. Imagine you are running important surveys and need to back up the survey submissions yourself, so that you know the backup is actually 'backing up' according to your plan.

We also tend to change database providers more often than you might think. For both migrating and backing up databases, here is a handy Node.js script that you can use.

Script File

Copy the script below into a file named `db-migration.js`. Alternatively, you can download the file from the GitHub Gist here.

import { exec } from "child_process";
import pkg from "pg";
import { promisify } from "util";
const execPromise = promisify(exec);
const { Client } = pkg;

// Configuration
const sourceConfig = {
  host: "SOURCE_DATABASE_HOST",
  port: 5432,
  user: "SOURCE_DATABASE_USER",
  password: "SOURCE_DATABASE_PASSWORD",
  database: "postgres", // Connect to the default database to list all databases
  ssl: {
    rejectUnauthorized: false, // This might be necessary for Azure connections
  },
};

const targetConfig = {
  host: "TARGET_DATABASE_HOST",
  port: 5432,
  user: "TARGET_DATABASE_USER",
  password: "TARGET_DATABASE_PASSWORD",
  database: "postgres", // Connect to the default database to create new databases
  ssl: {
    rejectUnauthorized: false, // This might be necessary for Azure connections
  },
};

const pgDumpPath = "pg_dump"; // Ensure pg_dump is in your PATH
const psqlPath = "psql"; // Ensure psql is in your PATH

async function listUserOwnedDatabases(config) {
  const client = new Client(config);
  await client.connect();
  const result = await client.query(
    `
    SELECT d.datname 
    FROM pg_database d
    JOIN pg_roles r ON d.datdba = r.oid
    WHERE r.rolname = $1 AND d.datistemplate = false;
  `,
    [config.user]
  );
  await client.end();
  return result.rows.map((row) => row.datname);
}

async function createDatabase(config, dbName) {
  const client = new Client(config);
  await client.connect();
  await client.query(`CREATE DATABASE "${dbName}";`);
  await client.end();
}

async function dumpDatabase(config, dbName) {
  const dumpFile = `${dbName}.sql`;
  const command = `${pgDumpPath} -h ${config.host} -p ${config.port} -U ${config.user} -d ${dbName} -f ./dumps/${dumpFile}`;
  await execPromise(command, {
    env: { ...process.env, PGPASSWORD: config.password },
  });
  return dumpFile;
}

async function restoreDatabase(config, dbName, dumpFile) {
  const command = `${psqlPath} -h ${config.host} -p ${config.port} -U ${config.user} -d ${dbName} -f ./dumps/${dumpFile}`;
  await execPromise(command, {
    env: { ...process.env, PGPASSWORD: config.password },
  });
}

async function migrateDatabase(dbName) {
  console.log(`Migrating database: ${dbName}`);

  // Step 1: Create a dump of the database locally
  const dumpFile = await dumpDatabase(sourceConfig, dbName);
  console.log(`Dump created: ${dumpFile}`);

  // Step 2: Create a database with the same name on the target server
  await createDatabase(targetConfig, dbName);
  console.log(`Database created on target: ${dbName}`);

  // Step 3: Load the dump into the target database
  await restoreDatabase(targetConfig, dbName, dumpFile);
  console.log(`Database restored on target: ${dbName}`);
  console.log("\n");
}

async function main() {
  try {
    const databases = await listUserOwnedDatabases(sourceConfig);
    console.log("Databases to migrate: ", databases.length);

    for (const dbName of databases) {
      if (
        dbName !== "postgres" &&
        dbName !== "template0" &&
        dbName !== "template1"
      ) {
        await migrateDatabase(dbName);
      }
    }

    console.log("Migration completed successfully.");
  } catch (error) {
    console.error("An error occurred during migration:", error);
  }
}

main();
        

Provide the credentials for the source and target database servers. The target database server should not already contain databases with the same names.
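
If you prefer not to hardcode credentials in the script, you can read them from environment variables instead. Here is a minimal sketch of the source configuration; the variable names (SOURCE_DB_HOST and so on) are only illustrative and not part of the original script:

// Illustrative alternative: read connection details from environment variables
const sourceConfig = {
  host: process.env.SOURCE_DB_HOST,
  port: Number(process.env.SOURCE_DB_PORT || 5432),
  user: process.env.SOURCE_DB_USER,
  password: process.env.SOURCE_DB_PASSWORD,
  database: "postgres", // Connect to the default database to list all databases
  ssl: { rejectUnauthorized: false },
};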

For this to work, it helps to have PostgreSQL installed natively on your local system (that is, not through Docker), so that the pg_dump and psql binaries are available to the Node.js runtime.
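
You can quickly confirm that both tools are available on your PATH before running the migration:

pg_dump --version
psql --version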

The script uses only one npm dependency, `pg`, which you can install with:

npm i pg         
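
Note that the script uses ES module import syntax, so Node.js needs "type": "module" set in your package.json (or you can rename the file to db-migration.mjs). A minimal package.json could look like this; the pg version shown is only an example:

{
  "type": "module",
  "dependencies": {
    "pg": "^8.11.0"
  }
}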

In the script's directory, create a folder for the dumps:

mkdir dumps        
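
If you prefer, the script could also create this folder for itself at startup. This is a small optional addition using Node's built-in fs module, not part of the original script:

import { mkdirSync } from "fs";

// Create the dumps folder if it does not already exist
mkdirSync("./dumps", { recursive: true });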

Then run the script

node db-migration.js        
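
Once the run finishes, you can verify the result by listing the databases on the target server, for example with psql (substitute your own target credentials):

PGPASSWORD=TARGET_DATABASE_PASSWORD psql -h TARGET_DATABASE_HOST -p 5432 -U TARGET_DATABASE_USER -d postgres -c "\l"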

Conclusion

The script logs in to the source database server, takes a pg_dump of each user-owned database, and then creates and restores each database on the target server one by one (the system databases postgres, template0 and template1 are skipped). As long as the target server uses the same database user as the source, you should not run into errors about missing users or roles during the restore. However, if you are migrating to a running ODK Central instance that already has its own root admin, the restored database will replace that existing root admin. If you want to regain the root admin, you can recreate the root admin user using docker compose (as provided in the ODK Central installation guide here):

docker compose exec service odk-cmd --email [email protected] user-create

docker compose exec service odk-cmd --email [email protected] user-promote        
