Database Migrations

Nikola Jovanovic



Working with a database is usually a simple task, as long as you and your team are all working with the same version of it. Each time you want to change the database, you create a script in which the change is defined. You upload it to your project's repository, and all of your team members download and apply the script to their development environments, so they are up to date with the changes you made.

Problems can occur once you need to work with multiple versions of the database. For example, if a new member joins your team and needs a database set up to work with, you would need to create a dump from your own database. But your database might contain experimental changes that you don't want the new team member to have. So you would need to roll back those changes and give the new member a clean version of the database.

Doing this by hand is tedious. Some tools allow you to switch between database versions and migrate between them easily. Here we will be focusing on a tool called Postgrator.


Postgrator is a Node.js database migration library for relational databases. It supports Postgres, MySQL, and SQL Server.

You can check out its repository here: rickbergfalk/postgrator.

It is also available as a CLI tool.


You can install Postgrator with npm. You also need to install the driver for the database engine you will use:

npm init
npm install postgrator

# Database engines:
npm install pg@8
npm install mysql@2
npm install mysql2@2
npm install mssql@6


The way Postgrator works is that it uses a folder in which you store your migration scripts in a certain format, and it uses those scripts to switch to certain versions of your database. Those scripts are what represent database versions. You can use them to update your database, for example, by adding new columns or rows.

Scripts are stored in a folder called migrations, and they are named using a certain format:

migrations/
  |- 001.do.sql
  |- 001.undo.sql
  |- 002.do.optional-description-of-script.sql
  |- 002.undo.optional-description-of-script.sql
  |- 003.do.sql
  |- 003.undo.sql
  |- 004.do.js
  |- 004.undo.js
  |- ... and so on

As you can see, the format of the script names is as follows:


Version: this is a number that Postgrator uses to order and differentiate scripts. It can be in any format you want as long as it is a number. The most popular formats are the sequential one shown in the example and ISO 8601 timestamps.

Action: can only be do or undo. do is used to apply a certain change to the database, and undo is used to roll back that change. Each version has one do, and one undo script.

Description: this is optional, and it is used to describe what a script does. It serves as a label or a tag.

File extension: can be either sql or js. Your scripts can be created using plain SQL, or you can use a JavaScript function to generate SQL. The author recommends using the JavaScript approach because, with it, you can use environment variables. If the script is a JavaScript file, then it must export a function called generateSql which returns an SQL statement, like so:

module.exports.generateSql = function () {
  return (
    "CREATE USER transaction_user WITH PASSWORD '" +
    process.env.TRANSACTION_USER_PASSWORD +
    "'"
  )
}
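Since every do script needs an undo counterpart, a matching JavaScript pair might look like the sketch below (the user name and environment variable are illustrative assumptions). In real migration files, each function would live in its own file and be exported as generateSql:

```javascript
// Sketch of a do/undo pair, written as plain functions for readability.
process.env.TRANSACTION_USER_PASSWORD = 'example' // set here for demonstration only

// 004.do.js would export this as generateSql:
const doSql = () =>
  "CREATE USER transaction_user WITH PASSWORD '" +
  process.env.TRANSACTION_USER_PASSWORD +
  "'"

// 004.undo.js would export this as generateSql, reversing the change:
const undoSql = () => 'DROP USER transaction_user'

console.log(doSql())   // CREATE USER transaction_user WITH PASSWORD 'example'
console.log(undoSql()) // DROP USER transaction_user
```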

The way you run Postgrator is by creating an instance of a Postgrator class and giving it an object with some properties:

const Postgrator = require('postgrator')

const postgrator = new Postgrator({
  migrationDirectory: __dirname + '/migrations',
  driver: 'pg',
  host: '',
  port: 5432,
  database: 'databasename',
  username: 'username',
  password: 'password',
  schemaTable: 'schemaversion',
})
  • migrationDirectory is used to locate the folder in which you store your migration scripts. If you would rather match files with a glob pattern than a fixed directory path, use migrationPattern instead:

const postgrator = new Postgrator({
  migrationPattern: __dirname + '/some/pattern/*',
})
  • driver represents the database engine you are running and can only be: pg, mysql, mysql2 or mssql.

  • host, port, database, username and password are all used for connecting to the database. You can use connectionString instead to connect to the database using its URL directly:

const postgrator = new Postgrator({
  connectionString: 'tcp://username:password@hosturl/databasename',
  ssl: true,
})
  • schemaTable is used to define the schema table name. It is optional because it has the default value of schemaversion. The author states that if you are using Postgres, the schema can be defined using . as a separator, like so:

const postgrator = new Postgrator({
  schemaTable: 'schema_name.table_name',
})

Now you can use Postgrator to migrate to whichever version you want as long as it is included in the migrations folder.

You can migrate to the latest version like so:

postgrator
  .migrate('max')
  .then((appliedMigrations) => console.log(appliedMigrations))
  .catch((error) => console.log(error))

As you can see, the migration is handled as a promise. The migrate() method takes a single parameter that represents the version you want to migrate to. If you don't give it a parameter, or give it 'max', it will migrate to the latest version available.

If you pass any other string, it will try to find a script using the passed-in parameter as the version number:

postgrator
  .migrate('002')
  .then((appliedMigrations) => console.log(appliedMigrations))
  .catch((error) => {
    console.log(error)
    console.log(error.appliedMigrations)
  })

appliedMigrations is used to decorate the error object: it is an array of all the migration objects that were applied before the error was encountered.
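Because migrate() returns a promise, the same flow can also be written with async/await. A sketch, assuming postgrator is the instance created earlier (the migrateTo helper name is our own, not part of Postgrator's API):

```javascript
// Migrate to a given version and surface appliedMigrations on failure.
async function migrateTo(postgrator, version) {
  try {
    const appliedMigrations = await postgrator.migrate(version)
    console.log(appliedMigrations)
    return appliedMigrations
  } catch (error) {
    console.log(error)
    console.log(error.appliedMigrations) // migrations applied before the failure
    throw error
  }
}
```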

The way migrations work is that Postgrator treats the value stored in schemaTable as the current version of the database and compares it against the migration scripts. That is how it determines which action type it needs to use: do or undo. Postgrator applies migration scripts and updates schemaTable accordingly until the version recorded in schemaTable matches the version you specified. When they are the same, it stops and closes the connection.
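The comparison can be sketched roughly like this (a simplification of the logic described above, not Postgrator's actual code):

```javascript
// Simplified sketch: decide whether to run do or undo scripts based on
// the version recorded in schemaTable vs. the requested target version.
function chooseAction(currentVersion, targetVersion) {
  if (targetVersion > currentVersion) return 'do'   // apply newer scripts
  if (targetVersion < currentVersion) return 'undo' // roll newer changes back
  return 'none'                                     // already at the target
}

console.log(chooseAction(1, 3)) // 'do'
console.log(chooseAction(3, 1)) // 'undo'
console.log(chooseAction(2, 2)) // 'none'
```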


Postgrator is very easy to use, and its code base is so clearly written that for some things, you can just read the code to understand what it does. There aren't many tutorials for it besides its official documentation on GitHub, which might be a problem, but it's pretty easy to figure things out yourself.

It has other features that weren't covered here, such as:

  • event-driven logging

  • docker tests

  • checksum validation

You can check all of these out on its GitHub repository.


Nikola Jovanovic


Nikola is a software engineer with a problem-solving mindset, in love with JavaScript and its whole ecosystem. Passionate about both frontend and backend work.
