@continuum/node-pg-migrate

v0.5.6

Synchronous (async/await) migration assistant for PostgreSQL

node-pg-migrate

This is a CLI module that assists in creating and running Postgres migrations using async/await.

How it works

This module generates template migration files in a directory of your choosing, each containing empty function closures for up and down migrations. These closures are given a knex client connection per your provided database configuration.

Running a migration creates a table on your database's public schema called node_pg_migrate that tracks the state of any migrations executed.

You'll notice all generated migrations are async functions, which means at least Node 7 is required. The module parses and sorts migrations into a sequence and runs them one at a time, awaiting each. This gives you, the migration author, control over what actually comprises your migration and lets you capture errors for 'optional' (failure-allowed) migrations.
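The run loop described above can be sketched roughly like this (a simplified illustration, not the module's actual implementation; the inline migration objects stand in for loaded migration files):

```javascript
// Sketch: run migrations one at a time, in sorted filename order,
// capturing errors so an 'optional' migration can fail without
// aborting the whole run. Migration names here are hypothetical.
const migrations = [
  { name: '01.create-users.migration.js', optional: false, up: async () => 'users created' },
  { name: '02.seed-admins.migration.js', optional: true, up: async () => { throw new Error('seed failed'); } },
];

async function runAll(migrations) {
  const results = [];
  // Sort by filename so the sequential/timestamp prefixes define the order.
  const ordered = [...migrations].sort((a, b) => a.name.localeCompare(b.name));
  for (const m of ordered) {
    try {
      results.push({ name: m.name, status: 'ok', value: await m.up() });
    } catch (err) {
      if (!m.optional) throw err; // a required migration still aborts the run
      results.push({ name: m.name, status: 'skipped', error: err.message });
    }
  }
  return results;
}

runAll(migrations).then((results) => console.log(JSON.stringify(results, null, 2)));
```

Because each migration is awaited inside the loop, a later migration never starts before an earlier one finishes.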

Why another migration library?

I (@ktstowell) personally dislike the fragmentation of JavaScript libraries, but I felt compelled to build this for the following reasons:

  • async/await
    • Every other library I found was still using promises, sometimes even a single global promise per execution instance. I weighed the cost of trying to PR a change to those libraries against writing a new library from scratch, and obviously I favoured the new library.
  • Exception management
    • At the time I was looking for a migration library, I needed something that would let me choose how to handle exceptions in my migrations. It seemed that even if I caught or swallowed them, the runner would still abort the process.
  • Exclusions
    • I needed the ability, in some circumstances, to exclude certain migrations from running. The use case: locally, using Docker, I didn't need to run certain low-level setup migrations because Docker does that for you if you provide the right environment variables. I'm also very averse to having anything like if (env === 'local') in my code; that should be the responsibility of whatever orchestrates the migrations. Now my npm scripts have npm run migrate and npm run migrate:local, so the code itself remains environment agnostic.
  • Schema vs Data
    • In many situations it's important to decouple a schema change from a data change, and I wanted to be able to represent that structurally. With the types API you can specify which kinds of migrations you want, and sub-folders will be created for them.
  • High Availability.
    • Many systems implement a migration process that involves provisioning a new database on the fly, running migrations on the new target, and then importing existing data from the source. I wanted to represent this as well with the mode: 'high-availability' option, which will inject a target and a source connection into your migration files.

Installation

npm install @continuum/node-pg-migrate

Usage

Add "npgm": "node ./node_modules/@continuum/node-pg-migrate" to the scripts section of your application's package.json so you don't have to type the whole binary path.
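For example, your scripts section might look like this (the migrate/migrate:local split mirrors the exclusion pattern described earlier; the excluded migration name is hypothetical):

```json
{
  "scripts": {
    "npgm": "node ./node_modules/@continuum/node-pg-migrate",
    "migrate": "npm run npgm up",
    "migrate:local": "npm run npgm up -- --exclude 01.provision.migration.js"
  }
}
```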

Contributing

If you see any issues feel free to report them to the issues tab on this repo. If you have an improvement/change you'd like to make please submit a PR.

If approved and the PR is merged to master, we will determine where the change falls under the Semver convention and update the package version and publish accordingly.

Configuration

  • .npgmrc files are supported and can contain the following global options:
    • ordering:
      • sequential: default option. Will create files numbered as 01.<name>.migration.js
      • timestamp: Will create files as <Date.now()>.<name>.migration.js
    • directory: folder where your migrations will live. Will be created for you if it doesn't exist.
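The two ordering options translate to filenames like these (a sketch; this helper is hypothetical and not part of the module):

```javascript
// Sketch: produce migration filenames for the two ordering options
// described above. `ordering` mirrors the .npgmrc setting.
function migrationFilename(name, ordering, seq = 1, now = Date.now) {
  if (ordering === 'timestamp') {
    return `${now()}.${name}.migration.js`;
  }
  // sequential (the default): zero-padded counter prefix
  return `${String(seq).padStart(2, '0')}.${name}.migration.js`;
}

console.log(migrationFilename('foo', 'sequential'));     // 01.foo.migration.js
console.log(migrationFilename('foo', 'sequential', 12)); // 12.foo.migration.js
console.log(migrationFilename('foo', 'timestamp'));
```

Either way, sorting the resulting filenames lexically yields the execution order.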

Database Configuration

Database connection config can be loaded in one of two ways:

  • Environment variables:

    • We use the dotenv package to load any .env that exists in process.cwd(), which is the root of the project you are invoking this module from. We support the following variable names:
    POSTGRES_HOST=
    POSTGRES_PORT=
    POSTGRES_DB= | POSTGRES_DATABASE=
    POSTGRES_PASSWORD=
    POSTGRES_USER=
    POSTGRES_SCHEMA=
  • Alternatively, we also accept command-line arguments, which take precedence over environment variables when provided.

    • --user
    • --host
    • --port
    • --database
    • --password
    • --schema
  • You can also pass all of these at once as a single Postgres connection string via --connection.
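For example, a local .env might look like this (all values are placeholders):

```
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=myapp
POSTGRES_USER=postgres
POSTGRES_PASSWORD=secret
POSTGRES_SCHEMA=public
```

The equivalent single-argument form would be --connection postgres://postgres:secret@localhost:5432/myapp.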

Commands

Global Arguments

These arguments can be passed as --<arg name> to all CLI methods.

| Name | values | required | default |
|---|---|---|---|
| user | db user name | yes, here or in .env | null |
| host | db host | yes, here or in .env | null |
| port | db port | yes, here or in .env | null |
| password | db password | yes, here or in .env | null |
| schema | db schema | yes, here or in .env | null |
| connection | db conn string | this or the above args | null |


Create a migration

  • npm run npgm create (<name> | --name)

| Name | values | required | default |
|---|---|---|---|
| ordering | sequential, timestamp | no | sequential |
| directory | path to your migrations dir | yes, can be in .rc | 'migrations' |
| types | schema, data | no | schema |
| mode | standard, high-availability | no | 'standard' |

Create a migration before another existing migration

  • npm run npgm create -- --name <name> --before <othername>

Run UP migrations

  • npm run npgm up

| Name | values | required | default |
|---|---|---|---|
| include | any specific migrations to run; if empty, all run. Use one --flag per migration | no | [] |
| exclude | any specific migrations to ignore; if empty, all run. Use one --flag per migration | no | [] |

Up for Multiple Types

We recommend using an .npgmrc file so your configuration is consistent across method calls. If you have created your migrations with both schema and data types, then when you run up it will execute your migration types in the order they were provided; i.e. ["schema", "data"] or --types=schema --types=data will both run schema migrations first.

This works because the create config supports any number of types, should you choose; it is not actually coupled to the words schema or data.
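The type ordering for up and down runs can be illustrated with a small hypothetical helper:

```javascript
// Sketch: given the configured types, run up in the configured order
// and down in reverse. Type names are arbitrary, per the note above.
function executionOrder(types, direction) {
  return direction === 'down' ? [...types].reverse() : [...types];
}

console.log(executionOrder(['schema', 'data'], 'up'));   // schema first, then data
console.log(executionOrder(['schema', 'data'], 'down')); // reversed: data first
```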

Run DOWN migrations

  • npm run npgm down

| Name | values | required | default |
|---|---|---|---|
| directory | path to your migrations dir | yes, can be in .rc | 'migrations' |
| include | any specific migrations to run; if empty, all run. Use one --flag per migration | no | [] |
| exclude | any specific migrations to ignore; if empty, all run. Use one --flag per migration | no | [] |

See up instructions for multiple types.

If you have set multiple types in your configuration, down migrations will run the types in reverse order, as well as the migrations themselves.

Reset migrations

  • npm run npgm reset

This simply drops the migrations table from your database. It DOES NOT run down migrations. This is typically most useful in a development setting.

Example - Standard

In your repository where migrations will be stored:

npm run npgm create foo

This will create:

./migrations/<ordering>.foo.migration.js

The file will contain:

'use strict';

/***********************************************************************************************************************************************
 * NODE DB MIGRATE - FOO
 ***********************************************************************************************************************************************
 * @author File generated by @liveaxle/node-pg-migrate
 * @description
 * 
 */

/**
 * [exports description]
 * @type {Object}
 */
module.exports = {
  up, down
};

/**
 * [up description]
 * @return {[type]} [description]
 */
async function up(client) {
  
}

/**
 * [down description]
 * @return {[type]} [description]
 */
async function down(client) {

}

In each of the functions above you can now write whatever the "foo" migration means to your application. Down migrations should be the opposite of your up migrations, in both action and order.

Then, at some later point when you actually need to execute your migrations, you can run npm run npgm up and it will execute the SQL in the up closure above in sequence.
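For instance, a filled-in version of the generated file might look like this (the table and SQL are invented for illustration; the injected client is assumed to expose a query method, as a typical Postgres client does):

```javascript
'use strict';

module.exports = { up, down };

// Up: create the table this migration is responsible for.
async function up(client) {
  await client.query(`
    CREATE TABLE IF NOT EXISTS foo (
      id SERIAL PRIMARY KEY,
      label TEXT NOT NULL
    );
  `);
}

// Down: the opposite action, undoing the up migration.
async function down(client) {
  await client.query('DROP TABLE IF EXISTS foo;');
}
```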

Example - High Availability

npgm create foo --mode=high-availability

'use strict';

/***********************************************************************************************************************************************
 * NODE DB MIGRATE - FOO
 ***********************************************************************************************************************************************
 * @author File generated by @liveaxle/node-pg-migrate
 * @description
 * 
 */

/**
 * [exports description]
 * @type {Object}
 */
module.exports = {
  up, down
};

/**
 * [up description]
 * @return {[type]} [description]
 */
async function up(target, source) {
  
}

/**
 * [down description]
 * @return {[type]} [description]
 */
async function down(target, source) {

}

The intent of this mode is to inject both db clients into your migration closure, enabling you to export data from the source and apply it to the target.
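A sketch of such a migration, copying rows from the source database into the newly provisioned target (the table name and the clients' query API are illustrative assumptions):

```javascript
// Sketch: a high-availability migration that copies existing rows from
// the source database into the freshly provisioned target.
async function up(target, source) {
  const { rows } = await source.query('SELECT id, label FROM foo;');
  for (const row of rows) {
    await target.query('INSERT INTO foo (id, label) VALUES ($1, $2);', [row.id, row.label]);
  }
  return rows.length;
}

async function down(target, source) {
  // Opposite action: clear the copied rows from the target.
  await target.query('DELETE FROM foo;');
}
```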

Example - Schema and Data

npgm create foo --types=data --types=schema

This will create a migration folder structure like:

/migrations
  /data
    <ordering>.foo.migration.js
  /schema
    <ordering>.foo.migration.js