
pg-ez v1.0.2

Library for simplifying and enhancing the node-postgres (pg) API

pg-ez - node-postgres made easy

node-postgres is not particularly difficult to use and has a well-documented API, but as that documentation states, "node-postgres strives to be low level and un-opinionated." pg-ez, on the other hand, strives to be high-level and (more) opinionated, allowing you to get up and querying within seconds rather than minutes. It follows the best practices recommended by node-postgres, so you don't have to concern yourself with things like releasing clients back to the pool.

Installation

npm install pg-ez

Testing

npm test

The tests exercise pg-ez's integration with node-postgres but still run quickly. They require the environment variables PGUSER, PGPASSWORD, PGHOST, PGPORT, and PGDATABASE to be defined.
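For example, you might export those variables in your shell before running npm test. The values below are placeholders for a hypothetical local Postgres setup; adjust them to match yours:

```shell
# Placeholder values for a hypothetical local Postgres instance;
# substitute your own credentials, host, port, and database name.
export PGUSER=postgres
export PGPASSWORD=sekrit
export PGHOST=localhost
export PGPORT=5432
export PGDATABASE=pg_ez_test
```

With those set, npm test (and pg-ez itself, when given no connection argument) will pick them up through pg's standard environment-variable lookup.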

Documentation

Establishing a database connection

Requiring pg-ez and establishing a database connection is done in a single line. Like pg, you can pass to it a connection string or a connection object; if you pass neither, pg-ez will, like pg, try to establish a connection using environment variables.

Example 1: passing in a connection string

const db = require('pg-ez')('postgresql://admin:sekrit@localhost:5432/mydb');

Example 2: passing in a connection object

const db = require('pg-ez')({user: 'admin', password: 'sekrit', host: 'localhost', port: 5432, database: 'mydb'});

Example 3: passing in nothing

// NOTE: requires that there are defined environment variables for PGUSER, PGPASSWORD, PGHOST, PGPORT, and PGDATABASE
const db = require('pg-ez')();

Queries

Querying in pg-ez is nearly the same as querying in pg: simply call the exec method and pass to it a query string and parameters, or pass to it a query configuration object. Like pg, pg-ez supports 3 flavors of asynchronous querying: async / await, promises, and callbacks.

Example 1: using async / await

(async () => {
  try {
    const result = await db.exec('SELECT $1::VARCHAR AS first_name, $2::VARCHAR AS last_name, $3::INT AS age', ['Peter', 'Gibbons', 32]);
    console.log(result.rows);
  } catch (err) {
    console.error('ERR: ' + err);
  }
})();

Example 2: using promises

db.exec('SELECT $1::VARCHAR AS first_name, $2::VARCHAR AS last_name, $3::INT AS age', ['Peter', 'Gibbons', 32])
  .then(result => {
    console.log(result.rows);
  })
  .catch(err => {
    console.error('ERR: ' + err);
  });

Example 3: using callbacks

db.exec('SELECT $1::VARCHAR AS first_name, $2::VARCHAR AS last_name, $3::INT AS age', ['Peter', 'Gibbons', 32], (err, result) => {
  if (err) console.error('ERR: ' + err);
  else console.log(result.rows);
});

Streams

Big data can bring big problems. If you have a query yielding millions of rows, you probably don't want to load the entire result set into memory and thereby spike your memory usage. Streams to the rescue! The stream method returns a native promise, not a stream; however, this particular promise supports a pipe method, allowing you to pass data through and chain together pipes just as though you were dealing with a stream. An error thrown at any point in the pipeline will propagate and can be caught, as any promise error can be, with a catch method (if using promises) or a try / catch block (if using async / await).

Example 1: stream JSON transform of results to http response

const JSONStream = require('JSONStream');
const http = require('http');
http.createServer((req, res) => {
  res.setHeader('Content-Type', 'application/json');
  db.stream('SELECT generate_series(0, $1, 1) x, generate_series(0, $1, 2) y', [1000])
    .pipe(JSONStream.stringify())
    .pipe(res);
}).listen(1337);

Example 2: stream comma-delimited transform of results to CSV file

const csvStream = require('csv-write-stream')({headers: ['x', 'y']});
const fs = require('fs');
const fileStream = fs.createWriteStream('./query-output.csv');

db.stream({text: 'SELECT generate_series(0, $1, 1) x, generate_series(0, $1, 2) y', values: [1000], rowMode: 'array'})
  .pipe(csvStream)
  .pipe(fileStream)
  .then(() => {
    console.log('Streaming complete!');
  })
  .catch(err => {
    console.error('ERR: ' + err);
  });

Transactions

Transactions are implemented intuitively: simply wrap all your desired statements within a transaction "block." The transaction method returns a native promise, so you can do follow-up processing with then(), or you can use await if your transaction invocation is inside an async function. An error in any query within the transaction block will automatically trigger a rollback, but because transaction returns a promise, you can catch the error to perform additional error handling.

Example 1: Using async / await

(async () => {
  try {
    await db.transaction(async (client) => {
      // NOTE: It's important that you execute queries against the client passed in as the lone argument to this callback function
      await client.exec('CREATE TEMP TABLE pg_ez_test_transaction (id SERIAL, first_name VARCHAR(255), last_name VARCHAR(255))');
      await client.exec('INSERT INTO pg_ez_test_transaction (first_name, last_name) VALUES ($1, $2)', ['Michael', 'Bolton']);
    });
    console.log('Done!');
  } catch (err) {
    console.error('ERR: ' + err);
  }
})();

Example 2: Using promises

db.transaction(async (client) => {
  // NOTE: It's important that you execute queries against the client passed in as the lone argument to this callback function
  await client.exec('CREATE TEMP TABLE pg_ez_test_transaction (id SERIAL, first_name VARCHAR(255), last_name VARCHAR(255))');
  await client.exec('INSERT INTO pg_ez_test_transaction (first_name, last_name) VALUES ($1, $2)', ['Michael', 'Bolton']);
})
.then(function() {
  console.log('Done!');
})
.catch(function(err) {
  console.error('ERR: ' + err);
});