batch-writer

v0.0.2

Manages stream flow and maximizes resource utilization for write streams performing operations in batches

This module provides a write stream that uses a maximum number of parallel write operations and a configured batch size to keep the stream moving while preventing Node from firing too many writes at a time. When the maximum number of operations has been reached, the stream is paused until one of them completes. The write stream can optionally be configured in object mode.
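
To make this flow control concrete, here is a minimal sketch of the technique as a stream.Writable subclass. It is illustrative only, not this module's actual source; the class and field names are hypothetical.

const { Writable } = require('stream');

// Hypothetical illustration of batching plus parallel-operation backpressure
class BatchingWritable extends Writable {
    constructor(batchSize, parallelOperations, writeOperation, streamOptions = {}) {
        super(streamOptions);
        this.batchSize = batchSize;
        this.parallelOperations = parallelOperations;
        this.writeOperation = writeOperation;
        this.batch = [];
        this.inFlight = 0;
        this.resumeCallback = null; // held while at max parallel operations
        this.finalCallback = null;  // held until in-flight operations drain
    }

    _write(chunk, encoding, callback) {
        this.batch.push(chunk);
        if (this.batch.length < this.batchSize) return callback();
        this._launch(this.batch.splice(0), callback);
    }

    _launch(data, callback) {
        this.inFlight++;
        this.writeOperation(data)
            .then(() => this._operationDone())
            .catch((err) => this.destroy(err)); // rejected writes become stream errors
        // Withholding the callback pauses the stream until an operation completes
        if (this.inFlight >= this.parallelOperations) this.resumeCallback = callback;
        else callback();
    }

    _operationDone() {
        this.inFlight--;
        if (this.resumeCallback) {
            const resume = this.resumeCallback;
            this.resumeCallback = null;
            resume();
        }
        if (this.inFlight === 0 && this.finalCallback) this.finalCallback();
    }

    _final(callback) {
        // Flush the last partial batch, then wait for in-flight writes to finish
        if (this.batch.length > 0) this._launch(this.batch.splice(0), () => {});
        if (this.inFlight === 0) return callback();
        this.finalCallback = callback;
    }
}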

Instantiation

new BatchWriter(batchSize, parallelOperations, writeOperation, streamOptions)
  • batchSize: Maximum number of chunks, or objects, in a batch before a new write operation starts.

  • parallelOperations: Maximum number of write operations allowed to run in parallel before the stream is paused (e.g., the number of database connections).

  • writeOperation: The write operation to run for each batch of data. This must be a function that takes a single parameter, data (the batch array), and returns a Promise that signals when the write operation has completed or failed. Errors should be passed through Promise.reject(err), which will then be emitted on the stream. On success, resolve() should be called so that another operation can begin if one is available. Any data returned in the resolve is ignored, so no further processing can be done after the write operation for that batch of data.

    // The array of batched data is available as the single parameter
    (data) =>
        new Promise((resolve, reject) => {
            // Perform the write operation here; call resolve() on success
            // or reject(err) to emit the error on the stream
        });
  • streamOptions (optional): The stream options used to configure the underlying write stream implementation. By default, this is set to {}. A common use of these options is to enable object mode, as in the minimal instantiation below.
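
For instance, a minimal instantiation might look like the following; logBatch is a hypothetical write operation that just logs each batch.

// A trivial write operation: one parameter for the batch array
const logBatch = (data) =>
    new Promise((resolve) => {
        console.log(`writing batch of ${data.length} items`);
        resolve();
    });

// Batches of 100 objects, at most 4 operations in flight, object mode on
const writer = new BatchWriter(100, 4, logBatch, { objectMode: true });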

Example Implementation

const BatchWriter = require('batch-writer'); // or the scoped name the package is published under
const mysql = require('mysql');

// Batch 1000 data objects per insert
const BATCH_SIZE = 1000;
// Limit insert operations to 5 at a time
const PARALLEL_OPS = 5;
// Turn on object mode (optional configuration)
const streamOptions = {
    objectMode: true
};

// Create enough connections for the maximum number of simultaneous write operations
const pool = mysql.createPool({
    connectionLimit: PARALLEL_OPS,
    ...
});

// Set up a write operation with one parameter for the array of data
const writeOp = (data) =>
    new Promise((resolve, reject) => {
        const insertQuery = /* Format data into insert statement */;
        pool.getConnection((err, con) => {
            // Reject rather than throw so the error is emitted on the stream
            if (err) return reject(err);
            con.query(insertQuery)
                .on('error', err => {
                    console.error(err);
                    con.release();
                    reject(err);
                })
                .on('result', result => {
                    console.log(result);
                    con.release();
                    resolve();
                });
        });
    });

const dbWriter = new BatchWriter(BATCH_SIZE, PARALLEL_OPS, writeOp, streamOptions);

const readStream = /* Create read stream to obtain data to insert */;

// Pipe the read stream through the write stream to batch up the inserts
// into parallel operations
readStream.pipe(dbWriter).on('finish', () => {
    console.log('Writing Complete!');
});
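
Because rejected write operations are emitted as errors on the stream, it is also worth attaching an 'error' handler next to the 'finish' handler above:

// Rejected batches surface here as stream errors
dbWriter.on('error', (err) => {
    console.error('Batch write failed:', err);
});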