stream-chopper

Chop a single stream of data into a series of readable streams.

Stream Chopper is useful in situations where you have a stream of data that you want to chop up into smaller pieces, based on either time or size. Each piece is emitted as a readable stream (called an output stream).

Possible use-cases include log rotation, splitting up large data sets, or chopping up a live stream of data into finite chunks that can then be stored.

Control how data is split

Sometimes it's important to ensure that a chunk written to the input stream isn't split up and divided over two output streams. Stream Chopper allows you to specify the chopping algorithm used (via the type option) when a chunk is too large to fit into the current output stream.

By default a chunk too large to fit in the current output stream is split between it and the next. Alternatively you can decide to either allow the chunk to "overflow" the size limit, in which case it will be written to the current output stream, or to "underflow" the size limit, in which case the current output stream will be ended and the chunk written to the next output stream.

Installation

npm install stream-chopper --save

Usage

Example app:

const StreamChopper = require('stream-chopper')

const chopper = new StreamChopper({
  size: 30,                    // chop stream when it reaches 30 bytes,
  time: 10000,                 // or when it's been open for 10s,
  type: StreamChopper.overflow // but allow stream to exceed size slightly
})

chopper.on('stream', function (stream, next) {
  console.log('>> Got a new stream! <<')
  stream.pipe(process.stdout)
  stream.on('end', next) // call next when you're ready to receive a new stream
})

chopper.write('This write contains more than 30 bytes\n')
chopper.write('This write contains less\n')
chopper.write('This is the last write\n')

Output:

>> Got a new stream! <<
This write contains more than 30 bytes
>> Got a new stream! <<
This write contains less
This is the last write

API

chopper = new StreamChopper([options])

Instantiate a StreamChopper instance. StreamChopper is a writable stream.

Takes an optional options object which, besides the normal options accepted by the Writable class, accepts the following config options:

  • size - The maximum number of bytes that can be written to the chopper stream before a new output stream is emitted (default: Infinity)
  • time - The maximum number of milliseconds that an output stream can be in use before a new output stream is emitted (default: -1 which means no limit)
  • type - Change the algorithm used to determine how a written chunk that cannot fit into the current output stream should be handled. The following values are possible:
    • StreamChopper.split - Fit as much data from the chunk as possible into the current stream and write the remainder to the next stream (default)
    • StreamChopper.overflow - Allow the entire chunk to be written to the current stream. After writing, the stream is ended
    • StreamChopper.underflow - End the current output stream and write the entire chunk to the next stream
  • transform - An optional function that returns a transform stream used for transforming the data in some way (e.g. a zlib Gzip stream). If used, the size option will count towards the size of the output chunks. This config option cannot be used together with the StreamChopper.split type

If type is StreamChopper.underflow and the size of the chunk to be written is larger than size, an error is emitted.

Event: stream

Emitted every time a new output stream is ready. You must listen for this event.

The listener function is called with two arguments:

  • stream - A readable output stream
  • next - A function you must call when you're ready to receive a new output stream. If called with an error, the chopper stream is destroyed

chopper.size

The maximum number of bytes that can be written to the chopper stream before a new output stream is emitted.

Use this property to override it with a new value. The new value will take effect immediately on the current stream.

chopper.time

The maximum number of milliseconds that an output stream can be in use before a new output stream is emitted.

Use this property to override it with a new value. The new value will take effect when the next stream is initialized. To change the current timer, see chopper.resetTimer().

Set to -1 for no time limit.

chopper.type

The algorithm used to determine how a written chunk that cannot fit into the current output stream should be handled. The following values are possible:

  • StreamChopper.split - Fit as much data from the chunk as possible into the current stream and write the remainder to the next stream
  • StreamChopper.overflow - Allow the entire chunk to be written to the current stream. After writing, the stream is ended
  • StreamChopper.underflow - End the current output stream and write the entire chunk to the next stream

Use this property to override it with a new value. The new value will take effect immediately on the current stream.

chopper.chop([callback])

Manually chop the stream. Forces the current output stream to end even if its size limit hasn't been reached or its timer hasn't expired yet.

Arguments:

  • callback - An optional callback which will be called once the output stream has ended

chopper.resetTimer([time])

Use this function to reset the current timer (configured via the time config option). Calling this function will force the current timer to start over.

If the optional time argument is provided, this value is used as the new time. This is equivalent to calling:

chopper.time = time
chopper.resetTimer()

If the function is called with time set to -1, the current timer is cancelled and the time limit is disabled for all future streams.

License

MIT