# accum

v0.3.7

Simple write stream which accumulates or collects the data from a stream. Pipe your stream into this to get all the data as buffer, string, or raw array. (streams2)
## Installation

```bash
npm install accum
```
## Usage

`accum` provides several factory methods:
- The default automatic method, `accum([options], listenerFn)`, constructs a write stream which checks whether the first chunk is a Buffer and, if so, returns a concatenated Buffer of all the data; otherwise, if it is a string, returns a concatenated string; otherwise it returns a raw array. The `listenerFn` signature is `function(alldata)`. The `listenerFn` is called after all the data is received, just prior to the `end` event being emitted. The `options` parameter is passed to the underlying writable stream constructor (see the Node.js stream docs) and can be omitted.
```js
var accum = require('accum');

rstream
  .pipe(accum(function (alldata) {
    // use the accumulated data - alldata will be Buffer, string, or []
  }));
```
For a more deterministic result, use one of the following:
- `accum.buffer([options], listenerFn)` - constructs a write stream which converts everything into a Buffer, concatenates it, and calls the `listenerFn` with the buffer. The `listenerFn` signature is `function(buffer)`. The `listenerFn` is called after all the data is received, just prior to the `end` event being emitted. The `options` parameter is passed to the underlying writable stream constructor (see the Node.js stream docs) and can be omitted.
```js
var accum = require('accum');

rstream
  .pipe(accum.buffer(function (buffer) {
    // use the accumulated data - buffer which is a Buffer
  }));
```
- `accum.string([options], listenerFn)` - constructs a write stream which concatenates everything into a string. Buffer data is converted to a string using the optional `encoding`, which defaults to 'utf8'. Other data is simply converted using `.toString()`. The `listenerFn` signature is `function(string)`. The `listenerFn` is called after all the data is received, just prior to the `end` event being emitted. The `options` parameter is passed to the underlying writable stream constructor (see the Node.js stream docs) and can be omitted.
```js
var accum = require('accum');

rstream
  .pipe(accum.string({ encoding: 'utf8' }, function (string) {
    // use the accumulated data - string which is a utf8 string
  }));
```
- `accum.array([options], listenerFn)` - constructs a write stream which concatenates everything into an array without any conversion; the `listenerFn` receives the accumulated array on end. The `listenerFn` signature is `function(arr)`. The `listenerFn` is called after all the data is received, just prior to the `end` event being emitted. The `options` parameter is passed to the underlying writable stream constructor (see the Node.js stream docs) and can be omitted.
```js
var accum = require('accum');

rstream
  .pipe(accum.array(function (array) {
    // use the accumulated data - array which is a raw unconverted array of data chunks
  }));
```
If you are intending to use an object stream, set the `options` to `{ objectMode: true }`:
```js
var accum = require('accum');

rstream
  .pipe(accum.array({ objectMode: true }, function (array) {
    // use the accumulated data - array of objects
  }));
```
## Error handling
Node.js `stream.pipe` does not forward errors, and neither do many pass-through stream implementations, so the recommended way to catch errors is either to attach error handlers to each stream or to use domains.
```js
var domain = require('domain');
var accum = require('accum');

var d = domain.create();
d.on('error', handleAllErrors);
d.run(function () {
  rstream.pipe(accum(function (alldata) {
    // use alldata
  }));
});
```
## Goals

- Easy-to-use write stream which accumulates the data from a stream
- Uses streams2 functionality from Node 0.10+ but is backwards compatible with Node 0.8
- Ability to automatically adapt to the type of the first data packet, or to coerce the data to a particular format
- Ability to receive just the raw array of data chunks
- Tested with binary data
- Handles multibyte utf8 data which may have been split across packets in a stream
## Why

Rather than manually accumulating streams, put all the best practices into this one module. Subtle errors can occur if utf8 strings happen to be split in the middle of a character, so conversion and concatenation need to be done properly.
Other projects that also accumulate data in slightly different ways:
- https://github.com/maxogden/concat-stream
- https://github.com/polotek/data-collector-stream
- https://github.com/Weltschmerz/Accumulate
## Tested with Node versions
- 0.8
- 0.10
- 0.11
- 0.12
- 4.x
- 6.x
- 7.x
## Get involved
If you have input or ideas or would like to get involved, you may:
- contact me via twitter @jeffbski - http://twitter.com/jeffbski
- open an issue on github to begin a discussion - https://github.com/jeffbski/accum/issues
- fork the repo and send a pull request (ideally with tests) - https://github.com/jeffbski/accum