elasticsearch-batch-stream
v1.1.5
A write stream that creates batches of elasticsearch bulk operations.
Example
The ElasticSearch client has a bulk function for writing multiple documents in a single request, but since a stream emits a write for each individual document, those operations are not grouped together. This package wraps the bulk function in a write stream that buffers the operations and passes them on as batches to the bulk function. For example, with batches of 500 docs each, indexing 100,000 documents takes 200 API calls to ElasticSearch instead of 100,000, which improves speed.
const through2 = require('through2')
const elasticsearch = require('elasticsearch')
const bulkWriteStream = require('elasticsearch-batch-stream')

const client = new elasticsearch.Client()

const docTransformStream = through2.obj(function (chunk, enc, callback) {
  // convert chunk => doc
  const doc = { index: 'myindex', type: 'mytype', id: '12345', action: 'index', doc: { name: 'test' } }
  callback(null, doc)
})

sourceReadStream().pipe(docTransformStream).pipe(bulkWriteStream({ client, size: 500 }))
Installation
$ npm install elasticsearch-batch-stream
API
bulkWriteStream(options = { client, size })
Creates the write stream to ElasticSearch.
options
The options object argument is required and should at least include the ElasticSearch client object.
client
An instance of the ElasticSearch client, e.g. new elasticsearch.Client().
size
Number of stream operations to group together in the bulk command (default = 100).
Contributing
If you would like to help out with some code, check the contribution details.
Not a coder, but still want to support? Have a look at the available donation options.
License
Licensed under MIT.