promise-pool-js
A Promise pool implementation supporting sequential execution of promises across a pool.
Installation
Using NPM
npm install --save promise-pool-js
Description
This module provides a promise pool implementation which allows developers to sequentially execute multiple promise-returning functions across a promise pool of dynamic size.
Use-cases for this module range from rate limiting (e.g. when it is necessary to throttle the number of concurrent requests issued against a given service) to basic sequential promise execution, segmentation of promise execution, and more.
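As a quick illustration of the rate-limiting use-case, the sketch below throttles a batch of requests so that only a few run concurrently. The fetchUser function and the list of ids are hypothetical placeholders, and the pool APIs used here are described in the Usage section below.
// A minimal rate-limiting sketch; `fetchUser` is a hypothetical promise-returning function.
const Pool = require('promise-pool-js');

// A pool of 3 executors: at most 3 requests should be in flight at any time.
const pool = new Pool(3);

const ids = [1, 2, 3, 4, 5, 6];

// Enqueue one promise-returning function per request and collect all results.
pool.enqueueMany(ids.map((id) => () => fetchUser(id))).then(console.log);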
Usage
To include the promise-pool-js module in your application, you must first require it as follows.
const Pool = require('promise-pool-js');
Instantiating the pool
The promise pool can be instantiated using the constructor function returned by require.
const pool = new Pool(5);
Introducing strategies
In order to allow users of this library to choose how to balance the execution of promises within the pool, the strategy pattern is used to inject external behaviors at runtime. Three strategies are built in, but you can also provide your own implementation for advanced use-cases.
Round-robin strategy
This is the default strategy, loaded by the pool when no strategy has been specified. Its behavior is simple: promises are inserted sequentially in the pool, from the first executor to the last, looping back to the first one once every executor has been used.
Note that while the insertion is sequential, the execution of the promises may not be, as it depends on what each promise is actually doing.
If you'd like to explicitly specify the round-robin strategy, you can do so by passing an options object to the Pool constructor:
const pool = new Pool({
size: 5,
strategy: 'round-robin'
});
Random strategy
The random strategy will insert newly scheduled promises at a random index in the pool. The distribution of executed promises within the pool mainly depends on the quality of the randomness provided by Math.random().
const pool = new Pool({
size: 5,
strategy: 'random'
});
Load balancer strategy
The load-balancer strategy computes the load on each executor in the pool by keeping a count of the promises queued on it. This comes in handy when your promises execute operations taking a non-deterministic amount of time (e.g. network requests), as it helps execute the maximum number of promises in the smallest possible amount of time.
const pool = new Pool({
size: 5,
strategy: 'load-balancer'
});
Custom strategies
It is possible to provide a custom implementation of a promise scheduler by passing an instance of your scheduler to the pool constructor.
const opts = { size: 5 };
opts.strategy = new CustomStrategy(opts);
const pool = new Pool(opts);
See Implementing a custom strategy for more details on how to implement a custom strategy compatible with the promise pool.
Scheduling promises
The core purpose of this module is, of course, to allow scheduling of promises within the pool. To do so, different methods exist that provide different interfaces suited to different use-cases. The next sections show minimal working code for each of them. For the sake of simplicity, the boilerplate code is omitted; for the full examples, have a look at the examples directory.
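In the snippets that follow, promise(i) refers to such a boilerplate helper. As a rough sketch of what it could look like (the actual helper used by the examples directory may differ), it could be a function that builds a promise-returning function resolving to its argument after a short delay:
// Hypothetical boilerplate helper: `promise(i)` returns a function which,
// when invoked by the pool, returns a promise resolving to `i` after 100 ms.
const promise = (i) => () => new Promise((resolve) => {
  setTimeout(() => resolve(i), 100);
});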
The .schedule() API
This API makes it possible to execute functions returning promise objects using a fluent interface, and in a fire-and-forget manner. Use this API if you'd like to handle the result of the execution of a promise yourself.
// Spreads 100 promise executions across the pool.
for (let i = 0; i < 100; ++i) {
pool.schedule(promise(i));
}
The .enqueue() API
This API works like .schedule() in that it will enqueue a promise execution in the pool, but unlike .schedule() it returns a promise which is resolved (or rejected) once the enqueued promise has been executed.
// Sequentially enqueuing promises using standard `.then()`.
pool.enqueue(promise(1)).then(() => pool.enqueue(promise(2))).then(console.log);
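Since .enqueue returns a promise, a rejection of the enqueued promise can be handled with a standard .catch. A small sketch, where failingTask is a hypothetical promise-returning function that rejects:
// Hypothetical promise-returning function that always rejects.
const failingTask = () => Promise.reject(new Error('boom'));

// The rejection propagates to the promise returned by `.enqueue()`.
pool.enqueue(failingTask).catch((err) => console.error(err.message));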
The .enqueueMany() API
This API works like .enqueue() but allows you to provide an array of functions at once, while being able to collect the results yielded by each of them.
// Enqueuing multiple promises and gathering their results using standard `.then()`.
// The following statement will output : [1, 2, 3]
pool.enqueueMany([ promise(1), promise(2), promise(3) ]).then(console.log);
Note that passing an array of functions to .enqueue has the same effect as calling .enqueueMany.
The .enqueueOnSameExecutor() API
Sometimes it is useful to enqueue an array of promises on the same executor, so that these promises are guaranteed to be executed sequentially (e.g. you would like to run several sequences in parallel, while the promises within each sequence run sequentially). To do so, you can use the .enqueueOnSameExecutor() API as follows.
// Running two sequences in parallel; within each sequence, promises execute sequentially.
Promise.all([
pool.enqueueOnSameExecutor([ promise(1), promise(2), promise(3) ]),
pool.enqueueOnSameExecutor([ promise(4), promise(5), promise(6) ])
]).then(console.log);
The .all() API
Once you have scheduled a set of promise executions, you may want to wait for the completion of all the promises scheduled in the pool. To do so, you can use the .all API, which has the same semantics as Promise.all.
for (let i = 0; i < 100; ++i) {
pool.schedule(promise(i));
}
// Wait for all scheduled promises to be executed,
// and print the results to the standard output.
pool.all().then(console.log);
Note that, by default, the .all method forwards to the end callback an array of the results yielded by all the promises in execution at the time .all was called, in the same order as they were enqueued in the pool.
The .resize() API
It is possible to dynamically resize the promise pool in order to tune its performance at runtime. For instance, if you notice that the scheduled actions become too heavy for your promise pool, enlarging the pool to increase its number of executors on demand can yield better behavior in the face of bursts or a high number of scheduled operations.
/**
* Example algorithm in which we resize the pool
* when the number of actions left to schedule is greater
* than twice the current size of the pool.
*/
while (actionsToSchedule.length > 0) {
pool.schedule(actionsToSchedule.pop());
if (actionsToSchedule.length > pool.size() * 2) {
pool.resize(pool.size() * 2);
}
}
Delaying promise executions
While scheduling or enqueuing promises, it is possible to delay their execution in time. To do so, you can provide a delay in milliseconds as a second argument to .schedule, .enqueue, .enqueueMany and .enqueueOnSameExecutor to specify how long to wait before executing the given promise.
Note that this delay does not specify the amount of time between each promise execution, since promises do not necessarily execute sequentially, but rather the delay before executing them. For instance, in a pool of 5 executors, 5 promises scheduled with a delay of 1 second will all be executed in parallel after a period of 1 second.
for (let i = 0; i < 5; ++i) {
pool.schedule(promise(i), 1000);
}
// Resolves after 1 second.
pool.all().then(console.log);
If you'd like to enforce a delay between each promise execution, you need to enqueue them sequentially with .enqueueOnSameExecutor, as in the example below.
// Will resolve after 3 seconds (1 second for each promise execution).
pool.enqueueOnSameExecutor([ promise(1), promise(2), promise(3) ], 1000).then(console.log);
Lifecycle events
The promise pool implements the event-emitter interface, allowing clients of this library to register handlers for the following lifecycle events:
- before.each is emitted before the execution of each scheduled promise.
- after.each is emitted after the execution of each scheduled promise.
- before.enqueue.each is emitted before each scheduled promise is enqueued in the pool.
- after.enqueue.each is emitted after each scheduled promise is enqueued in the pool.
- pool.resized is emitted after a resize operation on the promise pool.
/**
* Registering lifecycle events on the pool.
*/
pool.on('before.enqueue.each', (e) => {
console.log(`Enqueued promise on executor ${e.idx}`);
}).on('before.each', (e) => {
console.log(`About to execute promise with executor (${e.idx})`);
}).on('after.each', (e) => {
console.log(`Executed promise with executor (${e.idx}) and result ${JSON.stringify(e.result)}`);
}).on('pool.resized', (e) => {
console.log(`Resized pool to size ${e.size}`);
});
For a more complete example of lifecycle events, see the lifecycle-events example.
Patching the Promise object
For convenience, it is possible to patch the existing Promise function with the Pool object for further use within your application.
// Patch the global `Promise` object.
Pool.patch(Promise);
const pool = new Promise.Pool(5);
Note that the patch method will not modify the Promise object if an existing Pool object already exists. The patch method returns a reference to the patched Pool object, or an undefined value if the patching operation failed.
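Based on the behavior described above, a small sketch checking the value returned by patch before using Promise.Pool could look like this:
// `Pool.patch()` returns the patched `Pool` reference, or `undefined` if patching failed.
const PatchedPool = Pool.patch(Promise);

if (PatchedPool) {
  const pool = new Promise.Pool(5);
}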
Examples
Different examples demonstrating the usage of the promise pool in different contexts are available in the examples directory.
Two graphical examples providing a way to monitor the pool in the form of a live graph are available under pool-monitoring and pool-resize.
See Also
- Pool watch, a live chart renderer of the distribution of promises across a promise pool.
- The pool-monitoring example.