
callback-batcher v1.0.2

Isomorphic callback rate-limiter and batcher

Callback Batcher

PRs Welcome · License: MIT

Browser Demo 👀

Built at @Tubitv

Introduction

This library provides a simple tool for in-memory rate-limiting of function invocation in both the browser and in node. Key features:

  • The developer can choose between rate-limiting algorithms with different "burstiness" characteristics
  • The next call after a series of throttled calls is passed a count of the previously throttled calls, which is useful when the developer needs to track every occurrence of an event without each occurrence creating load on a scarce resource
  • Callback functions passed to the batcher utility can be assigned an identifier string, allowing a single batcher instance to separately rate-limit many distinct kinds or categories of event

This tool is ideal for situations where, for example, a client application issues repeated log, error, or analytics events that must all be tracked, but where load on the logging or tracking service must be kept to a minimum.

Installation

npm i callback-batcher

This package has no runtime dependencies.

Basic Usage

The makeCallbackBatcher function, given a configuration, returns a batcher object:

const batcher = makeCallbackBatcher({
  maxTokens: 3,
  tokenRate: 1000,
});

This object exposes two functions. The first, schedule, is used to request invocation of a callback, subject to a rate limit:

const batcher = makeCallbackBatcher({
  maxTokens: 3,
  tokenRate: 1000,
});
batcher.schedule(() => console.log('invoked!'));
> 'invoked!'

If we schedule callback invocations more quickly than the rate limit allows, some calls will be throttled and invoked only after some time. When invoked, the callbacks will be passed a count telling us how many calls were throttled between the last successful scheduled invocation and the current one:

const batcher = makeCallbackBatcher({
  maxTokens: 3,
  tokenRate: 1000
});

for (let i = 0; i < 6; i += 1) {
  batcher.schedule((c) => console.log(`invoked! count: ${c}`));
}
> 'invoked! count: 1'
> 'invoked! count: 1'
> 'invoked! count: 1'
then after 1000ms...
> 'invoked! count: 3'

The second function on the batcher object, dispose, cleans up the timers used internally by the batcher, immediately making any trailing, batched invocations:

const batcher = makeCallbackBatcher({
  maxTokens: 3,
  tokenRate: 1000,
});

for (let i = 0; i < 6; i += 1) {
  batcher.schedule((c) => console.log(`invoked! count: ${c}`));
}
batcher.dispose();
> 'invoked! count: 1'
> 'invoked! count: 1'
> 'invoked! count: 1'
> 'invoked! count: 3'

Keeping State Per-Callback

The schedule function on the batcher also accepts an optional second string argument. Called a "callback identifier hash", this value uniquely identifies a callback to the batcher, allowing the batcher to track state separately for different callbacks.

const batcher = makeCallbackBatcher({
  maxTokens: 3,
  tokenRate: 1000,
});

// max out `callback-a`
for (let i = 0; i < 3; i += 1) {
  batcher.schedule((count) => console.log(`A ${count}`), 'callback-a');
}

// callback b will still be invoked as soon as it is scheduled
batcher.schedule((count) => console.log(`B ${count}`), 'callback-b');
> 'A 1'
> 'A 1'
> 'A 1'
> 'B 1'

Strategies

The makeCallbackBatcher factory function can optionally be passed a strategy argument that specifies which algorithm to use for rate limiting. The various algorithms have their own configuration parameters.

The Leaky Bucket strategy assigns each callback a token bucket to which one token is added every tokenRate milliseconds, with an initial and maximum value of maxTokens. Under callback invocation pressure, this strategy allows an initial burst of size maxTokens, followed by a trickle of batched calls at a rate determined by tokenRate.

const batcher = makeCallbackBatcher({
  strategy: 'LEAKY_BUCKET',
  maxTokens: 3,
  tokenRate: 1000,
});
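The token-bucket idea behind this strategy can be sketched in a few lines. The sketch below is an illustrative, self-contained model of the algorithm, not the library's implementation: makeTokenBucket, take, and the explicit now timestamps are hypothetical names (the real library drives refills with timers internally).

```javascript
// Sketch of a leaky/token bucket: tokens refill at a fixed rate, each
// allowed call spends one, and throttled calls are counted until a token
// becomes available again. All names here are hypothetical.
function makeTokenBucket({ maxTokens, tokenRate }) {
  let tokens = maxTokens; // bucket starts full
  let throttled = 0;      // calls suppressed since the last allowed one
  let lastRefill = 0;     // logical clock, in ms

  return {
    // Returns the number of calls represented (>= 1) if allowed, 0 if throttled.
    take(now) {
      // Refill: one token per elapsed tokenRate ms, capped at maxTokens.
      const refill = Math.floor((now - lastRefill) / tokenRate);
      if (refill > 0) {
        tokens = Math.min(maxTokens, tokens + refill);
        lastRefill += refill * tokenRate;
      }
      if (tokens > 0) {
        tokens -= 1;
        const count = throttled + 1; // this call plus the calls it batches
        throttled = 0;
        return count;
      }
      throttled += 1;
      return 0;
    },
  };
}
```

With maxTokens: 3 and tokenRate: 1000, six calls at t=0 yield three allowed calls (count 1 each) and three throttled ones; a call at t=1000 then passes with count 3, mirroring the batched output shown earlier.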

The Windowed Rate Limiter strategy keeps track of when invocations were previously requested for each callback identifier hash over a backwards-looking window of windowSize milliseconds. If a callback is invoked more than callsPerWindow times during this window, it is throttled. Batched invocations take place when calls slow down, or every windowSize milliseconds (to ensure that trailing calls are captured when no further invocations arrive).

const batcher = makeCallbackBatcher({
  strategy: 'WINDOWED_RATE_LIMITER',
  windowSize: 1000,
  callsPerWindow: 2,
});
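The windowed algorithm can likewise be sketched with a list of recent timestamps. As with the earlier sketch, this is an illustrative model under assumed names (makeWindowedLimiter, request), not the library's actual code, and it omits the timer-driven trailing batch the real strategy performs.

```javascript
// Sketch of a sliding-window rate limiter: remember when recent calls were
// allowed; throttle once callsPerWindow allowed calls fall inside the
// backwards-looking window. All names here are hypothetical.
function makeWindowedLimiter({ windowSize, callsPerWindow }) {
  let calls = [];    // timestamps of allowed invocations
  let throttled = 0; // calls suppressed since the last allowed one

  return {
    // Returns the number of calls represented (>= 1) if allowed, 0 if throttled.
    request(now) {
      // Drop timestamps that have aged out of the window.
      calls = calls.filter((t) => now - t < windowSize);
      if (calls.length < callsPerWindow) {
        calls.push(now);
        const count = throttled + 1; // this call plus the calls it batches
        throttled = 0;
        return count;
      }
      throttled += 1;
      return 0;
    },
  };
}
```

With windowSize: 1000 and callsPerWindow: 2, two calls at t=0 pass, further calls inside the window are throttled, and the first call after the window expires passes with a count covering the throttled ones.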

When no strategy argument is passed, makeCallbackBatcher defaults to the Leaky Bucket strategy.

This in-browser demo visualizes the differing timing characteristics of the two strategies.

Caveats

The rate-limiting strategies employed by this library make use of setInterval internally. Many browser environments choose to throttle timers on inactive or hidden pages. This means that the rate of callback invocation may drop substantially below the configured rate.

However, if a Leaky Bucket batcher configured to refill buckets every second is throttled by the browser so that tokens are added only every ten seconds, the count value passed to the callback will still be correct: the rate of callback invocation may slow, but every invocation remains accounted for.

In a browser context, it may make sense to invoke the dispose cleanup function on the batcher in a beforeunload handler. In addition to cleaning up the intervals used internally, dispose makes any trailing batched calls, ensuring that throttled callbacks are not "missed".

Development

To make changes to this library, clone it, run yarn install, and then yarn run start. This will launch the in-browser demo, which is useful for verifying that changes have their intended effects.

Unit tests (written using vitest) can be run via yarn test.