

promise-shortly

TypeScript utility library to create promises that resolve according to a rate limit.

Install:

$ npm install --save promise-shortly

Example:

import { Shortly } from "promise-shortly";

// Token bucket: holds up to 10 tokens, refills 1 token every 1000 ms,
// and starts with a single token available. Queue at most 2 pending requests.
const wait = new Shortly(
  {
    capacity: 10,
    fillQuantity: 1,
    fillTime: 1000,
    initialCapacity: 1,
  },
  {
    limit: 2,
  }
).wait;

const start = Date.now();
const numSeconds = () => Math.round((Date.now() - start) / 1000);

// Consumes the single initial token, so it resolves immediately.
wait().then(() => {
  console.log("default, completed after " + numSeconds() + "s");
});

// Lowest priority of the queued requests: rejected once the limit of 2 is exceeded.
wait({ priority: 0 }).catch(() => {
  console.log("low priority, rejected (over limit)");
});

// Lower priority than the request below and needs 3 tokens: resolves last.
wait({ priority: 5, tokens: 3 }).then(() => {
  console.log("middle priority, completed after " + numSeconds() + "s");
});

// Highest priority and only 1 token: resolves as soon as the next token fills.
wait({ priority: 10, tokens: 1 }).then(() => {
  console.log("high priority, completed after " + numSeconds() + "s");
});

Output:

default, completed after 0s
low priority, rejected (over limit)
high priority, completed after 1s
middle priority, completed after 4s

API

Constructor

new Shortly(tokenBucketOptions[, shortlyOptions])

tokenBucketOptions

Options for the token bucket are passed directly through to simple-token-bucket#options, so that package's documentation is authoritative; the current options are listed here for convenience:

  • capacity: the capacity of the token bucket, aka burstiness
  • fillQuantity: how many tokens to add when filling
  • fillTime: how much time it takes to add fillQuantity tokens
  • initialCapacity: the bucket initializes to max capacity by default, but you can optionally change it here

fillQuantity and fillTime together define a rate, which is used to calculate both how many tokens to add at any given moment and how much time remains before a request can be fulfilled. I chose this approach since most of the time it's desirable to specify a rate limit as "Xs per Y".
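
For example, a limit of roughly five requests per second, with bursts of at most five, could be configured like this. This is a minimal sketch using only the options listed above; fillTime appears to be in milliseconds, judging by the example earlier.

import { Shortly } from "promise-shortly";

// "5 per second": fillQuantity / fillTime is the refill rate,
// i.e. 5 tokens are added every 1000 ms.
const limiter = new Shortly({
  capacity: 5,     // allow bursts of up to 5 immediate requests
  fillQuantity: 5, // tokens added per fill interval
  fillTime: 1000,  // fill interval (milliseconds, per the example above)
});

// Each wait() consumes one token by default.
limiter.wait().then(() => {
  // rate-limited work goes here
});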

shortlyOptions

  • limit: the maximum number of items to enqueue. If the limit is exceeded, low priority items will be rejected with a BucketOverflowError.
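
Handling that rejection might look like the following. This is a small sketch; whether BucketOverflowError is re-exported by promise-shortly isn't stated here, so the catch handler only inspects the error generically.

import { Shortly } from "promise-shortly";

const wait = new Shortly(
  { capacity: 1, fillQuantity: 1, fillTime: 1000, initialCapacity: 0 },
  { limit: 1 } // queue at most one pending request
).wait;

wait({ priority: 5 }).then(() => console.log("ran after ~1s"));
wait({ priority: 0 }).catch((err) => {
  // The lowest-priority request is the one rejected once the limit is exceeded.
  console.log("rejected:", err.name); // e.g. "BucketOverflowError"
});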

#wait

shortly.wait({priority: number, tokens: number})

Both arguments are optional. Requests are sorted by priority first and tokens second: high priority values trump low priority values, while low token counts trump high token counts.
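
For instance, with two requests queued at the same priority, the one asking for fewer tokens resolves first. A small sketch, under the same assumptions as the example above:

import { Shortly } from "promise-shortly";

const wait = new Shortly({
  capacity: 5,
  fillQuantity: 1,
  fillTime: 1000,
  initialCapacity: 0, // start empty so both requests have to queue
}).wait;

// Same priority: the 1-token request sorts ahead of the 3-token request.
wait({ priority: 1, tokens: 3 }).then(() => console.log("heavy, ~4s"));
wait({ priority: 1, tokens: 1 }).then(() => console.log("light, ~1s"));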

What

promise-shortly is a promise-based rate limiter with prioritization and a simple API. You set it up, then any time you want to wait on the rate limit, just call wait(). It allows for prioritization so that certain promises can jump the queue, and you may specify the "weight" of a request in tokens, which has two effects:

  1. Within the same priority class, requests with fewer tokens will be resolved first
  2. Requests specifying other than the default 1 token will cause that number of tokens to be removed from the backing token bucket implementation; in effect, a request of 3 tokens takes three times as long to recover from as a request of 1 token.

How

promise-shortly ties simple-token-bucket and @datastructures-js/priority-queue together to provide a convenient API. The token bucket determines whether a request can be satisfied immediately; if it cannot, a timeout is used to resolve it at the first opportunity. New requests may alter that schedule, of course.
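
A rough sketch of that flow is below. This is not the actual implementation: a plain sorted array stands in for the priority queue, and TokenBucket is a hypothetical interface rather than simple-token-bucket's real API.

// Conceptual sketch only; names like tryTake and timeUntilAvailable are invented.
interface TokenBucket {
  tryTake(tokens: number): boolean;           // take tokens now if available
  timeUntilAvailable(tokens: number): number; // ms until `tokens` could be taken
}

interface Pending {
  priority: number;
  tokens: number;
  resolve: () => void;
}

function makeWait(bucket: TokenBucket) {
  const queue: Pending[] = [];
  let timer: ReturnType<typeof setTimeout> | undefined;

  function drain() {
    // Highest priority first; fewer tokens first within the same priority.
    queue.sort((a, b) => b.priority - a.priority || a.tokens - b.tokens);
    while (queue.length && bucket.tryTake(queue[0].tokens)) {
      queue.shift()!.resolve();
    }
    if (timer) clearTimeout(timer);
    if (queue.length) {
      // Wake up at the first moment the head request could be satisfied.
      timer = setTimeout(drain, bucket.timeUntilAvailable(queue[0].tokens));
    }
  }

  return ({ priority = 0, tokens = 1 } = {}) =>
    new Promise<void>((resolve) => {
      queue.push({ priority, tokens, resolve });
      drain(); // a new request may reorder the queue and reschedule the timer
    });
}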

Why

promise-shortly provides a simple API that doesn't rely on coupling its implementation with yours. Anywhere you can resolve a promise, you can delay based on a rate limit.
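
For example, gating an existing async call only takes one extra line. The fetchUser function and URL below are made up for illustration; any promise-returning code works the same way.

import { Shortly } from "promise-shortly";

// Roughly one request per second, with small bursts allowed.
const wait = new Shortly({ capacity: 3, fillQuantity: 1, fillTime: 1000 }).wait;

// Hypothetical API call wrapped with the rate limiter.
async function fetchUser(id: string) {
  await wait(); // pauses here until the rate limit allows another request
  const res = await fetch(`https://api.example.com/users/${id}`);
  return res.json();
}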