
node-limited v1.1.1

Automatic delays and smart query rewriting to deal with Beam's rate limits, in one line of code.

Downloads: 3

Readme

limited


At the time of writing, Beam has announced that in just one week they will enable some very strict rate limits for the majority of API endpoints. In essence, if a specific IP address hits any one of a group of endpoints (sharing the same rate-limit "bucket") too many times, further requests are blocked until that bucket's quota resets. This is to ensure that people don't needlessly hit the API too hard.

Unfortunately, this poses a problem for StreamJar: at the time of writing, we actively monitor almost 2,000 Beam channels. We need to pull quite a bit of data from every channel on each livestreaming platform we integrate with, so rate limits are a real concern for us.

You can find a full list of all Beam's rate limits on their Developer Docs.

Beam uses yaral and limitus to do their rate limiting.

Our current system would send an HTTP request, see the 429 Too Many Requests status code, complain to us, and then re-attempt whatever it was doing a few minutes later. I imagine that will get pretty bad with so many channels. Continually retrying blocks of code until we stop getting a 429 isn't sustainable, and is likely to introduce bugs and weird delays.

Given the very short notice of one week, we needed a quick, universal solution that we could drop into three different projects to help us once the rate limits are switched on.

So, Limited was born.

What it is

In short, it's designed to be the easiest way to handle Beam's rate limiting. Every time you make an HTTP request, it listens for Beam's rate-limiting headers. If your request is about to be hit by the rate limit, it will automatically stall; stalled requests queue up and are sent one after the other once the rate limit is lifted.

As an example, let's say we're querying many endpoints that all fall under the same rate-limiting "bucket", and that this imaginary bucket allows 100 queries per 60 seconds. If we make 200 normal HTTP requests to these endpoints through Node, the first 100 will be executed instantly, as normal. The other 100 will automatically wait until the rate limit resets; when it does, they will be executed one after the other, with one second between each.
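A rough sketch of that scenario (the Beam API base URL and numeric channel paths are illustrative assumptions, not taken from the library's documentation):

require('node-limited')();
const https = require('https');

// Fire 200 requests at endpoints that share one rate-limit bucket. Based on the
// behaviour described above, the first ~100 go out immediately and the rest are
// stalled until the bucket resets, then sent roughly one per second.
for (let i = 0; i < 200; i++) {
	https.get(`https://beam.pro/api/v1/channels/${i}`, res => {
		console.log(`Request ${i} completed with status ${res.statusCode}`);
		res.resume(); // discard the response body
	});
}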

Where possible, the library will also automatically make your queries more efficient. For example, if you queried /channels/StreamJar and /channels/BlipBot within the same second, only one request would be sent to /channels?limit=100&where=token.eq.StreamJar;BlipBot, invisibly to your application. This means that you will actually hit your rate limit more slowly, because you will be using fewer requests.
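From the application's point of view, that rewriting is invisible. A small sketch using the channel names from the example above (the Beam API base URL is an assumption):

require('node-limited')();
const https = require('https');

// Both requests are issued within the same second; per the description above,
// the library combines them into a single
// /channels?limit=100&where=token.eq.StreamJar;BlipBot request behind the scenes,
// so only one request counts against the bucket.
https.get('https://beam.pro/api/v1/channels/StreamJar', res => res.resume());
https.get('https://beam.pro/api/v1/channels/BlipBot', res => res.resume());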

Installation

npm install --save node-limited

Usage

It requires absolutely no changes to your code.

Okay, I lied: there is one line. Put this at the top.

require('node-limited')();

That's it! It works no matter how you're making your API requests, whether it's through the standard "http" module, through "request", or through Beam's "beam-client-node" module. It hooks into all of them.
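For instance, a minimal sketch using the "request" module (the Beam API URL is an assumption; apart from the first line, the request code is exactly what you would write anyway):

require('node-limited')();
const request = require('request');

request('https://beam.pro/api/v1/channels/StreamJar', (err, res, body) => {
	if (err) return console.error(err);
	console.log(`Got status ${res.statusCode}`);
});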

However, remember that the library keeps track of your requests in order to predict when it's about to be rate limited. If you're running multiple instances from the same outbound IP address, that can be a problem, so you can use Redis to store the data instead.

require('node-limited')({ redis: { host: '127.0.0.1', port: 6379 } });

The library uses ioredis for its connection, which means that any option you can provide to ioredis can also be provided in the redis object above (a sketch with extra options follows the snippet below). Alternatively, if your project already uses ioredis, you can just pass your own connections instead.

const Redis = require('ioredis');

const pub = new Redis({ host: '127.0.0.1', port: 6379 });
const sub = new Redis({ host: '127.0.0.1', port: 6379 });
require('node-limited')({ redis: { pub, sub } });
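And a sketch of passing extra connection options through the redis object (password and db are standard ioredis options; the values here are illustrative):

require('node-limited')({
	redis: {
		host: '127.0.0.1',
		port: 6379,
		password: 'my-redis-password', // any ioredis connection option can go here
		db: 2,
	},
});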

Advanced usage

If you would like to log what Limited is doing, you can hook into its events.

const limited = require('node-limited')();

// Log whenever a query is delayed to stay within a rate-limit bucket.
limited.on('request', result => {
	console.log(`Query to ${result.bucket} bucket delayed by ${result.wait}ms.`);
});

// Log whenever a query is rewritten into a combined request.
limited.on('rewritten', result => {
	console.log(`Rewritten query to ${result.path}.`);
});