
rate-limiter-flexible

v5.0.4

Node.js rate limiter by key and protection from DDoS and brute-force attacks in process Memory, Redis, MongoDB, Memcached, MySQL, PostgreSQL, Cluster or PM2.

Downloads: 2,939,852

Readme


node-rate-limiter-flexible

rate-limiter-flexible counts and limits the number of actions by key and protects from DDoS and brute force attacks at any scale.

It works with Redis, Prisma, DynamoDB, process Memory, Cluster or PM2, Memcached, MongoDB, MySQL, and PostgreSQL.

Memory limiter also works in the browser.

Atomic increments. All operations, in memory or in a distributed environment, use atomic increments to prevent race conditions.

Fast. An average request takes 0.7ms in Cluster mode and 2.5ms in a distributed application. See benchmarks.

Flexible. Combine limiters, block keys for some duration, delay actions, manage failover with insurance options, configure smart key blocking in memory, and more.

Ready for growth. It provides a unified API for all limiters. Whenever your application grows, it is ready. Prepare your limiters in minutes.

Friendly. It works with whichever Node package you prefer: redis or ioredis, sequelize/typeorm or knex, memcached, the native driver or mongoose.

In-memory blocks. Avoid extra requests to the store with inMemoryBlockOnConsumed.

Allow traffic bursts with BurstyRateLimiter.

Deno compatible. See this example.

It uses a fixed window, as it is much faster than a rolling window. See comparative benchmarks with other libraries here.
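
As a rough sketch of the BurstyRateLimiter mentioned above, a main limiter can be combined with a second limiter that grants extra points over a longer window; the key prefix and numbers here are purely illustrative:

const { RateLimiterMemory, BurstyRateLimiter } = require("rate-limiter-flexible");

// Main limiter: 2 points per second.
// Burst limiter: 5 extra points per 10 seconds (the "burst" keyPrefix keeps its counters separate).
const burstyLimiter = new BurstyRateLimiter(
  new RateLimiterMemory({ points: 2, duration: 1 }),
  new RateLimiterMemory({ keyPrefix: "burst", points: 5, duration: 10 })
);

burstyLimiter.consume(remoteAddress)
    .then((rateLimiterRes) => {
      // Allowed, possibly by spending burst points
    })
    .catch((rateLimiterRes) => {
      // Both the main and the burst limits are exhausted
    });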

Installation

npm i --save rate-limiter-flexible

yarn add rate-limiter-flexible

Import

// CommonJS
const { RateLimiterMemory } = require("rate-limiter-flexible");

// or

// ECMAScript 
import { RateLimiterMemory } from "rate-limiter-flexible";
// or
import RateLimiterMemory from "rate-limiter-flexible/lib/RateLimiterMemory.js";

Basic Example

Points can be consumed by IP address, user ID, authorisation token, API route or any other string.

const opts = {
  points: 6, // 6 points
  duration: 1, // Per second
};

const rateLimiter = new RateLimiterMemory(opts);

rateLimiter.consume(remoteAddress, 2) // consume 2 points
    .then((rateLimiterRes) => {
      // 2 points consumed
    })
    .catch((rateLimiterRes) => {
      // Not enough points to consume
    });
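
For example, the limiter above could back an HTTP middleware that charges one point per request, keyed by client IP. This is a minimal sketch assuming an Express-style app object; the middleware name and the 429 handling are illustrative and not part of the library:

const rateLimiterMiddleware = (req, res, next) => {
  rateLimiter.consume(req.ip) // consume 1 point per request, keyed by IP
    .then(() => {
      next();
    })
    .catch(() => {
      res.status(429).send("Too Many Requests");
    });
};

app.use(rateLimiterMiddleware);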

RateLimiterRes object

Both the promise's resolve and reject callbacks receive an instance of the RateLimiterRes class, unless there is a store error. Object attributes:

RateLimiterRes = {
    msBeforeNext: 250, // Number of milliseconds before next action can be done
    remainingPoints: 0, // Number of remaining points in current duration 
    consumedPoints: 5, // Number of consumed points in current duration 
    isFirstInDuration: false, // action is first in current duration 
}

You may want to set HTTP headers for the response:

const headers = {
  "Retry-After": rateLimiterRes.msBeforeNext / 1000,
  "X-RateLimit-Limit": opts.points,
  "X-RateLimit-Remaining": rateLimiterRes.remainingPoints,
  "X-RateLimit-Reset": new Date(Date.now() + rateLimiterRes.msBeforeNext)
}
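
With a store limiter, the reject path can receive either a RateLimiterRes (limit exceeded) or an Error (the store request failed and no insurance limiter is configured), so it is worth telling them apart before sending the headers above. A sketch, again assuming an Express-style response object:

rateLimiter.consume(remoteAddress)
    .then((rateLimiterRes) => {
      // Allowed: set the informational headers and continue
    })
    .catch((rejRes) => {
      if (rejRes instanceof Error) {
        // Store error (and no insurance limiter configured)
      } else {
        // Not enough points: rejRes is a RateLimiterRes
        res.set("Retry-After", String(Math.ceil(rejRes.msBeforeNext / 1000)));
        res.status(429).send("Too Many Requests");
      }
    });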

The advantages of the library, along with the full documentation, are listed on the Wiki.

Middlewares, plugins and other packages

Some copy/paste examples are available on the Wiki.

Migration from other packages

  • express-brute Bonus: race conditions fixed, prod deps removed
  • limiter Bonus: multi-server support, respects queue order, native promises

Docs and Examples

Changelog

See releases for detailed changelog.

Basic Options

  • points

    Default: 4

    Maximum number of points that can be consumed over duration.

  • duration

    Default: 1

    Number of seconds before consumed points are reset.

    Points are never reset if duration is set to 0.

  • storeClient

    Required for store limiters

    Must be a redis, ioredis, memcached, mongodb, pg, mysql2 or mysql client, or any other related pool or connection (see the sketch after this list).
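
A sketch of how these options fit together for a store limiter, assuming an ioredis client; the key prefix and numbers are illustrative:

const Redis = require("ioredis");
const { RateLimiterRedis } = require("rate-limiter-flexible");

// ioredis client; offline queue disabled so failures surface immediately
const redisClient = new Redis({ enableOfflineQueue: false });

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient, // required for store limiters
  keyPrefix: "login",       // namespace for this limiter's keys
  points: 10,               // 10 points
  duration: 1,              // per second
});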

Other options, strategies to smooth out traffic peaks, and store-specific settings are described on the Wiki.

API

Read the detailed description on the Wiki.

Benchmark

Average latency during a test of a pure Node.js endpoint, in a cluster of 4 workers, with everything set up on one server.

1,000 concurrent clients with a maximum of 2,000 requests per second for 30 seconds:

1. Memory     0.34 ms
2. Cluster    0.69 ms
3. Redis      2.45 ms
4. Memcached  3.89 ms
5. Mongo      4.75 ms

500 concurrent clients with a maximum of 1,000 requests per second for 30 seconds:

6. PostgreSQL 7.48 ms (with connection pool of max 100)
7. MySQL     14.59 ms (with connection pool of max 100)

Note: you can speed up store limiters with the inMemoryBlockOnConsumed option, as sketched below.
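
A sketch of that option, building on the Redis limiter shown earlier; inMemoryBlockDuration is the companion option controlling how long the in-memory block lasts, and the numbers are illustrative:

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient,
  points: 100,                  // 100 points per second per key
  duration: 1,
  inMemoryBlockOnConsumed: 101, // once 101 points are consumed, block the key in process memory
  inMemoryBlockDuration: 30,    // keep the in-memory block for 30 seconds, skipping the store entirely
});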

Contributions

Appreciated, feel free!

Make sure you've run npm run eslint before creating a PR; all errors have to be fixed.

You can try to run npm run eslint-fix to fix some issues.

Any new limiter with a storage backend must extend RateLimiterStoreAbstract. It has to implement 4 methods:

  • _getRateLimiterRes parses raw data from store to RateLimiterRes object.

  • _upsert may be atomic or non-atomic upsert (increment). It inserts or updates the value by key and returns raw data. If it doesn't make an atomic upsert (increment), the class should be suffixed with NonAtomic, e.g. RateLimiterRedisNonAtomic.

    It must support forceExpire mode to overwrite key expiration time.

  • _get returns raw data by key or null if there is no key.

  • _delete deletes all key-related data and returns true when deleted, false if the key is not found.

All other methods depend on the store. See RateLimiterRedis or RateLimiterPostgres for examples, or the structural sketch below.
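
The sketch below only mirrors the method list above: the require paths, parameter lists and raw-row field names are assumptions that should be checked against RateLimiterRedis or RateLimiterPostgres.

// Paths are assumptions; see the package's lib/ folder, as in the Import section above
const RateLimiterStoreAbstract = require("rate-limiter-flexible/lib/RateLimiterStoreAbstract");
const RateLimiterRes = require("rate-limiter-flexible/lib/RateLimiterRes");

class RateLimiterMyStore extends RateLimiterStoreAbstract {
  _getRateLimiterRes(rlKey, changedPoints, result) {
    // Parse the raw row returned by the store into a RateLimiterRes
    const res = new RateLimiterRes();
    res.consumedPoints = result.points;                             // placeholder field names
    res.remainingPoints = Math.max(this.points - result.points, 0);
    res.msBeforeNext = result.expiresAt - Date.now();
    res.isFirstInDuration = result.points === changedPoints;
    return res;
  }

  _upsert(rlKey, points, msDuration, forceExpire = false) {
    // Insert or (ideally atomically) increment the value by key and resolve the raw row;
    // when forceExpire is true, overwrite the key's expiration time
  }

  _get(rlKey) {
    // Resolve the raw row for rlKey, or null if there is no key
  }

  _delete(rlKey) {
    // Delete all key-related data; resolve true on deleted, false if the key is not found
  }
}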

Note: all changes should be covered by tests.