
req-locker

Express middleware for caching and locking requests

It includes two middlewares: cache and locker.

Russian ReadMe

Installation

npm i req-locker

cache

Standard caching of request responses in the server's RAM.

The simplest example:

const express = require('express');
const { cache } = require('req-locker');

const app = express();

// With overrideSend: true, the standard res.send() is overridden so every
// response is stored in the cache automatically.
app.use(cache({
    overrideSend: true
}));

Configuration:

The cache(params) function accepts a single options object with the following parameters:

  • ttl=60 - cache lifetime in seconds
  • checkperiod=5 - the interval, in seconds, at which the cache is checked for expired entries
  • cacheKey - a function that derives the cache key from the request; by default an MD5 hash of the request URL, including its full query and body, is used, but you can return any string key, for example one based on a single query parameter
  • statusCode=200 - the HTTP status code used when a cached value is returned
  • overrideSend=false - if true, the library overrides the standard res.send() method so that responses are saved to the cache automatically; if false, you must send the response with the new res.cachedSend() method for it to be cached (see the sketch after this list)
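
A slightly fuller sketch of the cache options, under the assumption that cacheKey receives the Express request object (as the default key derivation does) and with calculateTariffs standing in for your own expensive handler logic:

const express = require('express');
const { cache } = require('req-locker');

const app = express();

// Cache responses for 5 minutes, keyed only by the `city` query parameter.
// overrideSend stays at its default (false), so a response is cached only
// when it is sent via res.cachedSend().
app.use(cache({
    ttl: 300,
    checkperiod: 10,
    cacheKey: (req) => req.query.city,
    statusCode: 200
}));

app.get('/tariffs', (req, res) => {
    const result = calculateTariffs(req.query.city); // hypothetical expensive calculation
    res.cachedSend(result);
});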

locker

This middleware "holds" retry requests until the original request has returned a response.

What is it for?

Imagine a heavy endpoint on your server that calculates, for example, tariffs for a service. It takes many parameters into account and normally responds in about 2 seconds. At peak load the response time grows significantly, say to 8-10 seconds. When a request takes that long, users may perceive it as an error and call the endpoint again (depending on how the client side is implemented), or an external integration may retry your method after a long timeout. The already overloaded service then receives additional retry requests that add even more load, and the cache does not help here, because the "first" request has not yet produced any data to cache. The result is a snowball effect: the longer the server takes to respond, the more retries it receives and the more heavily it is loaded, up to the point of failure.

locker is a simple mechanism that detects retry requests by key (similar to the cache key) and holds their connections until the original request has produced a response. At that point the server answers the original request and all of the retries at once with the same response. The retry requests therefore **will not** put extra load on the server, which prevents the snowball effect.

If you encounter a similar problem during peak load times, then locker may be able to help you.

The simplest example:

const express = require('express');
const { locker } = require('req-locker');

const app = express();

app.use(locker());

Configuration:

The locker(params) function accepts a single options object with the following parameters:

  • reqKey - a function that derives the request key from the request, similar to cacheKey (see the sketch after this list)
  • statusCode=200 - the HTTP status code used when responding to held retry requests
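
A sketch with a custom key, assuming reqKey receives the Express request object in the same way as cacheKey, with calculateTariffs again a hypothetical expensive calculation:

const express = require('express');
const { locker } = require('req-locker');

const app = express();

// Hold retries keyed by the `city` query parameter: the first request to
// finish answers every held duplicate with the same response.
app.use(locker({
    reqKey: (req) => req.query.city,
    statusCode: 200
}));

app.get('/tariffs', (req, res) => {
    // Retries that arrive while this calculation runs are held instead of
    // re-executing it.
    res.send(calculateTariffs(req.query.city));
});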

ToDo:

  • Improve documentation and examples
  • Write unit tests
  • Add methods for collecting statistics