
mastercache

v1.0.1-beta.1.1

Published

Multi-tier cache module for Node.js. Redis, Upstash, Cloudflare KV, File, In-memory, and other drivers

Downloads

310

Readme


Features

  • 🗄️ Multi-tier caching
  • 🔄 Synchronization of local cache via Bus
  • 🚀 Many drivers (Redis, Upstash, In-memory, Postgres, SQLite and others)
  • 🛡️ Grace period and timeouts. Serve stale data when the store is dead or slow
  • 🔄 Early refresh. Refresh cached value before needing to serve it
  • 🗂️ Namespaces. Group your keys by categories.
  • 🛑 Cache stampede protection.
  • 🏷️ Named caches
  • 📖 Well documented + handy JSDoc annotations
  • 📊 Events. Useful for monitoring and metrics
  • 📝 Easy Prometheus integration and ready-to-use Grafana dashboard
  • 🧩 Easily extendable with your own driver

See the documentation at mastercache.dev

Why Mastercache?

There are already caching libraries for Node: keyv, cache-manager, or unstorage. However, I think these libraries are better seen as bridges that let different stores be used through a unified API, rather than as true caching solutions in themselves.

This isn't a knock on them; on the contrary, they have their use cases, and they serve them well. Some are even "marketed" as exactly that, and they remain very handy for a simple caching system.

Mastercache, on the other hand, is a full-featured caching solution. It keeps this notion of unified access to different drivers, but on top of that adds a ton of features that let you do robust caching.

With that in mind, I believe there is no serious alternative to Mastercache in the JavaScript ecosystem. That is regrettable, because every other major language has powerful caching solutions. This is why Mastercache was created.

Quick presentation

Mastercache is a caching solution aimed at combining performance and flexibility. If you are looking for a caching system that can transition from basic use to an advanced multi-level configuration, you are in the right place. Here's what you need to know:

One-level

The one-level mode is a standard caching method. Choose from a variety of drivers such as Redis, In-Memory, Filesystem, DynamoDB, and more, and you're ready to go.

In addition to this, you benefit from many features that allow you to efficiently manage your cache, such as cache stampede protection, grace periods, timeouts, namespaces, etc.
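To make this concrete, here is a minimal sketch of one-level usage. The getOrSet call and the ttl option appear in the snippets later in this README; the import path and the driver option are assumptions made for illustration, so check the documentation for the real setup.

import { MasterCache } from 'mastercache'

// NOTE: the import path and the `driver` option are hypothetical;
// see mastercache.dev for the actual configuration API.
const cachemanager = new MasterCache({
  driver: 'memory'
})

// Compute the value on a cache miss, then serve it from the cache for one hour.
const user = await cachemanager.getOrSet('user:32', () => getFromDb(), {
  ttl: '1h'
})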

Two-level

For those looking to go further, you can use the two-level caching system. Here's basically how it works:

  • L1, local cache: first-level cache. Data is stored in memory with an LRU algorithm for quick access
  • L2, distributed cache: if the data is not in the in-memory cache, it is looked up in the distributed cache (Redis, for example)
  • Synchronization via a bus: in a multi-instance context, you can synchronize the local in-memory caches of your instances via a bus like Redis or RabbitMQ. This keeps the cache consistent across instances

Here is a simplified diagram of the flow:

[Diagram: Mastercache flow]

All of this is managed transparently for you by Mastercache. The only thing you have to do is set up a bus in your infrastructure. But if you need a multi-level cache, you are probably already using Redis rather than your database as a distributed cache, so you can leverage it to synchronize your local caches as well.
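As a sketch of the two-level setup described above: every option name below (l1, l2, bus and their fields) is hypothetical, assumed purely for illustration; only new MasterCache(...) itself is shown elsewhere in this README, so refer to the documentation for the real API.

// All option names below are hypothetical illustrations of the
// L1 / L2 / bus layout, not the documented API.
const cachemanager = new MasterCache({
  l1: { driver: 'memory', maxItems: 10_000 },  // in-process LRU cache
  l2: { driver: 'redis', connection: { host: '127.0.0.1', port: 6379 } },
  bus: { driver: 'redis' }  // keeps the L1 caches of all instances in sync
})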

The major benefit of multi-tier caching is that it allows for responses between 2,000x and 5,000x faster. While Redis is fast, accessing RAM is much faster still.

In fact, it's quite a common pattern: to take one example, it's what Stack Overflow does.

To give some perspective, here's a simple benchmark that shows the difference between a simple distributed cache (using Redis) and a multi-tier cache (using Redis + an in-memory cache):

[Benchmark: Redis vs multi-tier caching]

Features

Below is a list of the main features of MasterCache. If you want to know more, you can read each associated documentation page.

Multi-layer caching

Multi-layer caching allows you to combine the speed of in-memory caching with the persistence of a distributed cache. Best of both worlds.

Lots of drivers

Many drivers are available to suit every situation: Redis, Upstash, Database (MySQL, SQLite, PostgreSQL), DynamoDB, Filesystem, In-memory (LRU cache), Vercel KV...

See the drivers documentation for the full list of available drivers. It is also very easy to extend the library and add your own driver, as sketched below.
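To give an idea of what writing a custom driver involves, here is a hypothetical driver contract. The interface below is an assumption made purely for illustration; the real contract is defined in the drivers documentation.

// Hypothetical driver shape, for illustration only.
interface CacheDriver {
  get(key: string): Promise<string | undefined>
  set(key: string, value: string, ttlMs?: number): Promise<void>
  delete(key: string): Promise<boolean>
  clear(): Promise<void>
}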

Resiliency

  • Grace period: keep your application running smoothly by temporarily serving expired cache entries when your database is down or a factory is failing (see the sketch after this list).

  • Cache stampede prevention: ensures that only one factory is executed at a time for a given key.

  • Retry queue: when an application fails to publish something to the bus, it is added to a queue and retried later.
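Here is the grace period in action, reusing the gracePeriod option that also appears in the Friendly TTLs example below:

const users = await cachemanager.getOrSet('users', () => getFromDb(), {
  ttl: '5m',
  // If getFromDb() fails after the TTL has expired, the stale value
  // is served for up to 6 hours instead of failing the request.
  gracePeriod: { enabled: true, duration: '6h' }
})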

Timeouts

If your factory is taking too long to execute, you can return slightly stale data while the factory keeps running in the background. The next time the entry is requested, it will already be computed and can be served immediately.
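A sketch of what this can look like. The timeout option name and the slowDbCall factory are assumptions for illustration (only ttl and gracePeriod appear elsewhere in this README); check the documentation for the actual option.

cachemanager.getOrSet('foo', () => slowDbCall(), {
  ttl: '5m',
  // Hypothetical option: if the factory runs longer than this, return the
  // stale value now and let the factory finish in the background.
  timeout: '200ms',
  gracePeriod: { enabled: true, duration: '6h' }
})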

Namespaces

Namespaces give you the ability to group cache keys together logically, so you can invalidate them all at once later:

const users = cachemanager.namespace('users')

users.set('32', { name: 'foo' })
users.set('33', { name: 'bar' })

// Invalidates every key in the 'users' namespace at once
users.clear()

Events

Events are emitted by Mastercache throughout its execution, allowing you to collect metrics and monitor your cache.

cachemanager.on('cache:hit', () => {})
cachemanager.on('cache:miss', () => {})
// ...

See the events documentation for more information.
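Since the features list mentions easy Prometheus integration, here is one way these events could be wired into metrics using prom-client. The event names come from the snippet above; the metric names and the use of prom-client itself are assumptions made for illustration.

import { Counter } from 'prom-client'

const cacheHits = new Counter({ name: 'cache_hits_total', help: 'Total cache hits' })
const cacheMisses = new Counter({ name: 'cache_misses_total', help: 'Total cache misses' })

cachemanager.on('cache:hit', () => cacheHits.inc())
cachemanager.on('cache:miss', () => cacheMisses.inc())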

Friendly TTLs

All TTLs can be passed as human-readable strings. We use lukeed/ms under the hood. (This is optional; you can pass a number in milliseconds if you prefer.)

cachemanager.getOrSet('foo', () => getFromDb(), {
  ttl: '2.5h',
  gracePeriod: { enabled: true, duration: '6h' }
})
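For reference, '2.5h' resolves to 9,000,000 milliseconds, so the call above is equivalent to passing the TTL as a number:

cachemanager.getOrSet('foo', () => getFromDb(), {
  ttl: 9_000_000,  // 2.5 hours in milliseconds
  gracePeriod: { enabled: true, duration: '6h' }
})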

Early refresh

When your cached item will expire soon, you can refresh it in advance, in the background. That way, the next time the entry is requested, it will already be computed and can be returned to the user super quickly.

cachemanager.getOrSet('foo', () => getFromDb(), {
  earlyExpiration: 0.8
})

In this case, when 20% or less of the TTL remains and the entry is requested, Mastercache:

  • returns the cached value to the user;
  • starts a background refresh by calling the factory;
  • so the next time the entry is requested, it is already computed and can be returned immediately.
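To put concrete numbers on it (both options come from the snippets above):

cachemanager.getOrSet('foo', () => getFromDb(), {
  ttl: '10m',
  // 0.8 means the refresh window opens once 80% of the TTL has elapsed,
  // so any hit after the 8-minute mark triggers a background refresh.
  earlyExpiration: 0.8
})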

Logging

You can pass a logger to Mastercache, and it will log everything that happens. This can be useful for debugging or monitoring.

import { pino } from 'pino'

const cachemanager = new MasterCache({
  logger: pino()
})

See the logging documentation for more information.

Sponsor

If you like this project, please consider supporting it by sponsoring it. It will help a lot with maintaining and improving it. Thanks a lot!