

@beesley/bags-of-cache


This package contains in-memory cache tools. A generic in-memory cache class is available for use by other tools.

Cached Client

This is the base class used by the other classes in this package. It provides the ttl cache implementation and some basic utilities. In general, this class can be used whenever you need to make repeated calls to the same expensive function, by leveraging an in-memory cache with a ttl.

Caveats

To ensure items in the in-memory cache are immutable, they are stored in a serialised format: when an item is saved it is serialised, and each time it is read it is deserialised. The overhead for this is small (v8's serialisation methods are used), but bear it in mind when weighing up whether or not cacheing will improve performance.
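
To illustrate the round trip, here is a minimal sketch using node's built-in v8 module (the serialisation approach referred to above); the internals of this package may differ in detail:

import { serialize, deserialize } from 'node:v8';

// illustrative only: a serialise-on-write / deserialise-on-read store keeps
// cached items immutable, because readers only ever get fresh copies
const stored = new Map();

const original = { count: 1 };
stored.set('foo', serialize(original)); // write: snapshot the value

original.count = 99; // a later mutation of the source object...

const copy = deserialize(stored.get('foo'));
console.log(copy.count); // ...does not affect the cached copy: logs 1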

Usage

import { CacheingClient } from '@beesley/bags-of-cache';

const client = new CacheingClient({
  ttl: 60e3, // cache responses for 60 seconds
});

// cache something
const fooValue = 'foo';
client.set('foo', fooValue);

// read from the cache
client.get('foo');

// clear the cache 
client.stop();

// memoise an arbitrary async function, leveraging the in memory cache
const fn = async (arg, other) => `arg was ${arg}, other was ${other}`;
const memoised = client.memoise(fn);
const first = await memoised('one');
const second = await memoised('one'); // this will be returned from the cache

Client API

CacheingClient

src/cacheing-client.ts:16-108

Base client used to create clients with in memory caches

get

src/cacheing-client.ts:46-49

Gets an item from the cache

Parameters
  • key string The cache key

Returns any {*}

memoise

src/cacheing-client.ts:59-72

Memoises an arbitrary function using the cache

Parameters
  • fn T Any async function

Returns any {T} Memoised version of the function

set

src/cacheing-client.ts:82-86

Sets an item in the cache

Parameters
  • key string The cache key
  • value any The thing to be cached
  • customTtl number Override the default ttl

Returns void
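
For example, an individual item can be cached with its own ttl via the customTtl parameter (the key and value here are made up for illustration; client is the CacheingClient from the Usage section above):

// cache this particular item for 5 seconds instead of the client default
client.set('short-lived', { hello: 'world' }, 5e3);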

stop

src/cacheing-client.ts:93-95

Empties the cache

Returns void

createCacheKey

src/cacheing-client.ts:105-107

Util to serialise arbitrary arguments into a string for use as a cache key

Parameters
  • args ...Array<any> Arguments to pass to serialise

Returns any {string}
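
As a sketch (assuming createCacheKey is called on the client instance, as the source listing above suggests, and with made-up argument values), it can derive a single key from a mix of values:

// serialise a mix of values into one cache key string
const key = client.createCacheKey('config', 'de', { language: 'default' });
client.set(key, expensiveResult); // expensiveResult is a hypothetical value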

Cached Dynamo Client

Where repeated calls are made to dynamo for the same content, this cacheing client can be used to reduce the number of requests sent by cacheing the responses. For some access patterns the same query is sent again and again, even though its response changes very infrequently. This hammers dynamo with identical requests and adds unnecessary latency to every call. Under those conditions this module can significantly improve performance by cacheing dynamo responses for a provided ttl.

When sending a query or get command, the client first checks its in-memory cache for a result before calling dynamo. If the response is in the cache it is served from there and dynamo is not called; if it is not, dynamo is called and the response is stored in the cache for the provided ttl.

It is also possible to individually limit the concurrency of query and get commands sent to dynamo. This can be used to throttle the rate at which dynamo requests are made in general. It is particularly useful, though, when you know your application will make the same request repeatedly: with queryConcurrency set to 1, the first of those queries goes to dynamo and any further requests for the same query are paused until that first response comes back. Each of those paused queries is then resolved with the cached value from the initial request (basically dogpile protection).
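
A minimal sketch of that scenario (table name and query values are made up; with queryConcurrency set to 1, only the first identical query should actually reach dynamo):

import { DynamoCachedClient } from '@beesley/bags-of-cache/dynamo';

const client = new DynamoCachedClient({
  ttl: 60e3,
  dynamoTable: 'my-config-dev',
  awsRegion: 'eu-central-1',
  queryConcurrency: 1,
});

const input = {
  KeyConditionExpression: '#type = :type',
  ExpressionAttributeNames: { '#type': 'type' },
  ExpressionAttributeValues: { ':type': 'track' },
};

// fired concurrently: the first call hits dynamo, the other two are paused
// and then resolved from the cache once that first response arrives
const [a, b, c] = await Promise.all([
  client.query(input),
  client.query(input),
  client.query(input),
]);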

Caveats

For situations where the same request for config is repeatedly made, this module should dramatically improve performance. However, to ensure items in the in-memory cache are immutable, they are stored in a serialised format: when an item is saved it is serialised, and each time it is read it is deserialised. The overhead for this is small (v8's serialisation methods are used), but if your access pattern is such that the same queries are unlikely to be repeated, the only impact of using this library would be increased memory use and slower responses (due to serialising each item that gets cached). So consider your access pattern and decide for yourself whether this is the right solution.

Usage

import { DynamoCachedClient } from '@beesley/bags-of-cache/dynamo';

const client = new DynamoCachedClient({
  ttl: 60e3, // cache responses for 60 seconds
  dynamoTable: 'my-config-dev',
  awsRegion: 'eu-central-1',
  queryConcurrency: 1, // only allow 1 concurrent dynamo query command - optional
  getConcurrency: 1, // only allow 1 concurrent dynamo getItem command - optional
});

// send an arbitrary query, leveraging the in memory cache
const configItems = await client.query({
  TableName: 'big-prod-table',
  KeyConditionExpression: '#type = :type',
  FilterExpression: '#country = :country AND #language = :language',
  ExpressionAttributeNames: {
    '#type': 'type',
    '#country': 'country',
    '#language': 'language',
  },
  ExpressionAttributeValues: {
    ':type': 'track',
    ':country': 'de',
    ':language': 'default',
  },
});

// get an arbitrary item, leveraging the in memory cache
const item = await client.getItem({
  country: 'de',
  pk: 'foo',
});

Dynamo API

emptyResponseCacheTime

src/dynamo-client.ts:22-22

When we get no items back from a query we will not retry the query within this time (ms)

Type: number
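
Assuming emptyResponseCacheTime is accepted as a constructor option alongside ttl (check the constructor in src/dynamo-client.ts to confirm), setting it would look something like this:

import { DynamoCachedClient } from '@beesley/bags-of-cache/dynamo';

const client = new DynamoCachedClient({
  ttl: 60e3,
  dynamoTable: 'my-config-dev',
  awsRegion: 'eu-central-1',
  // assumption: empty query results are not re-queried for 30 seconds
  emptyResponseCacheTime: 30e3,
});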

queryConcurrency

src/dynamo-client.ts:29-29

Limit concurrently sent queries to this value

Type: number

getConcurrency

src/dynamo-client.ts:36-36

Limit concurrent getItem calls to this value

Type: number

DynamoCachedClient

src/dynamo-client.ts:46-172

Extends CacheingClient

Client for dynamo tables. Makes dynamo requests and caches the results.

getItem

src/dynamo-client.ts:81-108

Gets a single item from the table by key and caches the result

Parameters
  • key Record<string, any> The dynamo key object

Returns any {(Promise<T | undefined>)}

query

src/dynamo-client.ts:118-132

Sends an arbitrary dynamo query, cacheing the results

Parameters
  • input Partial<QueryCommandInput> The dynamo query command input

Returns any {Promise<T[]>}