
@ekwoka/weak-lru-cache

v0.1.1


A simple Weak Reference Least Recently Used Cache

This cache allows items to be stored and maintained while in use, while also allowing unused items to be garbage collected once they have been sufficiently pushed out by more recently used items.

Installation

pnpm add @ekwoka/weak-lru-cache

Usage

import WeakLRUCache from '@ekwoka/weak-lru-cache';
// or
import { WeakLRUCache } from '@ekwoka/weak-lru-cache';

const cache = WeakLRUCache<CrazyObjectType>();

cache.set('key', crazyObject);
cache.get('key'); // => crazyObject

What is a Weak LRU Cache?

A Weak LRU Cache is a cache based on the 'Least Recently Used' caching strategy: cached items are placed in a queue, and once they progress out of the queue (such as when the cache holds more items than it allows), they are discarded. However, if an item is accessed again while still in the queue, it is moved back to the front, earning some more time to live.

Where the Weak LRU differs is that it uses Weak References to hold items after they have exited the queue. Weak References allow the object to be accessed if needed, while still letting the normal garbage collection system clean it up once it becomes truly unreferenced. This means that if your application is still using the item, even a very old one, it will still exist in the cache to be reaccessed by other parts of the application, but once the rest of the application is done with it, it will likely be garbage collected.

API

The goal is to have the API be nearly identical to that of the native Map object, except with the different constructor signature, as well as some unique methods that fit with the goal of the cache.

Constructor

WeakLRUCache<T>(options?: WeakLRUCacheOptions)

This function creates the cache object. It accepts an optional argument of cache options, as well as a generic type that types the items stored within the cache. The options are as follows:

WeakLRUCacheOptions

  • size?: number - The maximum size of the Cache. Defaults to 1000.
  • maxAge?: number - The maximum age (in seconds) of an item before it is evicted and replaced with a WeakRef.
  • getSize?: (item: T) => number - A function for evaluating the size of an item. By default, every item counts as 1 (making the size a length of sorts). Supplying a function that estimates an item's actual memory footprint turns the cache into a memory-limited one. The function is run on each item as it is added to the cache and is passed the item in question.

Properties

.size: number

This property returns the number of items in the cache. This is not the same as the size limit, as it will include items that are still being used, but have not yet been garbage collected.

This property is evaluated lazily and can return a number higher than the actual number of cached items, since cache items can be garbage collected without the cache being aware of it. Most methods verify that a value has not been garbage collected when retrieving it, but this property does not.

cache.size; // => 1

Methods

.set(key: string, value: T): WeakLRUCache<T>

Rather straightforward, this method sets the value of the key in the cache. If the key already exists, it will be overwritten and moved to the front of the queue. If this causes the cache to exceed its size limit, the oldest item will be replaced with a WeakRef to its value and will await cleanup.

cache.set('key', { foo: 'bar' });

.get(key: string): T | undefined

This method returns the value of the key in the cache, if it exists. If the value is a WeakRef, it will be dereferenced and returned. If the value is not found, undefined will be returned. If the key is found, it will be moved to the front of the queue. If the value was awaiting garbage collection (having been pushed outside the size of the cache queue), it will be strongly referenced again and moved back to the front of the queue.

cache.get('key'); // => { foo: 'bar' }

.peek(key: string): T | undefined

This is a unique method for this cache that does not exist on the Map. This is very similar to the .get method, except that it will not move the key to the front of the queue or strongly reference any accessed data. This likely has little benefit in most cases, but does allow you to retrieve a value from the cache without affecting the cache's internal state.

cache.peek('key'); // => { foo: 'bar' }

.has(key: string): boolean

This method returns a boolean indicating whether the key exists in the cache. It will not move the key to the front of the queue or strongly reference any accessed data. Note that, because this is evaluated lazily, it is possible for this method to return true even if the value has already been garbage collected.

cache.has('key'); // => true

.delete(key: string): boolean

This method deletes the key from the cache, regardless of its position in the queue or reference status, and returns a boolean indicating whether the key was found and deleted.

cache.delete('key'); // => true

.clear(): void

This method clears the cache, removing all items from the queue and expiring all references.

cache.clear();

.keys(): IterableIterator<string>

This method returns an iterator of all the keys in the cache. This will not move the keys to the front of the queue, or otherwise prevent data from being garbage collected. This will clean the internal list of items as you iterate, ensuring only valid keys are returned.

for (const key of cache.keys()) console.log(key);

.values(): IterableIterator<T>

This method returns an iterator of all the values in the cache. This will not move the keys to the front of the queue, or otherwise prevent data from being garbage collected if you don't hold onto the data itself. This will clean the internal list of items as you iterate, ensuring only valid values are returned.

console.log([...cache.values()]);

.entries(): IterableIterator<[string, T]>

This method returns an iterator of all the entries in the cache. This will not move the keys to the front of the queue, or otherwise prevent data from being garbage collected if you don't hold onto the data itself. This will clean the internal list of items as you iterate, ensuring only valid entries are returned.

for (const [key, value] of cache.entries()) console.log(key, value);

[Symbol.iterator](): IterableIterator<[string, T]>

You can also iterate over the cache directly, or pass it to anything that accepts an iterable; this is the same as calling .entries().

new Map(cache);

.forEach(callback: (value: T, key: string, cache: WeakLRUCache<T>) => void, thisArg?: any): void

This method allows you to iterate over the cache, and perform an action on each item. This will not move the keys to the front of the queue, or otherwise prevent data from being garbage collected if you don't hold onto the data itself. This will clean the internal list of items as you iterate, ensuring only valid entries are passed to the callback.

cache.forEach((value, key) => console.log(key, value));