LRU Memorise Fn

A simple memorise function that uses a fast LRU cache under the hood.

Basic features

  • Works in Node and the browser
  • Memorises both sync and async functions
  • Automatic cache key generation from function arguments
  • Cache TTL / value expiry
  • Fully typed: memorised functions inherit their source function's types
  • Bring your own cache or cache-key resolver, or we create one for you
  • The cache is exposed on the memorised function, if you want to access, modify, or manually clear it

Usage

import memorise from "lru-memorise";

const answerMyToughQuestion = (question: string) => {
  // ... heavy compute
  const response = `The answer to "${question}" is 42.`; // placeholder result
  return response;
};

const memorisedFn = memorise(answerMyToughQuestion);

memorisedFn(`What's the meaning of life?`); // Calls source fn, response cached #1.
memorisedFn(`Are you an AI?`); // Calls source fn, response cached #2.

memorisedFn(`What's the meaning of life?`); // Returns cached value #1
memorisedFn(`Are you an AI?`); // Returns cached value #2
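
Async functions work the same way (per the feature list above). A minimal sketch, where fetchUser and its URL are hypothetical stand-ins:

import memorise from "lru-memorise";

// Hypothetical async source function
const fetchUser = async (id: string) => {
  const res = await fetch(`https://api.example.com/users/${id}`);
  return res.json();
};

const memorisedFetchUser = memorise(fetchUser);

await memorisedFetchUser("42"); // Calls source fn, result cached
await memorisedFetchUser("42"); // Returns cached value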

Options

Note: all options are optional.

| Option           | Default       | Description                                                                                                        |
| ---------------- | ------------- | ------------------------------------------------------------------------------------------------------------------ |
| cache            | undefined     | Bring your own LRU cache, as long as it supports standard set and get operations                                    |
| cacheKeyResolver | defaultFn     | Function used to generate a unique cache key from the arguments. Takes an array of arguments and returns a string   |
| lruOptions       | { max: 1000 } | Default options for tiny-lru. The two options are max and ttl                                                       |

Bring your own cache

Under the hood we use tiny-lru, but you can bring your own cache, or create your cache outside and pass it in.

If you don't provide one, we create a cache for you. This cache is exposed via _cache on the memorised function, if you want access to it.
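
A sketch of what that could look like, assuming an options object is passed as the second argument (per the table above) and that a tiny-lru instance satisfies the cache option:

import memorise from "lru-memorise";
import { lru } from "tiny-lru";

// Create the cache yourself so you can share, inspect, or clear it elsewhere
const myCache = lru(500);

const square = (n: number) => n * n; // stand-in for something expensive

const memorisedSquare = memorise(square, { cache: myCache });

memorisedSquare(3); // computed and stored in myCache
myCache.clear();    // you keep full control of the cache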

Bring your own cache key resolver

We have a simple function to generate a cache key from the function arguments, using JSON.stringify. You can of course pass your own custom one if you have a more complex situation, or want the cache to behave differently. It should return a string.

type Resolver = (...args: any[]) => string;
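
For example, a resolver that keys only on the first argument (the cacheKeyResolver placement in an options object is assumed from the table above):

import memorise from "lru-memorise";

// Ignore all but the first argument when building the cache key
const firstArgResolver = (...args: any[]) => JSON.stringify(args[0]);

const greet = (name: string, timestamp: number) => `Hello ${name} (${timestamp})`;

const memorisedGreet = memorise(greet, { cacheKeyResolver: firstArgResolver });

memorisedGreet("Ada", 1); // computed, cached under the key for "Ada"
memorisedGreet("Ada", 2); // cache hit: same key, different timestamp ignored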

Tiny-LRU Options

By default, we limit the max cache size to 1000 keys, and we don't set a TTL on the key values.

type LRUOptions = {
  max: number // Max number of keys in the cache
  ttl: number // Expiry of items in the cache
}