dldr

A tiny (367B) utility for batching and caching operations

This is free-to-use software, but if you like it, consider supporting me ❤️

Sponsor me · Buy me a coffee

⚙️ Install

npm add dldr

🚀 Usage

The default module batches calls to your provided loadFn within the current tick.

Under the hood, we schedule a flush with queueMicrotask, which then calls your loadFn with the unique keys that were requested during that tick.
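
Conceptually, the scheduling looks something like this. This is a simplified sketch for illustration only, not dldr's actual implementation (dldr also dedupes repeated keys within a batch):

type Pending<T> = { key: string; resolve: (value: T) => void; reject: (error: Error) => void };

let batch: Pending<any>[] = [];

function enqueue<T>(loadFn: (keys: string[]) => Promise<(T | Error)[]>, key: string): Promise<T> {
  if (batch.length === 0) {
    // first call this tick; flush once the current synchronous work completes
    queueMicrotask(async () => {
      const current = batch;
      batch = [];
      const results = await loadFn(current.map((p) => p.key));
      current.forEach((p, i) => {
        const result = results[i];
        if (result instanceof Error) p.reject(result);
        else p.resolve(result);
      });
    });
  }
  return new Promise<T>((resolve, reject) => {
    batch.push({ key, resolve, reject });
  });
}

In practice, you simply call load and dldr handles the scheduling: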

import { load } from 'dldr';

// ⬇️ define some arbitrary load method that accepts a single array of keys
const getPosts = (keys: string[]) => db.execute('SELECT id, name FROM posts WHERE id IN (?)', [keys]);

// ...for convenience, you could bind
const loadPost = load.bind(null, getPosts);

// ⬇️ demo some collection that is built up over time.
const posts = [
  load(getPosts, '123'),
  loadPost('123'), // functionally equivalent to the above
  load(getPosts, '456'),
];

// ...

posts.push(load(getPosts, '789'));

// ⬇️ batch the load calls, and wait for them to resolve
const loaded = await Promise.all(posts);

expect(getPosts).toHaveBeenCalledWith(['123', '456', '789']);
expect(loaded).toEqual([
  { id: '123', name: '123' },
  { id: '123', name: '123' },
  { id: '456', name: '456' },
  { id: '789', name: '789' },
]);

dldr also plays nicely with GraphQL, where sibling resolvers run within the same tick and are batched together:

import { load } from 'dldr';
import { graphql, buildSchema } from 'graphql';

const schema = buildSchema(`
    type Query {
        me(name: String!): String!
    }
`);

const operation = `{
    a: me(name: "John")
    b: me(name: "Jane")
}`;

const results = await graphql({
  schema,
  source: operation,
  contextValue: {
    getUser: load.bind(null, async (names) => {
      // assume you're calling out to a db or something
      const result = names.map((name) => name);

      // let's pretend this is a promise
      return Promise.resolve(result);
    }),
  },
  rootValue: {
    me: ({ name }, ctx) => {
      return ctx.getUser(name);
    },
  },
});

Caching

Once a key has been loaded, it will be cached for all future calls.

import { load } from 'dldr/cache';
import { getPosts } from './example';

// operates the same as the above, but will cache the results of the load method

const cache = new Map();

const loadPost = load.bind(null, getPosts, cache);
// note: the cache argument is optional and will be created if not provided

const loaded = await Promise.all([
  load(getPosts, cache, '123'),
  loadPost('123'), // will be cached, and functionally equivalent to the above
  loadPost('456'),
]);

expect(getPosts).toHaveBeenCalledTimes(1);
expect(getPosts).toHaveBeenCalledWith(['123', '456']);
expect(loaded).toEqual([
  { id: '123', name: '123' },
  { id: '123', name: '123' },
  { id: '456', name: '456' },
]);

// ⬇️ the cache will be used for subsequent calls
const post = await loadPost('123');

expect(getPosts).toHaveBeenCalledTimes(1); // still once
expect(post).toEqual({ id: '123', name: '123' });

API

Module: dldr

The main entry point to start batching your calls.

function load<T>(
  loadFn: (keys: string[]) => Promise<(T | Error)[]>,
  key: string
): Promise<T>;

Note: it might be worth calling .bind if you don't want to pass your loader everywhere.

const userLoader = load.bind(null, getUsers);

await userLoader('123');
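
Per the signature, loadFn resolves each position to either a value or an Error. Here is a minimal sketch of how per-key failures surface, assuming dataloader-style semantics where an Error in a slot rejects only that key's promise (getUsers here is a hypothetical loader):

const getUsers = async (keys: string[]) =>
  keys.map((key) => (key === 'missing' ? new Error(`no user ${key}`) : { id: key }));

const [ok, bad] = await Promise.allSettled([
  load(getUsers, '123'),
  load(getUsers, 'missing'),
]);

// ok.status  === 'fulfilled', with ok.value = { id: '123' }
// bad.status === 'rejected',  with bad.reason being the Error above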

Module: dldr/cache

A submodule that will cache the results of your loadFn between ticks.

function load<T>(
  loadFn: (keys: string[]) => Promise<(T | Error)[]>,
  cache: MapLike<string, T> | undefined,
  key: string,
): Promise<T>;

A default Map-based cache will be used if you don't provide one.
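
For example (a minimal sketch; getUsers is a placeholder loader):

import { load } from 'dldr/cache';

// no cache argument given, so dldr falls back to its default Map-based cache
const user = await load(getUsers, undefined, '123');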

Self-managed cache

We explicitly do not handle mutations, so if you want fresh entries or a primed cache, we recommend managing the cache yourself. All we require is a Map-like object.

Commonly an LRU cache is used; we recommend tmp-cache.

import LRU from 'tmp-cache';
import { load } from 'dldr/cache';

const loadUser = load.bind(null, getUsers, new LRU(100));
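
Because the cache is a plain Map-like object you own, eviction is just a delete. A minimal sketch (getUsers is a placeholder loader):

import { load } from 'dldr/cache';

const cache = new Map();
const loadUser = load.bind(null, getUsers, cache);

await loadUser('123'); // loads and caches

cache.delete('123'); // evict, forcing a fresh load next time
await loadUser('123'); // batched and loaded again

// you could also prime entries via cache.set(...) before any loads happen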

💨 Benchmark

Run via the /bench directory with Node v18.16.1:

✔ dldr             ~ 910,576 ops/sec ± 1.34%
✔ dldr/cache       ~ 636,467 ops/sec ± 4.47%
✔ dataloader       ~ 245,602 ops/sec ± 1.34%
✔ dataloader/cache ~ 153,254 ops/sec ± 0.64%

License

MIT © Marais Rossouw