
@beoe/sqlitecache v0.0.1

persistent LRU TTL cache based on SQLite

Downloads: 105

@beoe/sqlitecache

Motivating example

Let's say you have a static site generator that generates Open Graph images. Generating them is a costly operation and the images do not change often, so you want to cache them between runs.

The simplest solution is to cache them on the file system, for example with ImageCache + shorthash + deterministicString (from deterministic-object-hash). But this approach lacks the following features:

  • auto-cleanup: if you did some experimentation and generated images that will never be used, they will never be cleaned up unless you delete the whole cache. One way to solve this is LRU (Least Recently Used) eviction
  • expiration: what if, instead of generating images locally, I want to download them from a remote destination and re-download a fresh version from time to time? One way to solve this is a TTL (Time To Live), i.e. an expiration date for items
  • efficiency: the file system is not the most efficient backend for a cache
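For concreteness, a file-system cache along those lines might look roughly like this. This is a sketch, not the actual ImageCache; node:crypto's SHA-256 stands in for shorthash + deterministicString:

```javascript
import { createHash } from "node:crypto";
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

// Sketch of a file-system cache: each key maps to one file on disk.
// Note what is missing: no auto-cleanup and no expiration.
class FileCache {
  constructor(dir) {
    this.dir = dir;
    mkdirSync(dir, { recursive: true });
  }

  // Stand-in for shorthash(deterministicString(key)).
  pathFor(key) {
    const digest = createHash("sha256")
      .update(JSON.stringify(key))
      .digest("hex");
    return join(this.dir, digest.slice(0, 16));
  }

  get(key) {
    const path = this.pathFor(key);
    return existsSync(path) ? readFileSync(path) : undefined;
  }

  set(key, value) {
    writeFileSync(this.pathFor(key), value);
  }
}
```

Entries written during a one-off experiment stay on disk forever, which is exactly the auto-cleanup gap described above.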

Why SQLite?

I was looking for something more efficient than the file system. And this is exactly what SQLite is for:

Think of SQLite not as a replacement for Oracle but as a replacement for fopen()

-- About SQLite

But why not something else? I didn't find anything better that fits the following criteria:

  • embedded. This disqualifies Redis, Memcached, and similar
  • synchronous. This disqualifies RocksDB, LevelDB (at least their Node bindings are asynchronous), and similar
  • persistent. This disqualifies lru-native2, flru, and similar

So here we are. (I haven't done any benchmarks, though.)

About the code

This is a slightly modified version of bun-sqlite-cache, which is itself a modified version of cache-sqlite-lru-ttl. So most of the code was written by the authors of the original packages. Thank you.

Other ideas

Here are some ideas to experiment with (but they need proper benchmarking first):

  • is it better to use raw keys, or to hash them into a shorter (and maybe integer) form?
    • @node-rs/xxhash, cyrb53
  • serialization is done with v8.serialize by default. How does it compare to others?
    • seqproto, sia, CBOR, msgpack
    • no need to change the default, because it can be configured with the serialize/deserialize options
  • there is an option to enable compression. By default it uses node:zlib, because that requires no additional dependencies. On the other hand, there are more interesting ways to do it:
    • instead of compressing each value separately, we could compress the whole database
    • there are more modern compression algorithms, for example:
      • LZ4: https://github.com/antoniomuso/lz4-napi or https://github.com/PSeitz/lz4-wasm
      • Zstandard: https://github.com/OneIdentity/zstd-js or https://github.com/bokuweb/zstd-wasm
    • no need to change the default, because it can be configured with the compress/decompress options
  • Maybe support Bun and Deno, like in great.db
  • Maybe support caching promises:
    • until the promise resolves, the cache would return the same promise from flru
    • as soon as the promise resolves, the value would go to the main cache
    • if the process terminates before the promise resolves, the value won't be stored in the main (persistent) cache
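As a starting point for the key-hashing experiment above, cyrb53 (a small public-domain 53-bit string hash that circulates in the JS community) is easy to inline; whether it actually beats raw keys would need that benchmark:

```javascript
// cyrb53: fast 53-bit string hash (public-domain snippet by bryc).
// The output fits in a JS safe integer, so it could serve as an
// INTEGER cache key instead of the raw string key.
function cyrb53(str, seed = 0) {
  let h1 = 0xdeadbeef ^ seed;
  let h2 = 0x41c6ce57 ^ seed;
  for (let i = 0; i < str.length; i++) {
    const ch = str.charCodeAt(i);
    h1 = Math.imul(h1 ^ ch, 2654435761);
    h2 = Math.imul(h2 ^ ch, 1597334677);
  }
  h1 = Math.imul(h1 ^ (h1 >>> 16), 2246822507);
  h1 ^= Math.imul(h2 ^ (h2 >>> 13), 3266489909);
  h2 = Math.imul(h2 ^ (h2 >>> 16), 2246822507);
  h2 ^= Math.imul(h1 ^ (h1 >>> 13), 3266489909);
  return 4294967296 * (2097151 & h2) + (h1 >>> 0);
}
```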

TODO

  • the cache should reset when any of these change: compress, decompress, serialize, deserialize
    • or store them as part of the key, so one can use several versions at the same time
  • test all the new options: compress, decompress, serialize, deserialize, readonly
  • maybe store created_at for items
  • maybe drop withMeta?
  • write the "usage" section of the documentation
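The "store them as part of the key" item from the TODO list could be sketched like this (a hypothetical helper, not part of the package): fingerprint the configured functions and prefix every key with it, so entries written under different serialize/compress configurations never collide:

```javascript
import { createHash } from "node:crypto";

// Hypothetical: derive a short fingerprint from the cache configuration
// (the functions' source text serves as the version identifier) and
// namespace every cache key with it.
function namespacedKey(key, config) {
  const fingerprint = createHash("sha256")
    .update(
      Object.entries(config)
        .map(([name, fn]) => `${name}:${String(fn)}`)
        .join("|"),
    )
    .digest("hex")
    .slice(0, 8);
  return `${fingerprint}:${key}`;
}
```

Changing any of the configured functions changes the prefix, so old entries are simply never hit again (and can be evicted by LRU) instead of being deserialized with the wrong codec.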