
data-invariants

v0.2.7

A library to extract invariants over partially variable data

Why?

Oftentimes, much of an application's functionality reduces to stitching together multiple API responses to achieve an objective.

This is, however, both a blessing and a curse. A blessing, since the application is smaller, simpler and easier to reason about, and therefore more likely to be correct. A curse, because the application is at the mercy of remote API responses when it comes to guaranteeing correctness.

The complexity of ensuring program correctness is reduced to the need for a data contract with the external APIs. At a system design level, one can mandate version control, schema specification and the like. However, these are outside our application's control, and we are at the mercy of the external API author(s).

Being the OCD engineers we are, we WILL NOT relinquish control! But... we are also lazy. Just because we want control doesn't mean we want to work too hard for it!

If this sounds like you, Welcome. Glad to have found a kindred spirit!

The rest of this document describes a simple protocol that gets us most of the desired control over time- and instance-variant data, by imposing a "good-enough-for-most-practical-use" data contract over it.

Design

The Problem

  • We have a large blob of data, controlled by an external entity.
  • Some fields in the data vary based on externalities - time, credentials, signatures, network latencies, etc. - completely outside our control.
  • We want to use that data, but want predictability over its shape and structure as time passes and the software lifecycle progresses.
  • Any change to that shape is typically complicated and requires manual intervention to fix code.
  • We want to minimize the work involved in change detection, focusing our efforts on corrective action.

The Solution

The solution turns out to be rather simple - find invariants of the data that we can make assertions about.

The first insight is to split the data up into two parts - {variant, invariant}, or more precisely, extract the invariant part of the data.

A second insight is that the data-shape is constant, irrespective of the actual variant data. This means that asserting on the data-shape yields most of the benefit with none of the pain.

This library provides just that functionality:

  const { invariant, shape } = dataInvariants(data, variantFilters);

Most modern testing tools provide some form of snapshot capability. We achieve almost all of the required data-contract with a simple snapshot.

  t.snapshot({ invariant, shape });
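To make the split concrete, here is a minimal, self-contained sketch of the idea. This is not the library's implementation - the real variant filters use micromatch globs, and `splitInvariant` with its dotted paths is a hypothetical stand-in:

```javascript
// Hypothetical sketch: remove variant fields (named by dotted paths)
// from a deep copy, leaving the invariant part of the data.
function splitInvariant(data, variantPaths) {
  const invariant = JSON.parse(JSON.stringify(data)); // deep copy
  for (const path of variantPaths) {
    const keys = path.split('.');
    let node = invariant;
    // Walk to the parent of the variant field...
    for (let i = 0; i < keys.length - 1 && node; i++) node = node[keys[i]];
    // ...and drop the field itself.
    if (node) delete node[keys[keys.length - 1]];
  }
  return invariant;
}

const response = { id: 'abc123', issuedAt: '2024-01-01T00:00:00Z', user: { name: 'Ada' } };
console.log(splitInvariant(response, ['id', 'issuedAt']));
// { user: { name: 'Ada' } }
```

The invariant part is then stable across API calls and safe to snapshot.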

Usage

npm install data-invariants

 import { dataInvariants } from 'data-invariants';

 // Given some data, and variantFilters, compute the invariants
 const { invariants, shape } = dataInvariants(data, variantFilters);

 // Inside a test case, snapshot the invariants for posterity.
 // We now have active monitoring of remote API responses!
 t.snapshot({ invariants, shape });

Example

I don't always provide examples, but when I do, they have to be working examples!

working example

Implementation details

Under the covers, data-invariants uses micromatch for its filtering capabilities.

It also implements data-shape, a utility that recursively walks the data and reduces the values to one of ['string', 1, true, null]. data-shape only supports some of the basic JSON types (there is no separate integer type) and throws on anything but [string, number, boolean, null]. The data itself can be an arbitrarily complex structure of objects/arrays.
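A plausible sketch of that reduction - the library's actual internals may differ:

```javascript
// Sketch: collapse every leaf to a type marker ('string', 1, true, or null),
// recursing through objects and arrays, throwing on unsupported types.
function dataShape(value) {
  if (value === null) return null;
  if (Array.isArray(value)) return value.map(dataShape);
  if (typeof value === 'object') {
    const shape = {};
    for (const [k, v] of Object.entries(value)) shape[k] = dataShape(v);
    return shape;
  }
  if (typeof value === 'string') return 'string';
  if (typeof value === 'number') return 1;
  if (typeof value === 'boolean') return true;
  throw new TypeError(`Unsupported type: ${typeof value}`);
}

console.log(dataShape({ id: 'abc', count: 42, active: false, tags: ['x'] }));
// { id: 'string', count: 1, active: true, tags: [ 'string' ] }
```

Two responses with different values but the same structure reduce to the same shape, which is exactly what makes the shape snapshot stable.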

micromatch was designed to match file paths, which means it uses '/' as the separator - both in the input string(s) and the glob-patterns. Since we have established our laziness at this point, we obviously require that you specify globPatterns in a form that micromatch likes. Please see micromatch for details.
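One way to picture why the '/' separator matters: if the data's nested keys are flattened into '/'-joined paths, micromatch-style globs such as 'user/*/token' can address them directly. `flattenPaths` below is a hypothetical helper for illustration, not part of the library's API:

```javascript
// Sketch: turn nested keys into '/'-joined paths, the form
// a micromatch-style glob can match against.
function flattenPaths(value, prefix = '') {
  if (value === null || typeof value !== 'object') return [prefix];
  return Object.entries(value).flatMap(([k, v]) =>
    flattenPaths(v, prefix ? `${prefix}/${k}` : k));
}

console.log(flattenPaths({ user: { session: { token: 't' }, name: 'Ada' } }));
// [ 'user/session/token', 'user/name' ]
```

A glob like 'user/session/*' would then select the token path for filtering.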

Development Tooling

License

Apache-2.0

Code of Conduct

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms.

Support

Bugs, PRs, comments, suggestions welcomed!