
@ericwong3/array-compress

v1.0.4

Published

Compress and compact array of objects, whether dense or sparse

Downloads

5

Readme

ArrayCompress

Compress arrays of objects by deduplicating object keys. Useful when storing tabular data in limited space (e.g. local storage).

Example

Given an array of data, e.g.

[
    {name: 'Alice', address: 'Taiwan'},
    {name: 'Bob', address: 'United States'},
]

ArrayCompress can convert it into an array of arrays and strip away the duplicate keys, i.e.

Keys: ['name', 'address']
Values: [['Alice', 'Taiwan'], ['Bob', 'United States']]
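
For illustration, the core idea can be sketched in a few lines of plain JavaScript. This is only a sketch of the concept, not the library's implementation, and it assumes every row shares the same keys:

var rows = [
    {name: 'Alice', address: 'Taiwan'},
    {name: 'Bob', address: 'United States'},
];

// Collect the keys once, then keep only the values per row.
var keys = Object.keys(rows[0]);                          // ['name', 'address']
var values = rows.map(function (row) {
    return keys.map(function (key) { return row[key]; }); // [['Alice', 'Taiwan'], ['Bob', 'United States']]
});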

Features

  • No dependencies
  • Works well with sparse data (e.g. each row has different properties)
  • Works with data containing null and undefined
  • Covered by tests

Usage

var ArrayCompress = require('@ericwong3/array-compress');
var compressor = new ArrayCompress();

var myData = [
    {name: 'Alice', address: 'Taiwan', age: 21, favouriteColor: 'red'},
    {name: 'Bob', address: 'United States', age: 13, favouriteColor: 'yellow'},
    {name: 'Charlie', address: 'Hong Kong', age: 55, favouriteColor: 'green'},
    {name: 'Dave', address: 'Japan', age: 87, favouriteColor: 'blue'},
];

var compressed = compressor.compress(myData);
/* compressed = {
    ks: ['name', 'address', 'age', 'favouriteColor'],
    kv: [
        {c: ['Alice', 'Taiwan', 21, 'red']},
        {c: ['Bob', 'United States', 13, 'yellow']},
        {c: ['Charlie', 'Hong Kong', 55, 'green']},
        {c: ['Dave', 'Japan', 87, 'blue']}
    ]
} */

console.log(JSON.stringify(myData).length);     // Raw Length: 287
console.log(JSON.stringify(compressed).length); // Compressed length: 205, reduced by 28%

var uncompressedData = compressor.decompress(compressed);
// uncompressedData will be equal to myData
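
Since the README suggests local storage as the target, a typical round trip might look like the following sketch. It only assumes the compress/decompress API shown above; the 'my-table' storage key is a placeholder:

// Store the compressed form in local storage...
localStorage.setItem('my-table', JSON.stringify(compressor.compress(myData)));

// ...and later restore the original array of objects.
var restored = compressor.decompress(JSON.parse(localStorage.getItem('my-table')));
// restored will be equal to myData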

Compression Ratio

The compression ratio largely depends on the number of columns and rows. Since the approach is to deduplicate column keys, the compression ratio increases as the number of columns or rows increases. For any moderately sized table, expect at least a 25% reduction in size.

In the example above, the output is cut down to ~72% of the original size.

One sample candidate application is the LINE Rangers Handbook: its rangers (game units) API returns the data as an array of objects, where each array item is one unit. There are 1300+ units (rows) and 64 attributes (keys) per object, which results in a 2MB+ JSON payload that is quite big for local storage. Using this compression method, the JSON shrinks from a whopping 1.98MB to 453KB, a 77% decrease, which is much more reasonable for local storage.

Conversely, it is NOT recommended for small datasets (i.e. <5 columns or <5 rows), as the compressed result might be even bigger than the uncompressed one.
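
As a back-of-envelope illustration of why the savings grow with rows and columns (my own approximation, not part of this package): raw JSON repeats every quoted key plus a colon in every row, while the compressed form stores the key names only once.

// Rough estimate of characters saved by deduplicating keys on a dense dataset.
function estimateSavedChars(keys, rowCount) {
    // Each row of raw JSON repeats '"key":' for every field (key.length + 3 chars).
    var perRowKeyOverhead = keys.reduce(function (sum, key) {
        return sum + key.length + 3;
    }, 0);
    // The compressed form keeps roughly one copy of the keys instead of rowCount copies.
    return (rowCount - 1) * perRowKeyOverhead;
}

estimateSavedChars(['name', 'address', 'age', 'favouriteColor'], 1000); // ≈ 40,000 chars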

Advanced Usage

Working with sparse data

This program automatically scans the data, determines whether each row is sparse, and applies field-mapping logic to leave non-defined properties out of the compressed output. (A row is sparse if it does not contain all properties seen across all rows.)

As an example, if we compress:

var myData = [
    {a: 1,         b: 2,  c: 5   },
    {a: 1,                       }, // sparse
    {a: undefined, b: 7,         }, // sparse
    {              b: 4,         }, // sparse
    {                     c: 10  }, // sparse
    {a: 50,        b: 51, c: null},
]

The produced output will be:

{
    ks: ['a','b','c'],
    vs: [
        { c: [1, 2, 5]             },
        { c: [1],            m: '1'},
        { c: [undefined, 7], m: '3'},
        { c: [4],            m: '2'},
        { c: [10],           m: '4'},
        { c: [50, 51, null]        },
    ]
}

For the first and last rows, since they contain all properties, the m field is omitted from the output. For the sparse rows, the m field denotes how the values in c map to the keys defined in ks. For the inner workings of the m field, please consult the source code.
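
While the source code is the authority on the m field, the example above is consistent with m being a bitmask over the indices of ks, where bit i set means the row has a value for ks[i]. A hedged decoding sketch under that assumption (not the library's actual code, and it assumes the mask is a decimal string):

// Assumed decoding of one row, where bit i of m means "this row has ks[i]".
function decodeRow(ks, row) {
    if (row.m === undefined) {
        // Dense row: values line up one-to-one with ks.
        return ks.reduce(function (obj, key, i) { obj[key] = row.c[i]; return obj; }, {});
    }
    var mask = parseInt(row.m, 10); // assumption: decimal bitmask string
    var valueIndex = 0;
    return ks.reduce(function (obj, key, i) {
        if (mask & (1 << i)) obj[key] = row.c[valueIndex++]; // present fields consume values in order
        return obj;
    }, {});
}

// decodeRow(['a', 'b', 'c'], {c: [4], m: '2'}) -> {b: 4}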