
when-json-met-bigint

This is a fork of json-bigint, rewritten in TypeScript, that no longer uses bignumber.js and is actively maintained. It also offers an ad-hoc parser API to solve bigint/number determinism.

Compared to json-bigint, when-json-met-bigint tries its best to be "default JSON API compliant"; all custom behaviours are opt-in through options (e.g. options.protoAction defaults to 'preserve' instead of 'error'). Performance-wise, this package might in theory be a little faster thanks to caching, depending on the use case.

Compared to the default JSON on non-bigint objects, small fixes were made to the type definitions to align with the spec; for example, JSONB.stringify returns undefined if called with a symbol or Function as the value.

Implemented following the ES2022 JSON.parse/stringify specification.

All exports are named to avoid CommonJS/ESM default export complication.

Compatibility: ES6 and above.

==========

JSON.parse/stringify with ES2020 BigInt support. Based on Douglas Crockford's JSON.js and json-bigint.

While most JSON parsers assume numeric values have the same precision restrictions as IEEE 754 doubles, the JSON specification does not say anything about number precision. Any floating-point number in decimal (optionally scientific) notation is a valid JSON value.

It's a good idea to serialize values which might fall outside IEEE 754 integer precision as strings in your JSON API, but { "value" : 9223372036854775807 }, for example, is still a valid RFC 4627 JSON string, and in most JS runtimes the result of JSON.parse is this object: { value: 9223372036854776000 }.

==========

example:

var JSONB = require("when-json-met-bigint").JSONB;
// or var { JSONB, parse, stringify } = require('when-json-met-bigint');

var json = '{ "value" : 9223372036854775807, "v2": 123 }';
console.log("Input:", json);
console.log("");

console.log("node.js built-in JSON:");
var r = JSON.parse(json);
console.log("JSON.parse(input).value : ", r.value.toString());
console.log("JSON.stringify(JSON.parse(input)):", JSON.stringify(r));

console.log("\n\nbig number JSON:");
var r1 = JSONB.parse(json);
console.log("JSONB.parse(input).value : ", r1.value.toString());
console.log("JSONB.stringify(JSONB.parse(input)):", JSONB.stringify(r1));

Output:

Input: { "value" : 9223372036854775807, "v2": 123 }

node.js built-in JSON:
JSON.parse(input).value :  9223372036854776000
JSON.stringify(JSON.parse(input)): {"value":9223372036854776000,"v2":123}


big number JSON:
JSONB.parse(input).value :  9223372036854775807
JSONB.stringify(JSONB.parse(input)): {"value":9223372036854775807,"v2":123}

JSONB.parse(text[, reviver[, schema]])

JSONB.parse supports a 3rd argument, which is a schema-like object. This is an ad-hoc solution for the limitation o !== JSONB.parse(JSONB.stringify(o)).

This limitation exists because JS treats bigint and number as 2 separate types which cannot be coerced. The parser chooses an appropriate type based on the size of the number in the JSON string. This introduces 2 problems:

  • As stated above, JSONB.parse(JSONB.stringify(123n)) returns 123 because the number is small enough
  • The type of a field is not consistent; for example, an API can return a response in which a field is sometimes bigint and at other times number

There's the option to parse all numbers as bigint, but IMHO this isn't very desirable. Libraries have solved (2) by iterating over the parsed result and enforcing the type, as you can see here. That PR has an interesting approach from which this solution draws inspiration.
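
For illustration, here is a minimal sketch of the roundtrip pitfall itself, with the default options (variable names are ours):

var { JSONB } = require("when-json-met-bigint");

// 123n is stringified as plain 123, which is small enough
// to come back as a number on the next parse
var roundtripped = JSONB.parse(JSONB.stringify({ value: 123n }));
console.log(typeof roundtripped.value); // "number", not "bigint"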

In order to overcome the limitation, we need a parser API so that users can specify, per case and per field, whether it should be bigint or number. This API is exposed through a schema-like object. Its type is defined as follows:

type Schema =
    | `number`
    | `bigint`
    | ((n: number | bigint) => `number` | `bigint`)
    | (Schema | null)[]
    | { [key: string | number | symbol]: Schema };

To put it simply, the schema-like argument is an object whose fields and sub-fields mirror the fields and sub-fields of the expected parsed object, following the same structure, for those fields users want to force to bigint or number. symbol-typed keys are reserved for special meanings, to avoid clashing with JSON keys.

Those fields can take 3 different values: the string 'number' or 'bigint', meaning the field will be parsed using Number or BigInt, respectively; or, as the 3rd possible value, a callback function (n: number | bigint) => 'number' | 'bigint', with n being the number or bigint the underlying JSON string would be parsed to by default. Users can, for example, use this callback to throw an Error if the type is not what they expect.

To omit the key, i.e. to define a schema for any key in an object, use the symbol value Symbol.for('any') as the key, like this: { [Symbol.for('any')]: 'bigint' }. You MUST NOT use Symbol('any'), since Symbol('any') !== Symbol.for('any'), which is what is used to index the schema. The Symbol.for('any') schema, if present, is only used when the specified key's schema does not exist (for any key such that schema[key] === undefined).
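
A sketch of the Symbol.for('any') fallback, assuming the semantics documented above (values are illustrative):

var { JSONB } = require("when-json-met-bigint");

// `id` has an explicit schema; every other key falls back to Symbol.for('any')
var schema = { id: `number`, [Symbol.for(`any`)]: `bigint` };
JSONB.parse(`{"id": 1, "total": 2, "count": 3}`, null, schema);
// => { id: 1, total: 2n, count: 3n }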

For an Array in the schema-like object, a single-item array is treated as T[], that is, the item is the schema for all items in the parsed array. An array with multiple items is used as a tuple type, that is, each item is the schema for the item at the corresponding index in the parsed array. If parsed_array.length > schema_array.length, the parsed array's items which have no corresponding index in the schema array are parsed as having no schema.

If a value different from those defined above is passed in or returned from the callback, it is treated as if there were no schema.

To aid with TypeScript support, the package also exports a Schema<T> type definition which, given an optional generic parameter, infers the appropriate structure type for the schema. If no type is given, it falls back to the Schema type defined above. The generic type parameter can also be passed to the parse<T> function for a similar effect, that is, to constrain the type of the schema argument.

example:

JSONB.parse(`{"a": {"b": 123} }`, null, { a: { b: `bigint` } }); // returns {a: {b: 123n} }
JSONB.parse(`{"a": {"b": 123} }`, null, {
    a: {
        b: (n) => {
            if (typeof n === `number`)
                throw new Error(`Expect bigint but found ${n}`);
            return `bigint`;
        },
    },
});
JSONB.parse(`{"a": [1, 2, 3] }`, null, { a: [`bigint`] }); // returns {a: [1n, 2n, 3n] }
JSONB.parse(`{"a": [1, 2, 3] }`, null, { a: [`bigint`, `bigint`] }); // returns {a: [1n, 2n, 3] }
JSONB.parse(`{"a": [1, 2, 3] }`, null, { a: [`bigint`, null] }); // returns {a: [1n, 2, 3] }
JSONB.parse(`{"a": [1, 2, 3] }`, null, { a: [null, null, `bigint`] }); // returns {a: [1, 2, 3n] }
JSONB.parse<{ a: number[] }>(`{"a": [1, 2, 3] }`, null, {
    a: [null, `something else`, `bigint`],
}); // compilation error

JSONB.stringify(value[, replacer[, space]])

Full support out-of-the-box; stringifies bigint as pure numbers (no quotes, no n).
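
For example (default options assumed):

var { JSONB } = require("when-json-met-bigint");

// bigint values are emitted as bare numerals
console.log(JSONB.stringify({ value: 9223372036854775807n }));
// => {"value":9223372036854775807}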

Options

==========

  • options.errorOnBigIntDecimalOrScientific, boolean, default false

By default, a decimal number or scientific notation cannot be parsed using BigInt, i.e. BigInt("1.23") or BigInt("1e23") will throw. This option controls what to do when BigInt is forced, through a schema or options.alwaysParseAsBigInt, upon such a value.

If not specified, options.alwaysParseAsBigInt or the schema is ignored for such a value and it is parsed as a number. Otherwise it will throw.

example:

var JSONB = require("when-json-met-bigint").JSONB({
    alwaysParseAsBigInt: true,
    errorOnBigIntDecimalOrScientific: true,
});
JSONB.parse('{ "decimal": 1.23, "scientific": 1e23 }');

Output

throw 'Decimal and scientific notation cannot be bigint'
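
With the option left at its default of false, forcing BigInt is instead silently skipped for such values; a minimal sketch based on the description above:

var JSONBalways = require("when-json-met-bigint").JSONB({
    alwaysParseAsBigInt: true,
});
var r = JSONBalways.parse('{ "decimal": 1.23, "scientific": 1e23 }');
console.log(typeof r.decimal, typeof r.scientific); // "number number": no throw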

==========

  • options.errorOnDuplicatedKeys, boolean, default false

Specifies whether parsing should throw on duplicated keys in the parsed string. The default follows what is allowed in standard JSON and resembles the behavior of JSON.parse: any previous value is overwritten by the last one assigned to the duplicated key.

Setting options.errorOnDuplicatedKeys = true will fail fast on such duplicated-key occurrences and thus warn you upfront of possibly lost information.

example:

var JSONB = require("when-json-met-bigint").JSONB;
var JSONBeodk = require("when-json-met-bigint").JSONB({ errorOnDuplicatedKeys: true });

var dupkeys = '{ "dupkey": "value 1", "dupkey": "value 2"}';
console.log("\n\nDuplicate Key test with both lenient and strict JSON parsing");
console.log("Input:", dupkeys);
var works = JSONB.parse(dupkeys);
console.log("JSON.parse(dupkeys).dupkey: %s", works.dupkey);
var fails = "will stay like this";
try {
    fails = JSONBeodk.parse(dupkeys);
    console.log("ERROR!! Should never get here");
} catch (e) {
    console.log(
        "Succesfully catched expected exception on duplicate keys: %j",
        e,
    );
}

Output

Duplicate Key test with both lenient and strict JSON parsing
Input: { "dupkey": "value 1", "dupkey": "value 2"}
JSONB.parse(dupkeys).dupkey: value 2
Successfully caught expected exception on duplicate keys: {"name":"SyntaxError","message":"Duplicate key \"dupkey\"","at":33,"text":"{ \"dupkey\": \"value 1\", \"dupkey\": \"value 2\"}"}

==========

  • options.strict, boolean, default false

Shorthand for options.errorOnBigIntDecimalOrScientific = true & options.errorOnDuplicatedKeys = true.
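
In other words, the following two factories are equivalent (an illustrative sketch):

// These two configured instances behave identically:
var strictA = require("when-json-met-bigint").JSONB({ strict: true });
var strictB = require("when-json-met-bigint").JSONB({
    errorOnBigIntDecimalOrScientific: true,
    errorOnDuplicatedKeys: true,
});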

==========

  • options.alwaysParseAsBigInt, boolean, default false

Specifies whether all numbers should be stored as bigint. Note that this option can be overridden by a schema, if present.

Note that this is dangerous behavior, as it breaks the default functionality of converting back and forth without data type changes (it will convert all numbers to be-and-stay bigint).

example:

var JSONB = require("when-json-met-bigint").JSONB;
var JSONBalways = require("when-json-met-bigint").JSONB({
    alwaysParseAsBigInt: true,
});
var key = '{ "key": 123 }'; // there is no need for BigInt by default, but we're forcing it
console.log(`\n\nStoring the number as a bigint, instead of a number`);
console.log("Input:", key);
var normal = JSONB.parse(key);
var always = JSONBalways.parse(key);
console.log(
    "Default type: %s, With option type: %s",
    typeof normal.key,
    typeof always.key,
);

Output

Storing the number as a bigint, instead of a number
Input: { "key": 123 }
Default type: number, With option type: bigint
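
The schema override mentioned above, as a sketch (reusing JSONBalways from this example and assuming the schema takes precedence over the option):

var overridden = JSONBalways.parse('{ "key": 123 }', null, { key: `number` });
console.log(typeof overridden.key); // "number": the schema wins over alwaysParseAsBigInt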

==========

  • options.parseBigIntAsString, boolean, default false

Specifies whether bigint should be stored in the object as a String, rather than the default BigInt. This option applies if the type AFTER ALL OPTIONS & SCHEMA ARE APPLIED is bigint. That is, if used with options.alwaysParseAsBigInt === true, ALL numbers will be parsed as strings.

Note that this is dangerous behavior, as it breaks the default functionality of converting back and forth without data type changes (it will convert all bigint to be-and-stay string).

example:

var JSONB = require("when-json-met-bigint").JSONB;
var JSONBstring = require("when-json-met-bigint").JSONB({
    parseBigIntAsString: true,
});
var key = '{ "key": 1234567890123456789 }';
console.log("\n\nStoring the bigint as a string, instead of a bigint");
console.log("Input:", key);
var withInt = JSONB.parse(key);
var withString = JSONBstring.parse(key);
console.log(
    "Default type: %s, With option type: %s",
    typeof withInt.key,
    typeof withString.key,
);

Output

Storing the bigint as a string, instead of a bigint
Input: { "key": 1234567890123456789 }
Default type: bigint, With option type: string
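
As noted above, combining parseBigIntAsString with alwaysParseAsBigInt turns every number into a string; a sketch:

var JSONBallStrings = require("when-json-met-bigint").JSONB({
    alwaysParseAsBigInt: true,
    parseBigIntAsString: true,
});
console.log(JSONBallStrings.parse('{ "key": 123 }').key); // "123" (a string)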

==========

  • options.protoAction, default: "preserve". Possible values: "error", "ignore", "preserve"
  • options.constructorAction, default: "preserve". Possible values: "error", "ignore", "preserve"

Control how the __proto__ and constructor properties are treated. If set to "error", they are not allowed and the parse() call will throw an error. If set to "ignore", the property and its value are skipped during parsing and object building.

If set to "preserve" the __proto__ property is set. However, this DOES NOT set the prototype because the base object would be created using Object.create(null) and Object.setPrototypeOf is called upon later, for this particular case only to prevent prototype poisoning. One still should be extra careful and make sure any other library consuming generated data is not vulnerable to prototype poisoning attacks.

example:

var JSONBno__proto__ = require("when-json-met-bigint").JSONB({
    protoAction: "ignore",
});
const user = JSONBno__proto__.parse(
    '{ "__proto__": { "admin": true }, "id": 12345 }',
);
// => result is { id: 12345 }
var JSONB__proto__ = require("when-json-met-bigint").JSONB;
const user2 = JSONB__proto__.parse(
    '{ "__proto__": { "admin": true }, "id": 12345 }',
);
// => result is { id: 12345, __proto__: { admin: true } } but user2.admin === undefined

TODO:

  • Align error message with default JSON
  • Add turbo mode

Limitations

  • Roundtrip operations

s === JSONB.stringify(JSONB.parse(s)), but o !== JSONB.parse(JSONB.stringify(o)) when o contains a value such as 123n; this is solved with the schema argument.

JSONB stringifies 123n as 123, which by default becomes a number (i.e. 123, not 123n) when reparsed. If the schema is not provided, there is currently no other consistent way to deal with this issue.
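
A sketch of the schema fix (a minimal example, with the import shown for completeness):

var { JSONB } = require("when-json-met-bigint");

var o = { value: 123n };
var restored = JSONB.parse(JSONB.stringify(o), null, { value: `bigint` });
console.log(typeof restored.value); // "bigint" again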

Benchmark

Using benny, when-json-met-bigint vs JSON + Regex

  • parse big obj no BigInt
  • parse big obj with BigInt
  • parse small obj with BigInt
  • stringify big obj no BigInt
  • stringify big obj with BigInt
  • stringify small obj with BigInt