
stream-json-parse

v1.0.4

Published

A streaming JSON parser for the browser

Downloads

7

Readme

stream-json-parse

When a web interface returns a large JSON response, or the network is slow, stream-json-parse parses the JSON as it is downloaded. This lets the front end render part of the data in advance and improves the user experience. It is implemented mainly on top of fetch streaming transfers and generator functions.

===================

Example: the data returned by the backend is as follows:

{
  "code": 0,
  "data": [
    {
      "id": "64eb1b565c5214078f235c98",
      "email": "[email protected]",
      "username": "koch90"
    },
    {
      "id": "64eb1b569f47412c27a74f11",
      "email": "[email protected]",
      "username": "norma89"
    },
    ...
  ]
}

If the data array is large, the front end has to wait a long time before it can show anything. With stream-json-parse you can parse the response as it downloads and let the front end render part of the data in advance. The callback function is then invoked multiple times as data arrives; how much data each call carries depends on how much arrives in each network packet. (A rendering sketch follows the two callback examples below.)

// Received in the first callback:
{
  "code": 0,
  "data": [
    {
      "id": "64eb1b565c5214078f235c98",
      "email": "[email protected]",
      "username": "koch90"
    }
  ]
}

// Received in the second callback:
{
  "code": 0,
  "data": [
    {
      "id": "64eb1b565c5214078f235c98",
      "email": "[email protected]",
      "username": "koch90"
    },
    {
      "id": "64eb1b569f47412c27a74f11",
      "email": "[email protected]",
      "username": "norma89"
    }
  ]
}
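
For example, the front end can redraw its list each time the callback reports more data. Below is a minimal sketch of that idea; renderUsers is a hypothetical helper that redraws the UI from an array of users, and the actual wiring of the parser is shown in the Usage section further down.

const jsonCallback = (error, isDone, value) => {
  if (error) {
    console.error('parse failed', error)
    return
  }
  // value is the JSON parsed so far; data may still be only partially filled
  const users = (value && value.data) || []
  renderUsers(users)
  if (isDone) {
    console.log('all data loaded:', users.length, 'users')
  }
}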

===================

Usage:

import { createJsonParseWritableStream, arrayItemSymbol } from 'stream-json-parse'

const response = await fetch(
  './bigJson1.json',
  {
    method: 'GET'
  }
)
response.body
  // Use TextDecoderStream to decode the response bytes into UTF-8 text;
  // otherwise the stream yields raw binary chunks
  .pipeThrough(new TextDecoderStream())
  .pipeTo(createJsonParseWritableStream({
    // Report data under this path only once it is completely parsed (optional);
    // arrayItemSymbol stands for any index of an array
    completeItemPath: ['data', arrayItemSymbol],
    // Callback invoked as the JSON is parsed
    jsonCallback: (error, isDone, value) => {
      console.log('jsonCallback', error, isDone, value)
    },
    // Optional debug callback: compares the result with native JSON.parse
    diffCallBack: (json, isEq) => {
      console.log('diffCallBack', json, isEq)
    }
  }))

Parameter description:

  • createJsonParseWritableStream: creates a writable stream; it takes one parameter:
  • JSONParseOption: the parsing configuration (see the sketch after this list):
    • protoAction?: 'error' | 'ignore' | 'preserve'; // What to do when an object contains the forbidden __proto__ property.
    • constructorAction?: 'error' | 'ignore' | 'preserve'; // What to do when an object contains the forbidden constructor property.
    • strict?: boolean; // Whether to parse objects in strict mode, disallowing duplicate keys.
    • completeItemPath?: (string | symbol)[]; // Report the data under this path only once it has been completely parsed. Pass an array of path keys, similar to lodash.get. For an array item, use arrayItemSymbol as the key; it is a Symbol that matches every index of the array.
    • updatePeriod?: number; // The interval at which parsed data is reported (by executing jsonCallback) after a certain amount of data has been parsed. Defaults to 300 ms.
    • jsonCallback: (error: null | Error, done?: boolean, value?: any) => void; // Executed each time a certain amount of data has been parsed.
      • Callback parameters:
        • error: the error, if any
        • done: whether parsing has finished; when true, value is the final parsed data
        • value: the data parsed so far
    • diffCallBack?: (json: any, isEq: boolean) => void; // After parsing finishes, parse the full response again with native JSON.parse and compare the results. If omitted, no comparison is made. Intended for debugging.
      • Callback parameters:
        • json: the complete data parsed by the native parser
        • isEq: whether it equals the streamed result; if false, the json parameter should be used instead
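
As a rough sketch of how these options fit together (the option names come from the list above; the /api/users endpoint and the logging are just placeholders):

import { createJsonParseWritableStream, arrayItemSymbol } from 'stream-json-parse'

const response = await fetch('/api/users')
response.body
  .pipeThrough(new TextDecoderStream())
  .pipeTo(createJsonParseWritableStream({
    protoAction: 'ignore',   // ignore a forbidden __proto__ property instead of erroring
    strict: true,            // disallow duplicate keys
    updatePeriod: 100,       // report parsed data roughly every 100 ms instead of the default 300 ms
    // report each item under data[] only once it has been completely parsed
    completeItemPath: ['data', arrayItemSymbol],
    jsonCallback: (error, isDone, value) => {
      if (error) return console.error(error)
      console.log(isDone ? 'final result' : 'partial result', value)
    },
    // debug only: compare the streamed result with native JSON.parse
    diffCallBack: (json, isEq) => {
      if (!isEq) console.warn('streamed result differs from JSON.parse; prefer json', json)
    }
  }))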

Demo:

Here the network speed is throttled to 3 Mb/s, so it takes about 1 s to download this JSON completely.

Ordinary request:

[demo: normal request]

Stream request:

[demo: stream request]

As you can see, with the normal request the page waits until the JSON is fully loaded before it starts rendering the data, leaving a long blank screen. With the stream request the page parses the response as it downloads, the data appears gradually, and there is almost no blank-screen time.

Note: to see the effect you may need to throttle the network speed to Slow 3G in Chrome DevTools or another network tool.

The demo can be found in the dev folder.

Parsing logic borrowed from json-bigint