
@json2csv/node


Fast and highly configurable JSON to CSV converter. It fully supports conversion following the RFC 4180 specification, as well as other similar text-delimited formats such as TSV.

@json2csv/node exposes two modules to integrate json2csv with the Node.js Stream API for stream processing of JSON data.

This package includes two modules:

  • Node Transform: Node.js Transform Stream that ingests JSON and outputs CSV. Ideal for processing streams (HTTP responses, file contents, ...) in Node.js.
  • Node Async Parser: Wraps the Node Transform to offer a friendly promise-based API.

Features

  • Fast and lightweight
  • Support for standard JSON as well as NDJSON
  • Scalable to infinitely large datasets (using stream processing)
  • Advanced data selection (automatic field discovery, underscore-like selectors, custom data getters, default values for missing fields, ...)
  • Support for custom input data transformation
  • Support for custom CSV cell formatting
  • Highly customizable (supporting custom quotation marks, delimiters, eol values, etc.)
  • Automatic escaping (preserving new lines, quotes, etc.)
  • Optional headers
  • Unicode encoding support
  • Pretty printing in table format to stdout

Other json2csv packages

There are multiple flavours of json2csv:

  • Plainjs: Includes the Parser API and a new StreamParser API which does the conversion in a streaming fashion in pure JS.
  • Node: Includes the Node Transform and Node Async Parser APIs for Node users.
  • WHATWG: Includes the WHATWG Transform Stream and WHATWG Async Parser APIs for users of WHATWG streams (browser, Node or Deno).
  • CLI: Includes the CLI interface.

And a couple of libraries that enable additional configurations:

  • Transforms: Includes the built-in transforms for json2csv (unwind and flatten), allowing the user to transform data before it is parsed (see the sketch after this list).
  • Formatters: Includes the built-in formatters for json2csv (one for each data type, an excel-specific one, etc.). Formatters convert JSON data types into CSV-compatible strings.
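For instance, a built-in transform can be passed through the transforms option documented below. A minimal sketch, assuming the flatten helper from @json2csv/transforms with its documented objects/separator options; the data is hypothetical:

import { AsyncParser } from '@json2csv/node';
import { flatten } from '@json2csv/transforms';

const parser = new AsyncParser({
  // Turn nested objects into dotted column names,
  // e.g. { user: { name: 'Ana' } } becomes a "user.name" column.
  transforms: [flatten({ objects: true, separator: '.' })],
});

const csv = await parser.parse([{ user: { name: 'Ana' } }]).promise();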

Requirements

  • Node v16+

Installation

NPM

You can install json2csv as a dependency using NPM.

$ npm install --save @json2csv/node

Yarn

You can install json2csv as a dependency using Yarn.

$ yarn add @json2csv/node

Node Transform

For Node.js users, the Streaming API is wrapped in a Node.js Stream Transform. This approach ensures a consistent memory footprint and avoids blocking JavaScript's event loop.

The async API takes a second options argument that is directly passed to the underlying streams and accepts the same options as the standard Node.js streams, plus the options supported by the Stream Parser.

This Transform uses the StreamParser under the hood and supports similar events.

Usage

import { createReadStream, createWriteStream } from 'fs';
import { Transform } from '@json2csv/node';

// Hypothetical paths; point these at your own files.
const inputPath = './data.json';
const outputPath = './data.csv';

const input = createReadStream(inputPath, { encoding: 'utf8' });
const output = createWriteStream(outputPath, { encoding: 'utf8' });

const opts = {};
const transformOpts = {};
const asyncOpts = {};
const parser = new Transform(opts, asyncOpts, transformOpts);

const processor = input.pipe(parser).pipe(output);

// You can also listen for events on the conversion and see how the header or the lines are coming out.
parser
  .on('header', (header) => console.log(header))
  .on('line', (line) => console.log(line));
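If your source already emits JavaScript objects rather than a JSON byte stream, the Transform can also run in object mode. A minimal sketch, assuming objectMode is enabled through the standard stream options in transformOpts; the records are hypothetical:

import { Readable } from 'stream';
import { Transform } from '@json2csv/node';

// An object-mode source emitting plain records (hypothetical data).
const records = Readable.from([{ name: 'Ana' }, { name: 'Luis' }]);

// With objectMode enabled, the Transform ingests objects directly
// instead of parsing a JSON byte stream.
const parser = new Transform({}, {}, { objectMode: true });

records.pipe(parser).pipe(process.stdout);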

Parameters

Options
  • ndjson <Boolean> indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode.
  • fields <DataSelector[]> Defaults to top-level JSON attributes.
  • transforms <Transform[]> Array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
  • formatters <Formatters> Object where each key is a JavaScript data type and its associated value is a formatter for the given type.
  • defaultValue <Any> value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default)
  • delimiter <String> delimiter of columns. Defaults to , if not specified.
  • eol <String> overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
  • header <Boolean> determines whether or not the CSV output will contain a header (title) row. Defaults to true if not specified.
  • includeEmptyRows <Boolean> includes empty rows. Defaults to false.
  • withBOM <Boolean> prepends the BOM character to the output. Defaults to false.
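Most of these options are plain values on the first constructor argument. A minimal sketch combining a few of them; the field names are hypothetical:

import { Transform } from '@json2csv/node';

const opts = {
  fields: ['name', 'email'], // only these top-level attributes are emitted
  defaultValue: 'N/A',       // used when a record is missing a field
  delimiter: ';',            // semicolon-separated columns
  withBOM: true,             // prepend a BOM so spreadsheet apps detect the encoding
};

const parser = new Transform(opts);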
Transform Options

See the Duplex stream options for more details.

Async Options

Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode. Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

  • stringBufferSize <number> Size of the buffer used to parse strings. Defaults to 0, which means no buffering. The minimum valid value is 4.
  • numberBufferSize <number> Size of the buffer used to parse numbers. Defaults to 0 (no buffering).
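These buffering options go in the second constructor argument. A minimal sketch, assuming your JSON contains very long string values; the buffer size is an arbitrary example:

import { Transform } from '@json2csv/node';

// Parse long strings through a fixed-size buffer instead of holding each
// complete string in memory (see @streamparser/json for the exact semantics).
const asyncOpts = { stringBufferSize: 64 * 1024 };

const parser = new Transform({}, asyncOpts);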

Complete Documentation

See https://juanjodiaz.github.io/json2csv/#/parsers/node-transform.

Node Async Parser

To facilitate usage, NodeAsyncParser wraps NodeTransform, exposing a single parse method similar to the sync API. This method accepts JSON arrays/objects, TypedArrays, strings and readable streams as input and returns a stream that produces the CSV.

NodeAsyncParser also exposes a convenience promise method which turns the stream into a promise that resolves to the whole CSV.

Usage

import { AsyncParser } from '@json2csv/node';

const opts = {};
const transformOpts = {};
const asyncOpts = {};
const parser = new AsyncParser(opts, asyncOpts, transformOpts);

const csv = await parser.parse(data).promise();

// The parse method returns the transform stream.
// So data can be passed to a writable stream (a file, http request, etc.)
parser.parse(data).pipe(writableStream);

Parameters

Options
  • ndjson <Boolean> indicates that the data is in NDJSON format. Only effective when using the streaming API and not in object mode (see the sketch after this list).
  • fields <DataSelector[]> Defaults to top-level JSON attributes.
  • transforms <Transform[]> Array of transforms to apply to the data. A transform is a function that receives a data record and returns a transformed record. Transforms are executed in order.
  • formatters <Formatters> Object where each key is a JavaScript data type and its associated value is a formatter for the given type.
  • defaultValue <Any> value to use when data is missing. Defaults to <empty> if not specified. (Overridden by fields[].default)
  • delimiter <String> delimiter of columns. Defaults to , if not specified.
  • eol <String> overrides the default OS line ending (i.e. \n on Unix and \r\n on Windows).
  • header <Boolean> determines whether or not the CSV output will contain a header (title) row. Defaults to true if not specified.
  • includeEmptyRows <Boolean> includes empty rows. Defaults to false.
  • withBOM <Boolean> prepends the BOM character to the output. Defaults to false.
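As noted in the list above, ndjson applies when the input is a text or binary stream rather than objects. A minimal sketch parsing an NDJSON string; the records are hypothetical:

import { AsyncParser } from '@json2csv/node';

// With ndjson enabled, each input line is parsed as an independent JSON record.
const parser = new AsyncParser({ ndjson: true });

const csv = await parser.parse('{"name":"Ana"}\n{"name":"Luis"}').promise();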
Transform Options

See the Duplex stream options for more details.

Async Options

Options used by the underlying parsing library to process the binary or text stream. Not relevant when running in objectMode. Buffering is only relevant if you expect very large strings/numbers in your JSON. See @streamparser/json for more details about buffering.

  • stringBufferSize <number> Size of the buffer used to parse strings. Defaults to 0, which means no buffering. The minimum valid value is 4.
  • numberBufferSize <number> Size of the buffer used to parse numbers. Defaults to 0 (no buffering).

Complete Documentation

See https://juanjodiaz.github.io/json2csv/#/parsers/node-async-parser.

License

See LICENSE.md.