
multerator

v0.11.0


Overview

Multerator (short for multipart-iterator) is a multipart/form-data parser for Node.js.

Compatible with Node.js versions >= 10.21.0.

This is an initial README; more documentation will eventually be added.

Installation

With npm:

npm install multerator

With yarn:

yarn add multerator

Synopsis

const { multerator } = require('multerator');

(async () => {
  // Obtain a multipart data stream:
  const stream = getSomeMultipartStream();
  const boundary = '--------------------------120789128139917295588288';

  // Feed it to multerator:
  const streamParts = multerator({ input: stream, boundary });

  for await (const part of streamParts) {
    if (part.type === 'text') {
      console.log(
        `Got text field "${part.name}" with value "${part.data}"`
      );
    } else {
      console.log(
        `Got file field "${part.name}" of filename "${part.filename}" with content type "${part.contentType}" and incoming data chunks:`
      );
      for await (const chunk of part.data) {
        console.log(`Received ${chunk.length} bytes`);
      }
    }
  }
})();
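
If you'd rather have a file part's full contents in memory instead of consuming it chunk by chunk, the chunks can be collected into a single Buffer. A minimal sketch (the collectToBuffer helper below is just illustrative, not part of multerator's API):

const collectToBuffer = async (chunks) => {
  // Accumulate every Buffer chunk yielded by the part's async-iterable body
  const buffers = [];
  for await (const chunk of chunks) {
    buffers.push(chunk);
  }
  return Buffer.concat(buffers);
};

// E.g. inside the `for await` loop above, for a file part:
// const fileContents = await collectToBuffer(part.data);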

API

Input parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| options | object (required) | |
| options.input | Readable \| AsyncIterable<Buffer> (required) | A Readable stream or any async iterable of Buffer objects. |
| options.boundary | string (required) | The boundary token by which to separate parts across the contents of the given options.input. |
| options.maxFileSize | number | Default: none. Optional size limit (in bytes) for individual file part bodies. The moment this limit is reached, multerator will immediately cut the input data stream and yield an error of type ERR_BODY_REACHED_SIZE_LIMIT. |
| options.maxFieldSize | number | Default: none. Optional size limit (in bytes) for individual field part bodies. The moment this limit is reached, multerator will immediately cut the input data stream and yield an error of type ERR_BODY_REACHED_SIZE_LIMIT. This is a recommended general safety measure, since field part bodies are collected as complete strings in memory, which might be unsafe when dealing with an "unreasonable" data source. |
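
To illustrate the size-limit options from the table above, here is a minimal sketch of passing them alongside input and boundary and catching the resulting error. The limits shown are arbitrary, and exactly which property of the thrown error carries the ERR_BODY_REACHED_SIZE_LIMIT type is an assumption here, so inspect the error object in your own code:

const { multerator } = require('multerator');

(async () => {
  const parts = multerator({
    input: getSomeMultipartStream(), // a Readable or any async iterable of Buffers
    boundary: '--------------------------120789128139917295588288',
    maxFileSize: 10 * 1024 * 1024, // cut off file part bodies beyond 10 MB (arbitrary example limit)
    maxFieldSize: 64 * 1024, // cut off field part bodies beyond 64 KB (arbitrary example limit)
  });

  try {
    for await (const part of parts) {
      // ...consume parts as shown in the Synopsis above...
    }
  } catch (err) {
    // Assumption: the thrown error exposes the ERR_BODY_REACHED_SIZE_LIMIT
    // type on a property such as err.type or err.code - check the actual object.
    console.log('Parsing aborted:', err);
  }
})();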

Usage examples

General usage:

const fs = require('fs');
const { PassThrough } = require('stream');
const FormData = require('form-data');
const { multerator } = require('multerator');

(async () => {
  // Obtain a multipart data stream with help from form-data package:
  const form = new FormData();
  form.append('my_text_field', 'my text value');
  form.append('my_file_field', fs.createReadStream(`${__dirname}/image.jpg`));
  // Convert the form-data instance into a normalized Node.js stream,
  // which is async-iteration-friendly as required for multerator's input:
  const input = form.pipe(new PassThrough());
  const boundary = form.getBoundary();

  // Feed it to multerator:
  try {
    for await (const part of multerator({ input, boundary })) {
      if (part.type === 'text') {
        console.log(
          `Got text field "${part.name}" with value "${part.data}"`
        );
      } else {
        console.log(
          `Got file field "${part.name}" of filename "${part.filename}" with content type "${part.contentType}" and incoming data chunks:`
        );
        for await (const chunk of part.data) {
          console.log(`Received ${chunk.length} bytes`);
        }
      }
    }
  } catch (err) {
    console.log('Multipart parsing failed:', err);
  }
})();

Very banal file upload server with Express:

const { createWriteStream } = require('fs');
const { pipeline } = require('stream');
const { promisify } = require('util');
const express = require('express');
const { multerator } = require('multerator');

const pipelinePromisified = promisify(pipeline);

const expressApp = express();

expressApp.post('/upload', async (req, res) => {
  const contentType = req.headers['content-type'];

  try {
    if (!contentType || !contentType.startsWith('multipart/form-data')) {
      throw new Error(
        '😢 Only requests of type multipart/form-data are allowed'
      );
    }

    const boundary = contentType.split('boundary=')[1];

    const parts = multerator({ input: req, boundary });

    for await (const part of parts) {
      if (part.type === 'file') {
        console.log(
          `Incoming upload: field name: ${part.name}, filename: ${part.filename}, content type: ${part.contentType}`
        );
        await pipelinePromisified(
          part.data,
          createWriteStream(`${__dirname}/uploads/${part.filename}`)
        );
      }
    }

    res.status(200).send({ success: true });
  } catch (err) {
    res.status(500).send({ success: false, error: err.message });
  }
});

expressApp.listen(8080, () => console.log('Server listening on 8080'));

...callable with, for example:

curl \
  -F my_file_field=@image.jpg \
  http://127.0.0.1:8080/upload
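
The same upload can also be issued from Node.js with the form-data package used earlier (the field name and file path below are just placeholders):

const fs = require('fs');
const FormData = require('form-data');

const form = new FormData();
form.append('my_file_field', fs.createReadStream(`${__dirname}/image.jpg`));

// form-data's submit() posts the form to the given URL and sets the
// multipart/form-data content type and boundary headers for us
form.submit('http://127.0.0.1:8080/upload', (err, res) => {
  if (err) {
    console.error('Upload failed:', err);
    return;
  }
  console.log('Server responded with status', res.statusCode);
  res.resume(); // drain the response so the connection can close
});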