
remote-file-zipper

v0.9.1


A package that downloads remote images from the given URLs and bundles them into a zip archive.

Readme

Remote File Zipper

This package lets the client pass a large array of remote files and have them downloaded and zipped into a single archive.

Installation

yarn add remote-file-zipper

or

npm i remote-file-zipper

Usage

const zipper = require("remote-file-zipper");
const path = require("path");
const fs = require("fs");

const payload = {
  filename: "myZippedFile.zip",
  files: [
    {
      filename: "image.png",
      url: "https://images.theconversation.com/files/350865/original/file-20200803-24-50u91u.jpg",
    },
  ],
  queueLength: 100,
};

zipper
  .zip(payload)
  .then(({ zipFileName, zipReadableStream, statusEmitter }) => {
    statusEmitter.on("warning", (error) => {
      console.log(error);
    });

    statusEmitter.on("error", (error) => {
      const singleFileError = typeof error.file !== "undefined";

      if (singleFileError) {
      // There was an error with one file; the rest of the zip proceeds as normal
        console.log(error.message, error.file);
      } else {
        // There was an error with the zip as a whole; processing stops
        console.log(error);
      }
    });

    const output = fs.createWriteStream(path.join(__dirname, zipFileName), {
      encoding: "binary",
    });

    output.on("error", (error) => {
      // There was an error writing your file
      console.log(error);
    });

    output.on("close", () => {
      console.log("Zip successfully written.");
    });

    zipReadableStream.pipe(output);
  })
  .catch((error) => {
    // There was some other unhandled error
    console.log(error);
  });

Methods

Zip

zip({ filename: String, files: Array<{ filename: String, url: String }>, queueLength: Number }) : Promise
  • filename: The name of the final zipped file. default: test.zip
  • files: The array of file objects to download and zip. required
    • filename: The name of the file within the zip. required
    • url: The URL of the remote file. required
  • queueLength: The size of each request queue. default: 100, max: 500
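Since filename and queueLength have defaults, a minimal payload only needs files. A sketch of that shape, based on the defaults listed above (the URL here is a made-up placeholder):

```javascript
// Minimal payload: only `files` is required. Per the defaults above,
// `filename` would fall back to "test.zip" and `queueLength` to 100.
const minimalPayload = {
  files: [
    { filename: "logo.png", url: "https://example.com/logo.png" }, // hypothetical URL
  ],
};

// zipper.zip(minimalPayload) would then produce a zip named "test.zip".
console.log(minimalPayload.files[0].filename); // "logo.png"
```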

queueLength information

By default the library batches its requests to the remote files and simultaneously holds that many buffers in memory. If your remote files are large, it's recommended to lower the queueLength to avoid memory pressure and overloading the remote server. If you're running into rate limits, e.g. by making too many simultaneous requests to the same host, try lowering it as well.
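The batching idea described above can be illustrated with a small stand-alone sketch. This is not the library's internal code, just the general technique: split the files array into chunks of queueLength and process one chunk at a time, so at most queueLength responses are buffered at once.

```javascript
// Illustration of batched processing: split `files` into chunks of
// `queueLength`; each chunk would be requested and buffered together
// before moving on to the next.
function toBatches(files, queueLength) {
  const batches = [];
  for (let i = 0; i < files.length; i += queueLength) {
    batches.push(files.slice(i, i + queueLength));
  }
  return batches;
}

// 10 files with a queueLength of 4 → batches of 4, 4, and 2.
const files = Array.from({ length: 10 }, (_, i) => ({
  filename: `file-${i}.png`,
  url: `https://example.com/file-${i}.png`, // hypothetical URLs
}));

const batches = toBatches(files, 4);
console.log(batches.map((b) => b.length)); // [ 4, 4, 2 ]
```

Lowering queueLength means more, smaller batches: slower overall, but fewer concurrent requests and less memory in use at any one time.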

Performance

You can test it yourself by cloning the repo and running:

yarn && yarn test
NB: These tests run a local server that serves a single image; one test requests this image 10,000 times, which consumes around 1.9GB of local storage but demonstrates the library's resilience.

Contributions

Just do the thing. No guidelines yet. 🤠

Support

Buy me a coffee