
@mundanesoftware/file-uploader v1.1.7

File Uploader

A robust library for uploading large files to Cloud Storage with:

  • Chunked uploads
  • Retry mechanism with exponential backoff
  • Resumable uploads
  • Dynamic concurrency adjustment based on network conditions
  • Custom logging support

Installation

npm install @mundanesoftware/file-uploader

Initialization

You can configure the Uploader with the following options:

  • maxConcurrentUploads: The maximum number of concurrent chunk uploads.
  • destinationResolver: A function that resolves the destination URL for each file.
  • refreshSasToken: A function that returns a refreshed SAS token for secure uploads.
  • infoLogger: (Optional) Custom function to handle info logs.
  • errorLogger: (Optional) Custom function to handle error logs.

Basic Example

import Uploader from '@mundanesoftware/file-uploader';

// Initialize the uploader
const uploader = new Uploader({
    maxConcurrentUploads: 5,
    destinationResolver: async (file) => {
        const datasetName = file.dataset || 'default-dataset';
        return `https://myaccount.blob.core.windows.net/${datasetName}`;
    },
    refreshSasToken: async (fileName) => {
        const response = await fetch(`/api/refresh-sas?file=${fileName}`);
        const data = await response.json();
        return data.sasToken;
    },
    infoLogger: console.info,
    errorLogger: console.error,
});

// Handle files
const files = [
    { name: 'file1.txt', dataset: 'dataset1', size: 1024 },
    { name: 'file2.txt', dataset: 'dataset2', size: 2048 },
];

// Start the upload process
uploader.uploadFiles(files)
    .then(() => console.log('All files uploaded successfully!'))
    .catch((err) => console.error('Error uploading files:', err));

Resumable Upload Example

The uploader supports resumable uploads for both network interruptions and user-initiated pauses.

// Pause a file upload
uploader.pauseUpload('file1.txt');

// Resume the paused file upload
uploader.resumeUpload({ name: 'file1.txt', dataset: 'dataset1', size: 1024 });
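
For interruption-driven resumes, one pattern is to pause every in-flight file when the browser goes offline and resume once connectivity returns. The following is a minimal sketch that assumes a browser environment and reuses the files array from the Basic Example; the inFlight bookkeeping map is illustrative, not part of the library.

// Sketch: auto-pause on 'offline' and auto-resume on 'online'.
// Assumes a browser environment; inFlight bookkeeping is illustrative.
const inFlight = new Map(files.map((file) => [file.name, file]));

window.addEventListener('offline', () => {
    // Pause every file that is still uploading
    inFlight.forEach((_, name) => uploader.pauseUpload(name));
});

window.addEventListener('online', () => {
    // Resume each paused file with its full descriptor
    inFlight.forEach((file) => uploader.resumeUpload(file));
});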

Event Listeners

You can listen to the following events emitted by the Uploader:

  • fileStart: Fired when a file starts uploading.
  • fileProgress: Fired periodically to indicate the upload progress of a file.
  • fileComplete: Fired when a file finishes uploading.
  • chunkProgress: Fired for individual chunk upload progress.
  • error: Fired when an error occurs.

uploader.on('fileStart', (data) => console.log(`Starting upload for ${data.fileName}`));
uploader.on('fileProgress', (data) => console.log(`${data.fileName} is ${data.progress}% complete`));
uploader.on('fileComplete', (data) => console.log(`${data.fileName} completed successfully`));
uploader.on('error', (error) => console.error('Upload error:', error));
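
As a usage example, the fileProgress event can drive an overall progress indicator. The sketch below assumes progress is reported as a 0-100 percentage per file, as in the listeners above.

// Sketch: aggregate per-file progress into a single overall percentage.
const progressByFile = new Map();

uploader.on('fileProgress', (data) => {
    progressByFile.set(data.fileName, data.progress);
    const values = [...progressByFile.values()];
    const overall = values.reduce((sum, p) => sum + p, 0) / values.length;
    console.log(`Overall progress: ${overall.toFixed(1)}%`);
});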

Dynamic Concurrency Adjustment

The uploader automatically adjusts concurrency based on the user's network conditions.

  • Fast Network (4G and above): Increases concurrency for faster uploads.
  • Slow Network (3G, 2G): Reduces concurrency to prevent overload.
  • No Network Information: Defaults to maxConcurrentUploads.
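
Where the browser exposes the Network Information API, this kind of selection can be driven by navigator.connection.effectiveType. The sketch below illustrates the idea only; it is not the library's actual implementation, and the scaling factors are assumptions.

// Sketch: choose a concurrency level from the reported connection type.
// Illustrative only; the scaling factors here are assumptions.
function pickConcurrency(maxConcurrentUploads) {
    const connection = navigator.connection;
    if (!connection || !connection.effectiveType) {
        return maxConcurrentUploads; // No network information: use the configured value
    }
    switch (connection.effectiveType) {
        case '4g':
            return maxConcurrentUploads * 2; // Fast network: raise concurrency
        case '3g':
            return Math.max(1, Math.floor(maxConcurrentUploads / 2));
        default: // '2g', 'slow-2g'
            return 1; // Slow network: one chunk at a time
    }
}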

Logging

You can provide custom logging functions to integrate with external logging systems like Sentry.

const infoLogger = (message, data) => {
    // Custom log handling
    console.log(`[INFO]: ${message}`, data);
};

const errorLogger = (message, error) => {
    // Custom error handling
    console.error(`[ERROR]: ${message}`, error);
};

const uploader = new Uploader({
    maxConcurrentUploads: 3,
    destinationResolver: async (file) => `https://myaccount.blob.core.windows.net/${file.dataset}`,
    refreshSasToken: async (fileName) => 'YOUR_SAS_TOKEN',
    infoLogger,
    errorLogger,
});
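
Since Sentry is mentioned above, here is one way that integration might look. The Sentry calls are standard @sentry/node APIs, but wiring them into the loggers like this is an assumption rather than something the library does for you.

import * as Sentry from '@sentry/node'; // or '@sentry/browser' in the browser

// Sketch: forward uploader logs to Sentry (illustrative wiring).
const sentryInfoLogger = (message, data) => {
    // Record info logs as breadcrumbs attached to future events
    Sentry.addBreadcrumb({ category: 'uploader', message, data });
};

const sentryErrorLogger = (message, error) => {
    Sentry.withScope((scope) => {
        scope.setExtra('uploaderMessage', message);
        Sentry.captureException(error);
    });
};

Pass these as the infoLogger and errorLogger options when constructing the Uploader.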

Advanced Retry Mechanism

The uploader retries failed chunk uploads with exponential backoff:

  • Max Retries: 3 (default, configurable).
  • Backoff Delay: Starts at 500ms and doubles with each attempt.

import axios from 'axios';

// Internal chunk-upload helper with exponential backoff, shown here as a
// standalone function (in the library it is a class method).
async function uploadChunk(chunk, uploadUrl, index, maxRetries = 3, delay = 500) {
    const config = {
        headers: {
            'x-ms-blob-type': 'BlockBlob', // Azure Blob Storage block blob header
        },
    };

    for (let attempt = 1; attempt <= maxRetries; attempt++) {
        try {
            await axios.put(uploadUrl, chunk, config);
            return; // Exit on success
        } catch (error) {
            if (attempt === maxRetries) {
                throw error; // Give up after the final attempt
            }

            // Exponential backoff: 500ms, then 1000ms, then 2000ms, ...
            const backoff = delay * Math.pow(2, attempt - 1);
            await new Promise((resolve) => setTimeout(resolve, backoff));
        }
    }
}
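
With the defaults shown above, a failing chunk waits 500ms before its second attempt and 1000ms before its third; if the third attempt also fails, the error is thrown to the caller.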