
s3-upload-streams

v1.0.4

Published

A promise-based parallel tool for uploading Node.js streams to S3.

Downloads

53

Readme

s3-upload-streams

s3-upload-streams provides a thin wrapper on top of aws-sdk that allows multiple concurrent uploads of Node.js stream.Readable streams to S3.

Benefits

  • a standard way for manipulating data streams
  • an improved control flow using promises
  • multiple concurrent uploads
  • no prior knowledge of data size required

The uploader accepts a readable byte stream as a parameter, which enables the full power of the streaming interface, such as on-the-fly compression and encryption of data using the standard crypto and zlib modules. Flowing mode and object mode are not supported; however, piping to a PassThrough stream provides a convenient workaround. For example:

objectStream.pipe(passThroughStream)

Installation

npm i --save s3-upload-streams

Basic Usage

  const AWS = require('aws-sdk');
  const fs = require('fs');
  const Uploader = require('s3-upload-streams');

  const s3 = new AWS.S3();
  const bucket = 'my-bucket';
  const partSize = 5242880;           // minimum part size for S3 multipart uploads
  const maxConcurrentUploads = 4;

  const someFilePath = 'foo.txt';
  const s3Uploader = new Uploader(s3, bucket, partSize, maxConcurrentUploads);
  const stream = fs.createReadStream(someFilePath);
  const uploadIdPromise = s3Uploader.startUpload({ Key: 'Amazon S3 Object Key' }, stream, { originalPath: someFilePath });

  stream.on('end', () => {
    uploadIdPromise
      .then(uploadId => s3Uploader.completeUpload(uploadId))
      .then((metadata) => {
        console.log(`Uploaded ${metadata.additionalMetadata.originalPath} to ${metadata.location}`);
      })
      .catch(err => console.log(err));
  });

See more example code here

constructor

Initialize new S3Uploader.

Params

s3 - An instance of an Amazon aws-sdk S3 object.

bucket - Name of the default S3 bucket for this uploader. Can be overridden via the s3Params parameter of the startUpload method.

partSize - The size of the data chunk that is read from the stream and passed to the Amazon uploader. Use this to tune upload speed and memory usage.

concurrency - Maximum number of concurrent Amazon uploads.
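As a rough sizing sketch for those last two parameters (this assumes the uploader buffers at most one part per concurrent upload, which is a reasonable reading of the docs above, not a documented guarantee): peak buffered memory is on the order of partSize × concurrency.

```javascript
// Rough sizing sketch: assumes the uploader buffers at most one part per
// concurrent upload, so peak buffered memory ≈ partSize * concurrency.
const partSize = 5242880;   // 5 MiB, the S3 multipart minimum (see Limitations)
const concurrency = 4;

const approxPeakBytes = partSize * concurrency;
console.log(approxPeakBytes); // 20971520, i.e. about 20 MiB
```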

s3Uploader.startUpload

Initialize a new object upload in this s3Uploader instance.

Params

s3Params - An instance of createMultipartUpload params passed directly to Amazon. See the full list of parameters in Amazon's documentation.

readable - An instance of Node's Readable, Duplex or Transform stream. Be aware that the stream must be paused and must not be in object mode.

additionalMetadata - Metadata that will be passed back to the caller when the current upload is complete.

partialUploadParams - A key-value set of parameters for manipulation of partially uploaded objects not initiated by this s3Uploader. Available parameters are:

UploadId - UploadId of an already started multipartUpload

Parts - List of already uploaded parts of this object. The uploader will use this list when it completes the upload. See Advanced Usage section for more information.
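To illustrate the expected shape of partialUploadParams, here is a hypothetical sketch: existingUploadId and uploadedParts stand in for values obtained from a previous uploader's awsUploadId(id) and awsParts(id) calls (see Advanced Usage), and passing the object as the fourth argument follows the parameter order listed above. The startUpload call itself is commented out because it needs a live S3 connection.

```javascript
// Hypothetical values standing in for the output of a previous uploader's
// awsUploadId(id) and awsParts(id) calls:
const existingUploadId = 'example-aws-upload-id';
const uploadedParts = [
  { ETag: '"etag-of-part-1"', PartNumber: 1 },
  { ETag: '"etag-of-part-2"', PartNumber: 2 },
];

// Fourth argument to startUpload, matching the parameters described above:
const partialUploadParams = {
  UploadId: existingUploadId,
  Parts: uploadedParts,
};

// s3Uploader.startUpload({ Key: 'large-object' }, stream, {}, partialUploadParams)
//   .then(id => { /* upload continues from part 3 */ });
console.log(partialUploadParams.Parts.length); // 2
```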

s3Uploader.completeUpload(id)

Complete the current object upload.

Params

id - id of the current upload returned by startUpload method.

Returns

A promise that resolves to uploaded object metadata. Available information is:

location - URL of this object.

bucket - The S3 Bucket.

key - The object key in S3.

etag - Entity tag of the object.

additionalMetadata - The metadata passed in startUpload.

size - Size in bytes.

s3Uploader.abortUpload(id)

Abort the current object upload.

Params

id - id of the current upload returned by startUpload method.

Returns

A promise that resolves to abortMultipartUpload response.

s3Uploader.complateAll()

Complete all uploads started by this s3-upload-streams instance.

Returns

A promise that resolves to a list of all completed uploads metadata. See completeUpload documentation for more information.

s3Uploader.abortAll()

Abort all uploads started by this s3-upload-streams instance.

Returns

A promise that resolves to a list of all abortUpload responses.

Advanced Usage

The following methods are designed for cases in which a single S3 object is uploaded by multiple uploaders.

s3Uploader.awsUploadId(id)

Returns a promise that resolves to the aws-sdk S3 UploadId, which can be used to manipulate the S3 object directly if you have to.

Params

id - id of the current upload returned by startUpload method.

s3Uploader.awsParts(id)

Returns a list of object parts that have been uploaded through this s3Uploader. See an example usage here

Params

id - id of the current upload returned by startUpload method.

Limitations

  • S3 multipart upload does not support parts smaller than 5 MB (5242880 bytes), except for the last part of an object, so the minimum value of the partSize parameter is 5242880.
  • The library is written using ES6 class syntax, and I don't have plans to port it to ES5.