
Stream to S3

A quick script to stream large files to Amazon S3 using Node.js.

Node.js is the perfect tool for streaming large files to Amazon Web Services S3.

Use Cases

  • Rolling your own cloud backup service
  • Uploading files/photos from a mobile app or website
  • Dropbox clone using S3 as the backend

Usage

Install from NPM:

npm install stream-to-s3 --save

Then require it and stream your file:

var streamToS3 = require('stream-to-s3');
var file = '/your-file-name.jpg';       // any file format!
streamToS3(file, function (err, url) {  // url is the file's URL on S3
  if (err) throw err;
  console.log(file, 'was uploaded to S3. Visit:', url);
});

Require Environment Variables

stream-to-s3 uses environment variables for your Amazon Web Services secret keys
(to help you avoid hard-coding credentials in your code...)

If you're new to environment variables, check out our complete beginners' guide: https://github.com/dwyl/learn-environment-variables

You will need to set the following environment variables:

AWSAccessKeyId=ReplaceWithYourActualKey
AWSSecretKey=DownloadThisFromYourAWSConsole
S3BUCKET=YourS3BucketName
AWSREGION=eu-west-1
ACL=public-read
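
For a quick sanity check while you're wiring things up, you can confirm the variables are set before attempting an upload. This is an illustrative snippet, not part of the package itself:

// illustrative check, not part of stream-to-s3:
['AWSAccessKeyId', 'AWSSecretKey', 'S3BUCKET', 'AWSREGION', 'ACL'].forEach(function (key) {
  if (!process.env[key]) {
    throw new Error('Missing required environment variable: ' + key);
  }
});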

While you are developing your app, we recommend managing your environment variables using env2.
This lets you keep your AWS/S3 keys in a file that you can easily share with your co-developers while still excluding it from GitHub (by listing it in your .gitignore file).

To help you get started, we have created a sample config.env file. To use it, simply copy it to your working directory:

cp node_modules/stream-to-s3/config.env_example ./config.env && echo 'config.env' >> .gitignore

Then download your S3 keys from your AWS Console and set both keys and S3 bucket in your config.env file.

Next, load your environment variables using env2:

npm install env2 --save

require('env2')('config.env');          // load S3 keys from config.env
var streamToS3 = require('stream-to-s3');
var file = '/your-file-name.jpg';       // any file format!
streamToS3(file, function (err, url) {  // standard error-first callback
  if (err) throw err;
  console.log(file, 'was uploaded. Visit:', url);
});

tl;dr

Note: we have deliberately kept the stream uploader simple. If you need to transform the data in the read stream before uploading it, fork this repo, add a new test/method, and submit a PR.

The Solution

See: index.js for the implementation details.
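
If you're curious how a streaming upload can work at all given the file-size objection (see Background, below), here is a minimal sketch using the aws-sdk's managed uploader, which performs a multipart upload under the hood and so accepts a stream of unknown length. It reads the same environment variables listed above, but it is an illustration, not the package's actual index.js:

var fs  = require('fs');
var AWS = require('aws-sdk');

// Illustrative sketch, not the actual index.js of stream-to-s3.
function streamFileToS3(filePath, callback) {
  var s3 = new AWS.S3({
    accessKeyId:     process.env.AWSAccessKeyId,
    secretAccessKey: process.env.AWSSecretKey,
    region:          process.env.AWSREGION
  });
  s3.upload({                               // managed multipart upload
    Bucket: process.env.S3BUCKET,
    Key:    filePath.split('/').pop(),      // use the file name as the S3 key
    Body:   fs.createReadStream(filePath),  // a stream, so size need not be known up front
    ACL:    process.env.ACL
  }, function (err, data) {
    callback(err, data && data.Location);   // data.Location is the file's URL
  });
}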

Useful Links

Node.js Docs for Readable Streams:

  • http://nodejs.org/api/stream.html#stream_class_stream_readable

Best place to learn Node.js Streams:

  • SubStack's Streams Intro: github.com/substack/stream-handbook
  • Max's stream intro: http://maxogden.com/node-streams.html
  • http://ejohn.org/blog/node-js-stream-playground
  • http://codewinds.com/blog/2013-08-04-nodejs-readable-streams.html
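
If streams are new to you, the core idea is small enough to see in a few lines: a readable stream emits a file as a series of chunks instead of one big buffer. A minimal, illustrative example:

var fs = require('fs');
var stream = fs.createReadStream('/your-file-name.jpg'); // any file
stream.on('data', function (chunk) {
  console.log('read', chunk.length, 'bytes');            // one chunk at a time
});
stream.on('end', function () {
  console.log('finished reading the file');
});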

Background

"It Cannot Be Done" (Challenge Accepted!)

According to this StackOverflow Q/A, stream-uploading large files to S3 is "not possible" because S3 requires the file size up front (i.e. before you can upload a file you need to tell S3 its size, and since we are streaming the file in chunks we don't know its size beforehand... or so the answer suggests). In fact, S3's multipart upload API works around this by accepting the file in independently sized parts, so the total size never needs to be known in advance.

To update

  • [ ] http://stackoverflow.com/questions/25156716/how-to-apply-async-on-for-loop-of-range
  • [ ] http://stackoverflow.com/questions/17309559/stream-uploading-file-to-s3-on-node-js-using-formidable-and-knox-or-aws-sdk (requires using the HTML5 File API to get the file size/MIME type client-side...)