
@imrandil/aws-s3-stream-uploader v1.0.0

A package to upload data streams to AWS S3 and generate signed URLs.

AWS S3 Stream Uploader

Overview

This package provides a convenient way to upload data streams, such as files, to Amazon S3 (Simple Storage Service) and generate signed URLs for accessing the uploaded content. It's particularly useful for scenarios where you need to upload large files or streams to S3 and share them securely with others.

Installation

You can install this package via npm:

npm install @imrandil/aws-s3-stream-uploader

Usage

To use this package, follow these steps (a complete end-to-end example appears after the list):

  1. Import the package:
const { uploadStreamToS3, getObjectURL, getURLAfterUploadToS3 } = require('@imrandil/aws-s3-stream-uploader');
  2. Configure your AWS S3 credentials:
const s3Config = {
    region: 'your-region',
    credentials: {
        accessKeyId: 'your-access-key-id',
        secretAccessKey: 'your-secret-access-key'
    }
};
  3. Upload a stream to S3:
const stream = require('fs').createReadStream('./your-file.pdf'); // any readable stream works
const bucket = 'your-bucket-name';
const key = 'your-file-key';
const expiresIn = 900; // Signed URL expiry in seconds (used in steps 4 and 5)
const contentType = 'application/pdf'; // MIME type of the content being uploaded
try {
    await uploadStreamToS3(s3Config, bucket, key, stream, contentType);
    console.log('Stream uploaded successfully');
} catch (error) {
    console.error('Error uploading stream to S3:', error);
}
  4. Generate a signed URL for the uploaded content:
const signedURL = await getObjectURL(s3Config, bucket, key, expiresIn);
console.log('Signed URL:', signedURL);
  5. Or, combine both steps to get the signed URL after uploading:
try {
    const contentType = 'application/pdf'; // MIME type of the content being uploaded
    const { success, signedURL } = await getURLAfterUploadToS3(s3Config, bucket, key, stream, expiresIn, contentType);
    console.log('Signed URL:', signedURL);
} catch (error) {
    console.error('Error in getURLAfterUploadToS3:', error);
}
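
Putting the steps together, here is a minimal runnable sketch. The file path ./your-file.pdf, the bucket name, and the key are placeholders; it wraps the documented getURLAfterUploadToS3 call in an async function so the awaits work in plain CommonJS.

const fs = require('fs');
const { getURLAfterUploadToS3 } = require('@imrandil/aws-s3-stream-uploader');

const s3Config = {
    region: 'your-region',
    credentials: {
        accessKeyId: 'your-access-key-id',
        secretAccessKey: 'your-secret-access-key'
    }
};

async function shareFile() {
    // Stream the file from disk instead of buffering it in memory
    const stream = fs.createReadStream('./your-file.pdf');
    const { success, signedURL } = await getURLAfterUploadToS3(
        s3Config,
        'your-bucket-name',
        'your-file-key',
        stream,
        900,                  // signed URL expires in 15 minutes
        'application/pdf'
    );
    if (success) {
        console.log('Share this link:', signedURL);
    }
}

shareFile().catch((error) => console.error('Upload failed:', error));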

API Reference

uploadStreamToS3(s3Config, bucket, key, stream, contentType)

Uploads a stream to the specified S3 bucket.

  • s3Config: The S3 configuration object containing region and credentials.
  • bucket: The name of the S3 bucket.
  • key: The key (path) under which to store the data in the bucket.
  • stream: The data stream to upload.
  • contentType: The MIME type of the content being uploaded.
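
For example, a minimal call sketch (the file path and key are placeholders, and s3Config is assumed to be defined as in the Usage section):

const fs = require('fs');

await uploadStreamToS3(
    s3Config,
    'your-bucket-name',
    'docs/your-file.pdf',
    fs.createReadStream('./your-file.pdf'),
    'application/pdf'
);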

getObjectURL(s3Config, bucket, key, expiresIn)

Generates a signed URL for accessing the specified object in the S3 bucket.

  • s3Config: The S3 configuration object containing region and credentials.
  • bucket: The name of the S3 bucket.
  • key: The key (path) of the object in the bucket.
  • expiresIn: The expiry time for the signed URL in seconds.
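
For example, to generate a one-hour link to an object that is already in the bucket (the key is a placeholder, and s3Config is assumed to be defined as in the Usage section):

const signedURL = await getObjectURL(s3Config, 'your-bucket-name', 'docs/your-file.pdf', 3600);
console.log('Signed URL:', signedURL);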

getURLAfterUploadToS3(s3Config, bucket, key, stream, expiresIn, contentType)

Uploads a stream to S3 and generates a signed URL for accessing the uploaded content.

  • s3Config: The S3 configuration object containing region and credentials.
  • bucket: The name of the S3 bucket.
  • key: The key (path) under which to store the data in the bucket.
  • stream: The data stream to upload.
  • expiresIn: The expiry time for the signed URL in seconds.
  • contentType: The MIME type of the content being uploaded.
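
As an illustrative sketch of streaming something other than a local file, the incoming request in an Express route is itself a readable stream and can be passed straight through. Express, the route path, the key prefix, and the bucket name are assumptions for this sketch, not part of this package:

const express = require('express');
const { getURLAfterUploadToS3 } = require('@imrandil/aws-s3-stream-uploader');

const s3Config = {
    region: 'your-region',
    credentials: {
        accessKeyId: 'your-access-key-id',
        secretAccessKey: 'your-secret-access-key'
    }
};

const app = express();

// Pipe the raw request body straight to S3 and respond with a share link
app.post('/upload/:name', async (req, res) => {
    try {
        const { success, signedURL } = await getURLAfterUploadToS3(
            s3Config,
            'your-bucket-name',
            `uploads/${req.params.name}`,
            req, // the request object is a readable stream
            900,
            req.headers['content-type'] || 'application/octet-stream'
        );
        res.json({ success, signedURL });
    } catch (error) {
        console.error('Error in getURLAfterUploadToS3:', error);
        res.status(500).json({ error: 'Upload failed' });
    }
});

app.listen(3000);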

Notes

  • This package leverages the official AWS SDK for JavaScript to interact with S3.
  • Ensure that your AWS credentials have sufficient permissions to perform the necessary operations on S3 (a minimal policy sketch follows).
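
For reference, a minimal IAM policy sketch covering the upload (PutObject) and signed-URL read (GetObject) operations; the bucket name is a placeholder and your setup may require additional permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::your-bucket-name/*"
        }
    ]
}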

Feedback

If you have any feedback or suggestions for improvements, feel free to open an issue on the GitHub repository. Your input is highly appreciated!

License

This package is licensed under the MIT License. See the LICENSE file for details.