@fnet/files-to-aws-s3

v0.1.6 · Published

Downloads: 12

Readme

@fnet/files-to-aws-s3

This project provides a simple and straightforward utility for uploading files from a local directory to an Amazon S3 bucket. It is designed to facilitate the transfer of files that match a specific pattern, helping users manage and store their data in the cloud with ease.

How It Works

The utility scans a specified directory for files that match a given pattern and uploads them to a designated AWS S3 bucket. You can choose the destination directory within the bucket and optionally run a "dry run" to preview the uploads without executing them. Authentication and bucket targeting are handled through standard AWS credentials and region configuration.
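As a rough sketch of that flow (with placeholder bucket, directory, and pattern values), the snippet below first previews the matching files with dryRun enabled and then performs the actual upload; the option names correspond to the input schema documented later in this README.

import uploadFilesToS3 from '@fnet/files-to-aws-s3';

(async () => {
  // Preview which files would be uploaded (no data is transferred).
  const preview = await uploadFilesToS3({
    bucket: 'my-bucket',        // placeholder bucket name
    dir: './reports',           // local directory to scan
    pattern: '**/*.csv',        // files to match
    destDir: 'backups/reports', // destination prefix inside the bucket
    region: 'us-east-1',
    dryRun: true
  });
  console.log('Would upload:', preview.files);

  // Run again with dryRun disabled to perform the real upload.
  await uploadFilesToS3({
    bucket: 'my-bucket',
    dir: './reports',
    pattern: '**/*.csv',
    destDir: 'backups/reports',
    region: 'us-east-1'
  });
})();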

Key Features

  • Pattern-Based File Selection: Specify a pattern to select files within a directory for upload.
  • Customizable Destination: Choose a destination directory within your S3 bucket.
  • Metadata Management: Attach custom metadata to each file being uploaded.
  • Dry Run Mode: Preview which files would be uploaded without executing the upload.
  • Verbose Output Option: Enable verbose mode to log details of the upload process.

Conclusion

This utility offers a simple, practical solution for transferring files to AWS S3, streamlining the process with pattern matching, configurable destinations, and standard AWS credential handling.

Developer Guide for @fnet/files-to-aws-s3

Overview

The @fnet/files-to-aws-s3 library is designed to simplify the process of uploading files from your local system to an Amazon S3 bucket. By utilizing this library, developers can automate file uploads based on specified patterns, manage file destinations within S3, and define metadata for each uploaded file. The library is particularly useful for scenarios where files need to be programmatically transferred to S3, making it easier to handle batch uploads in a consistent and efficient manner.

Installation

You can install the library using either npm or yarn. Here are the installation commands:

# Using npm
npm install @fnet/files-to-aws-s3

# Using yarn
yarn add @fnet/files-to-aws-s3

Usage

To use the library, ensure that your AWS credentials (e.g., AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION) are properly set in your environment. Here’s a practical example of how to use this library to upload files to an S3 bucket:

import uploadFilesToS3 from '@fnet/files-to-aws-s3';

(async () => {
  try {
    const result = await uploadFilesToS3({
      bucket: 'your-s3-bucket-name',
      dir: '/path/to/local/files',
      pattern: '**/*.txt', // pattern to match files
      destDir: '/desired/destination/on/s3',
      region: 'us-east-1',
      verbose: true // enable to log each upload
    });

    console.log('Upload results:', result);
  } catch (error) {
    console.error('Error uploading files:', error);
  }
})();

Examples

Here are some typical use cases for this library:

Example 1: Basic File Upload

import uploadFilesToS3 from '@fnet/files-to-aws-s3';

uploadFilesToS3({
  bucket: 'example-bucket',
  pattern: '*.jpg',
  region: 'us-west-2'
}).then(result => {
  console.log('Files uploaded:', result.files);
}).catch(error => {
  console.error('Error:', error);
});

Example 2: Upload with Metadata and Dry Run

import uploadFilesToS3 from '@fnet/files-to-aws-s3';

uploadFilesToS3({
  bucket: 'example-bucket',
  pattern: '*.png',
  metadata: { author: 'your-name' },
  dryRun: true, // simulate upload without actually uploading
  region: 'eu-central-1'
}).then(result => {
  console.log('Simulated upload result:', result.files);
}).catch(error => {
  console.error('Error:', error);
});

Acknowledgement

This library leverages @aws-sdk/client-s3 for S3 interactions and @fnet/list-files for file pattern matching, combining the two to provide a streamlined upload workflow.

By following this guide, you should be equipped to use @fnet/files-to-aws-s3 to handle your uploads to AWS S3 efficiently.

Input Schema

$schema: https://json-schema.org/draft/2020-12/schema
type: object
properties:
  bucket:
    type: string
    description: The name of the S3 bucket to which files will be uploaded.
  dir:
    type: string
    description: The directory from where files will be listed.
    default: current working directory
  pattern:
    type: string
    description: The pattern used to match files for uploading.
  destDir:
    type: string
    description: The S3 directory path where files will be uploaded.
    default: /
  dryRun:
    type: boolean
    description: If true, no files will actually be uploaded.
    default: false
  metadata:
    type: object
    additionalProperties: true
    description: Additional metadata to include with each S3 object.
  region:
    type: string
    description: The AWS region where the S3 bucket is located.
  verbose:
    type: boolean
    description: If true, logs upload information.
    default: false
required:
  - bucket
  - pattern
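
For reference, the call below maps one option onto each field in the schema above. Only bucket and pattern are required; all values shown are placeholders, and the defaults noted in the comments mirror the schema.

import uploadFilesToS3 from '@fnet/files-to-aws-s3';

(async () => {
  const result = await uploadFilesToS3({
    bucket: 'example-bucket',            // required: target S3 bucket
    pattern: '**/*.log',                 // required: files to match
    dir: './logs',                       // default: current working directory
    destDir: 'archive/logs',             // default: /
    dryRun: false,                       // default: false
    metadata: { source: 'nightly-job' }, // attached to each uploaded object
    region: 'us-east-1',
    verbose: true                        // default: false
  });

  console.log('Uploaded:', result.files);
})();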