abcdef-rishant
v1.2.1 · Published
Uploads compiled assets to s3 after build
Downloads: 2

Readme

S3 Plugin


This plugin will upload all built assets to S3.

Install Instructions

$ npm i webpack-s3-plugin

Note: This plugin requires Node.js > 0.12.0

Usage Instructions

A lot of people set the directory option even though the files they want to upload come from their build. Please don't set directory if you're uploading your build output: when directory is set, the plugin reads files from that directory after compilation instead of uploading the assets emitted by the build process.
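As a rough sketch of the difference (the 'public' path is just a placeholder, and the credentials/bucket names follow the examples below):

var config = {
  plugins: [
    // No directory option: the compiled js/css from this build get uploaded
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
      // directory: 'public'  // only set this to upload a folder of files
      //                      // that already exist on disk
    })
  ]
}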

You can also use a credentials file from AWS. To set the profile set your s3 options to the following:

s3Options: {
  credentials: new AWS.SharedIniFileCredentials({profile: 'PROFILE_NAME'})
}

s3UploadOptions defaults to ACL: 'public-read', so you may need to override it if you have other needs. See #28
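A minimal sketch of overriding that default (ACL is the standard S3 putObject parameter; 'private' is just one possible value):

var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
      },
      s3UploadOptions: {
        Bucket: 'MyBucket',
        ACL: 'private' // overrides the 'public-read' default
      }
    })
  ]
}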

Require webpack-s3-plugin
var S3Plugin = require('webpack-s3-plugin')
With exclude
var config = {
  plugins: [
    new S3Plugin({
      // Exclude uploading of html
      exclude: /.*\.html$/,
      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      cdnizerOptions: {
        defaultCDNBase: 'http://asdf.ca'
      }
    })
  ]
}
With include
var config = {
  plugins: [
    new S3Plugin({
      // Only upload css and js
      include: /.*\.(css|js)/,
      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
Advanced include and exclude rules

include and exclude rules behave similarly to Webpack's loader options. In addition to a RegExp you can pass a function which will be called with the path as its first argument. Returning a truthy value will match the rule. You can also pass an Array of rules, all of which must pass for the file to be included or excluded.

import isGitIgnored from 'is-gitignored'

// Up to you how to handle this
var isPathOkToUpload = function(path) {
  return require('my-projects-publishing-rules').checkFile(path)
}

var config = {
  plugins: [
    new S3Plugin({
      // Only upload css and js and only the paths that our rules database allows
      include: [
        /.*\.(css|js)/,
        function(path) { return isPathOkToUpload(path) }
      ],

      // function to check if the path is gitignored
      exclude: isGitIgnored,

      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
With basePathTransform
import gitsha from 'gitsha'

var addSha = function() {
  return new Promise(function(resolve, reject) {
    gitsha(__dirname, function(error, output) {
      if(error)
        reject(error)
      else
       // resolve to first 5 characters of sha
       resolve(output.slice(0, 5))
    })
  })
}

var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      basePathTransform: addSha
    })
  ]
}


// Will output to /${mySha}/${fileName}
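basePathTransform doesn't have to return a promise; per the options below it can also return a plain string. A minimal synchronous sketch (the folder name here is made up):

var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      // Returning a string directly instead of a promise
      basePathTransform: function() {
        return 'release-' + Date.now()
      }
    })
  ]
}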
With CloudFront invalidation
var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        sessionToken: 'a234jasd'  // (optional) AWS session token for signing requests
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      cloudfrontInvalidateOptions: {
        DistributionId: process.env.CLOUDFRONT_DISTRIBUTION_ID,
        Items: ["/*"]
      }
    })
  ]
}
With Dynamic Upload Options
var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket',
        ContentEncoding(fileName) {
          if (/\.gz/.test(fileName))
            return 'gzip'
        },

        ContentType(fileName) {
          if (/\.js/.test(fileName))
            return 'application/javascript'
          else
            return 'text/plain'
        }
      }
    })
  ]
}

Options

  • exclude: A Pattern to match for excluded content. Behaves similarly to webpack's loader configuration.
  • include: A Pattern to match for included content. Behaves the same as exclude.
  • s3Options: Options passed to the AWS S3 client (accessKeyId, secretAccessKey, region, etc.)
  • s3UploadOptions: Upload options passed to putObject (Bucket, ACL, ContentType, etc.)
  • basePath: Provide the namespace of uploaded files on S3
  • directory: Provide a directory to upload (if not supplied, will upload js/css from compilation)
  • htmlFiles: HTML files to cdnize (defaults to all in the output directory)
  • cdnizerCss: Config for the CSS cdnizer (see below)
  • noCdnizer: Disable cdnizer (defaults to true if no cdnizerOptions passed)
  • cdnizerOptions: options to pass to cdnizer
  • basePathTransform: transform the base path to add a folder name. Can return a promise or a string
  • progress: Enable progress bar (defaults to true)
  • priority: Priority order for your files, as an array of regexes. Files not matched by any regex are uploaded first. This is useful for avoiding s3 eventual-consistency issues (see the sketch after this list).
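
As a rough sketch of how a few of these options combine (the basePath value and the priority regex are just placeholders):

var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      basePath: 'assets',   // placeholder namespace on S3
      progress: false,      // turn off the progress bar
      // html is matched, so it uploads after everything else,
      // which helps with s3 eventual consistency
      priority: [/\.html$/]
    })
  ]
}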

Contributing

All contributions are welcome. Please make a pull request and make sure things still pass after running npm run test. For tests you will need to either have the environment variables set or set up a .env file. There's a .env.sample, so you can cp .env.sample .env and fill it in. Make sure to add any new environment variables you introduce.

Commands to be aware of

WARNING: The test suite generates random files for certain checks. Ensure you delete any leftover files from your bucket.
  • npm run test - Run the test suite (you must have the .env file set up)
  • npm run build - Run build

Thanks

  • Thanks to @Omer for fixing credentials from ~/.aws/credentials
  • Thanks to @lostjimmy for pointing out path.sep for Windows compatibility