
express-gcs-uploader

v1.0.15


Downloads

7

Readme

Express GCS Uploader

This is an Express upload plugin that automatically copies uploaded data to Google Cloud Storage (GCS). It also offers a choice of download strategies:

  • Download from local storage when the file is already there
  • Download directly from GCS
  • Download from GCS and cache the file locally


Note: uploads are handled by multer under the hood; if you want to pass additional options, refer to multer's documentation.

Installation

npm install express-gcs-uploader --save

Setup

Step 1: Set up authentication and related configuration

var gcsuplder = require('express-gcs-uploader');
gcsuplder.auth({
  rootdir: __dirname,
  upload_url: '/uploads',
  download_url: '/download',
  tmpFolder: '/tmp',
  cdn_url: 'http://your.bucket.com.tw', // optional: for GCS public-read buckets or a CDN in front
  keep_filename: true, // optional: keep the original file name in the remote object
  cache: true, // optional: write a local copy every time a file is read from GCS
  bucket: 'your.bucket.com.tw',
  projectId: 'your-project-id',
  keyFilename: '/path/to/your/key.json'
});

The configuration options are described below:

  • rootdir (string): The project root path.
  • upload_url (string): The upload route path.
  • download_url (string): The download route path.
  • tmpFolder (string): The temporary folder used to store objects locally.
  • cdn_url (string): The cache URL, such as a CDN path or your Cloud Storage website bucket path.
  • keep_filename (boolean): Set to true to keep the original filename in the remote object instead of a random one.
  • cache (boolean): Set to true to cache data locally whenever it is read from GCS.
  • bucket (string): The Cloud Storage bucket that stores your data.
  • projectId (string): The project ID under which your JSON key was created.
  • keyFilename (string): The path to a service-account JSON key created in the GCP console.

Step 2: Set up the download route

app.use('/downloads/:id', gcsuplder.downloadproxy);

In this case, a URL such as http://localhost:3000/downloads/e13b6a98a0d50b5990123a83eb87f2a8.png will be handled by the proxy, and ":id" is the filename returned by the upload.

Cloud Storage Default ACL Setting

If you want to use "cdn_url" so that a Cloud Storage website bucket serves as your CDN path, set a default ACL on the bucket so that uploaded objects are granted read permission. (For website buckets, see the documentation: https://cloud.google.com/storage/docs/website-configuration)

gsutil defacl ch -u allUsers:R gs://your.bucket.com.tw

Test the upload

Create a route to receive uploads:

router.post('/uploadtest', function(req, res, next) {
  res.end('done...');
});

Upload using curl:

curl -F "image=@/Users/peihsinsu/Pictures/pic2.png" http://localhost:3000/uploadtest -X POST

Upload using curl with subfolder:

curl -F "image=@/Users/peihsinsu/Pictures/pic2.png" http://localhost:3000/uploadtest -X POST -H 'subfolder:myfolder'

Upload using upload form:

<form method="post" action="/uploadtest" name="submit" enctype="multipart/form-data">
  <input type="file" name="fileField"><br /><br />
  <input type="submit" name="submit" value="Submit">
</form>