
node_storage_manager

v1.3.2


Node - Storage Pipe Manager


Node.js idiomatic client for cloud storage.

Node - Storage Pipe Manager allows world-wide storage and retrieval of any amount of data at any time. You can use Google Cloud Storage or an AWS S3 bucket for a range of scenarios, including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download. Storage Pipe Manager is a pipe factory that lets you easily switch between Google Cloud, AWS S3, Cloudinary, and FS without breaking anything or needing any extra configuration.


Quickstart

Before you begin

Make sure to define your credentials as environment variables for AWS, Google, and FS in your .zshrc or .bashrc file.

Google Bucket Declaration

export GOOGLE_APPLICATION_CREDENTIALS=/Users/nitrocode/comics-eagle-39849343-837483483.json

AWS S3 Declaration

export AWS_ACCESS_KEY_ID=284893748923yuwfhsdkfjshkfjh
export AWS_SECRET_ACCESS_KEY=982u289432u48jsdfkjsr3894
export AWS_SESSION_TOKEN=<session-token>  # optional

Cloudinary Declaration

export CLOUDINARY_URL=cloudinary://4737858435783453:3827489jksdhfjasfhjBB@nitrocode

Digital Ocean Spaces

export DG_ACCESS_KEY=284893748923yuwfhsdkfjshkfjh
export DG_SECRET_KEY=982u289432u48jsdfkjsr3894

Local NFS Declaration

export MOUNT_POINT=/Users/nitrocode/bucket/

It is advisable to declare all of these at once, for easy switching between clients.

Installing the client library

npm i node_storage_manager

Using the client library on GCLOUD

node_storage_manager allows you to switch between clients easily, without reconfiguration.

  // Imports the node_storage_manager library
  const Storage = require('node_storage_manager');

  // Set the storage instance to GCLOUD (switch between AWS, GCLOUD and FS here)
  let StorageInstance = Storage.getInstance('GCLOUD');

  /**
   * TODO(developer): Uncomment these variables before running the sample.
   */
  // let bucketName = 'bucket-name';

  async function download(bucketName) {
    // Downloads 'file' from the bucket to 'destination'
    await StorageInstance.download(bucketName, 'file', 'destination');
    console.log(`file downloaded`);
  }

Using the client library on AWS

  // Imports the node_storage_manager library
  const Storage = require('node_storage_manager');

  // Set the storage instance to AWS (switch between AWS, GCLOUD and FS here)
  let StorageInstance = Storage.getInstance('AWS');

  /**
   * TODO(developer): Uncomment these variables before running the sample.
   */
  // let bucketName = 'bucket-name';

  async function download(bucketName) {
    // Downloads 'file' from the bucket to 'destination'
    await StorageInstance.download(bucketName, 'file', 'destination');
    console.log(`file downloaded`);
  }

Using the client library on CLOUDINARY

  // Imports the node_storage_manager library
  const Storage = require('node_storage_manager');

  // Set the storage instance to CLOUDINARY
  let StorageInstance = Storage.getInstance('CLOUDINARY');

  /**
   * TODO(developer): Uncomment these variables before running the sample.
   */
  // let bucketName = 'bucket-name';

  async function upload(bucketName) {
    // Uploads 'filepath' to the bucket; the third argument is the file type
    let result = await StorageInstance.upload(bucketName, 'filepath', 'image or video');
    console.log(result);
    // This way you can get all the data returned from the Cloudinary client, e.g. result.url, etc.
  }

Using the client library on DigitalOcean Spaces

  // Imports the node_storage_manager library
  const Storage = require('node_storage_manager');

  // Set the storage instance to DG (DigitalOcean Spaces); the second argument is the region
  let StorageInstance = Storage.getInstance('DG', 'Region e.g Asia');

  /**
   * TODO(developer): Uncomment these variables before running the sample.
   */
  // let bucketName = 'bucket-name';

  async function upload(bucketName) {
    // Uploads 'filepath' to the Space
    let result = await StorageInstance.upload(bucketName, 'filepath', 'image or video');
    console.log(result);
    // This way you can get all the data returned from the Spaces client, e.g. result.url, etc.
  }

Using the client library on NFS

  // Imports the node_storage_manager library
  const Storage = require('node_storage_manager');

  // Set the storage instance to NFS (local file system, using MOUNT_POINT)
  let StorageInstance = Storage.getInstance('NFS');

  /**
   * TODO(developer): Uncomment these variables before running the sample.
   */
  // let bucketName = 'bucket-name';

  async function download(bucketName) {
    // Downloads 'file to download' to the destination, e.g. /Users/nitrocode/tmp/
    await StorageInstance.download(bucketName, 'file to download', 'destination e.g /Users/nitrocode/tmp/');
    console.log(`file downloaded`);
  }


API Documentation

StorageInstance functions

This contains a reference to the underlying storage-pipe module. It is a valid use case to use both this module and all of its functions.

Note: to specify a region on S3 and DigitalOcean Spaces, you need to pass it as a parameter to getInstance:

  // Imports the node_storage_manager library
  const Storage = require('node_storage_manager');

  // Pass the region as the second argument (works for 'AWS' or 'DG')
  let StorageInstance = Storage.getInstance('AWS', 'Asia'); // or Storage.getInstance('DG', 'Asia')
  StorageInstance.upload();

StorageInstance.download()

Downloads a file from S3, GCLOUD & NFS using the storage pipe (example below).

parameters:

  • bucketName - required, the S3 bucket name to download files from.
  • filename - required, the file to download from the bucket.
  • destination - required, where to put the file once it has downloaded.
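
As a minimal sketch of a download call, assuming the signature above (the bucket, file, and destination names are placeholders):

  // Download report.pdf from my-bucket into ./downloads/ (placeholder names)
  const Storage = require('node_storage_manager');
  const StorageInstance = Storage.getInstance('GCLOUD');

  StorageInstance.download('my-bucket', 'report.pdf', './downloads/')
    .then(() => console.log('file downloaded'))
    .catch(console.error);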

StorageInstance.upload()

Uploads a file to S3, GCLOUD & NFS using the storage pipe (example below).

parameters:

  • bucketName - required, the S3 bucket name to upload files to.
  • filename - required, the file to upload to the bucket.
  • destination - optional, renames the file during upload, i.e. if bob.jpg is being uploaded and destination is set, the destination value is used to rename the file.

parameters required if on CLOUDINARY instance:

  • bucketName - required, the S3 bucket name to upload files to.
  • filename - required, the file to upload to the bucket.
  • fileType - required, the type of file to upload, e.g. image, video.
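
A hedged sketch of both upload variants described above, with placeholder bucket and file names:

  const Storage = require('node_storage_manager');

  async function uploadExamples() {
    // S3 / GCLOUD / NFS: the optional third argument renames the file on upload
    const s3 = Storage.getInstance('AWS');
    await s3.upload('my-bucket', 'bob.jpg', 'profile.jpg'); // stored as profile.jpg

    // CLOUDINARY: the third argument is the required file type instead
    const cloudinary = Storage.getInstance('CLOUDINARY');
    const result = await cloudinary.upload('my-bucket', 'bob.jpg', 'image');
    console.log(result); // response data from the Cloudinary client, e.g. result.url
  }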

StorageInstance.createBucket()

Creates a bucket in S3, GCLOUD & NFS using the storage pipe (example below).

parameters required if on S3 instance:

  • bucketName - required, the bucket name to create.
  • ACL - required, defines which AWS accounts or groups are granted access and the type of access, e.g. public-read.

parameters required if on GCLOUD instance:

  • bucketName - required, the bucket name to create.
  • location - required, defines a specific region, e.g. ASIA.
  • storageClass - optional, e.g. coldline; leave it blank for the default storage class.

parameters required if on NFS instance:

  • bucketName - required, the bucket name to create.
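
To illustrate the per-client differences, a sketch assuming the parameters above (bucket names are placeholders):

  const Storage = require('node_storage_manager');

  async function createBuckets() {
    // S3: the second argument is the ACL
    await Storage.getInstance('AWS').createBucket('my-bucket', 'public-read');

    // GCLOUD: location is required, storageClass is optional
    await Storage.getInstance('GCLOUD').createBucket('my-bucket', 'ASIA', 'coldline');

    // NFS: only the bucket name is needed
    await Storage.getInstance('NFS').createBucket('my-bucket');
  }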

StorageInstance.deleteBucket()

Deletes a bucket in S3, GCLOUD & NFS using the storage pipe.

parameters required:

  • bucketName - required, the bucket name to delete.

StorageInstance.listBuckets()

Lists buckets in S3, GCLOUD & NFS using the storage pipe.

parameters required:

  • None - no parameters required.

StorageInstance.listFiles()

Lists the files in a bucket on S3, GCLOUD & NFS using the storage pipe.

parameters required:

  • bucketName - required, the bucket name to list files from.

StorageInstance.deleteFile()

Deletes a file in a bucket on S3, GCLOUD & NFS using the storage pipe (combined example below).

parameters required:

  • bucketName - required, the bucket name to delete the file from.
  • filename - required, the filename to delete.
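
A combined sketch of the listing and deletion helpers above, assuming each call resolves with the listed data (all names are placeholders):

  const Storage = require('node_storage_manager');
  const StorageInstance = Storage.getInstance('AWS');

  async function cleanUp() {
    const buckets = await StorageInstance.listBuckets();        // no parameters
    const files = await StorageInstance.listFiles('my-bucket'); // files in the bucket
    console.log(buckets, files);

    await StorageInstance.deleteFile('my-bucket', 'old-report.pdf');
    await StorageInstance.deleteBucket('my-bucket');
  }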

StorageInstance.getBucketMetadata()

Gets bucket metadata from GCLOUD Storage. Note: this is applicable to the GCLOUD instance only (example below).

parameters required:

  • bucketName - required, the bucket name whose metadata to fetch.
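
A brief sketch (GCLOUD instance only; the bucket name is a placeholder, and the metadata shape is whatever the underlying GCLOUD client returns):

  const Storage = require('node_storage_manager');
  const StorageInstance = Storage.getInstance('GCLOUD');

  StorageInstance.getBucketMetadata('my-bucket')
    .then((metadata) => console.log(metadata))
    .catch(console.error);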

Versioning

This library follows Semantic Versioning.

This library is considered to be General Availability (GA). This means it is stable; the code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against GA libraries are addressed with the highest priority.

Contributing

Contributions welcome! See the Contributing Guide.

License

Apache Version 2.0

See LICENSE