
smcloudstore v0.2.1: Multi-cloud object storage module, for AWS, Azure, Google Cloud, B2 and more

SMCloudStore

SMCloudStore is a lightweight Node.js module that offers a simple API to interact with the object storage services of multiple cloud providers, including AWS S3, Azure Blob Storage, Backblaze B2, Google Cloud Storage, and Minio.

Features

  • Simple, unified API to interact with all object storage providers
  • Lightweight and flexible: each provider is published as a separate package, so installing SMCloudStore won't add SDKs for each vendor and thousands of dependencies to your projects
  • Optimized for working with streams when putting/retrieving objects

This code is licensed under the terms of the MIT license (see LICENSE.md).

SMCloudStore vs other packages

SMCloudStore is specifically focused on abstracting the differences between multiple object storage providers, as storage is the most commonly consumed API from cloud providers that isn't fully standardized. It doesn't aim to support other services that cloud providers might offer, such as databases, VMs, etc.

  • By focusing only on storage, we can keep things simple. There are few APIs and they are clear and easy to use.
  • SMCloudStore is highly modular, and every provider comes with a separate NPM package. This means that you don't need to install the SDKs for every single cloud provider and all their dependencies if you plan to use only one or two of them.
  • We are focusing on a modern development experience. The codebase is primarily written in TypeScript and transpiled to JavaScript, with all the typings published. We are also adopting a stream-centric approach, with methods that upload and download objects from storage providers as streams.

Add to your project

To start, install the smcloudstore package:

npm install --save smcloudstore

SMCloudStore requires Node.js version 8.9.1 or higher. Some providers might require a newer version (currently, Backblaze B2 requires Node.js 10 or higher).
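If you want to fail fast on an unsupported runtime, a minimal startup check might look like the sketch below (this is not something SMCloudStore provides; the 8.9.1 minimum comes from the requirement above, and you would raise it for providers that need a newer Node.js):

```javascript
// Parse the running Node.js version, e.g. "18.19.0" -> [18, 19, 0]
const [major, minor, patch] = process.versions.node.split('.').map(Number)

// SMCloudStore requires Node.js 8.9.1 or higher
const supported =
    major > 8 ||
    (major === 8 && (minor > 9 || (minor === 9 && patch >= 1)))

if (!supported) {
    throw new Error('SMCloudStore requires Node.js 8.9.1 or higher')
}
```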

Modules for each cloud provider are available as separate packages, so you can choose which ones to include. You need to install at least one provider package to use SMCloudStore; see the README for each provider in the packages/ folder.

Nomenclature

Each cloud provider uses different names for the same concepts. In SMCloudStore, we're standardizing on the following nomenclature:

| SMCloudStore | AWS | Azure | Backblaze | Google Cloud | Minio |
| --- | --- | --- | --- | --- | --- |
| Object | Object | Blob | File | Object | Object |
| Container | Bucket | Container | Bucket | Bucket | Bucket |

API Guide

Full API documentation is available on this project's GitHub page and in the /docs folder.

Each cloud storage provider is implemented in a class defined in one of the modules above. All providers inherit from the StorageProvider abstract class, which is in the @smcloudstore/core package.

All asynchronous methods in the examples below return Promises, which can be consumed as then-ables or with async/await (ES2017). The examples below use async/await syntax, and assume that all async calls are made inside async functions.
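Both styles are shown below; stubPutObject is just a hypothetical stand-in for any SMCloudStore method that returns a Promise:

```javascript
// Hypothetical stand-in for any SMCloudStore method returning a Promise
function stubPutObject () {
    return Promise.resolve('ok')
}

// Style 1: consume the Promise as a then-able
stubPutObject().then((result) => {
    console.log(result) // prints: ok
})

// Style 2: use async/await, inside an async function
async function upload () {
    const result = await stubPutObject()
    console.log(result) // prints: ok
}
upload()
```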

Initialization

The main way to initialize the library is the SMCloudStore.create(provider, connection) factory method. Using the factory method is recommended because it supports loading all providers through a "pluggable API", by just specifying their identifier in the first argument.

// Require the package
const SMCloudStore = require('smcloudstore')

// Identifier of the provider
const provider = 'minio'

// Complete with the connection options for the provider
const connection = {
    // ...
}

// Return an instance of the cloud storage provider class
const storage = SMCloudStore.create(provider, connection)

The format of the connection argument varies by cloud provider. For more details, please refer to the README for each provider in the packages/ folder.

Alternatively, you can create an instance of each provider by instantiating the provider's class directly, passing the connection options to its constructor. For example, to create a new Azure Blob Storage provider:

// Require the package
const AzureProvider = require('@smcloudstore/azure-storage')

// Complete with the connection options for the provider
const connection = {
    // ...
}

// Initialize the provider object
const storage = new AzureProvider(connection)

storage.createContainer(container, [options])

Using storage.createContainer(container, [options]) you can create a new container on the cloud storage server. The options argument is a dictionary with various options, depending on the provider being used. The method returns a Promise that resolves with no value when the container has been created.

// Create a new container called "testcontainer"
await storage.createContainer('testcontainer')

// Some providers, like AWS S3, require specifying a region
await storage.createContainer('testcontainer', {region: 'us-east-1'})

storage.isContainer(container)

The method storage.isContainer(container) returns a Promise that resolves with a boolean indicating whether a container exists on the provider.

// Once the async method resolves, exists will contain true or false
const exists = await storage.isContainer('testcontainer')

storage.ensureContainer(container, [options])

storage.ensureContainer(container, [options]) is similar to storage.createContainer(), but it creates the container only if it doesn't already exist. The method returns a Promise that resolves with no value when the container has been created.

// Container "testcontainer" will be created only if it doesn't already exist
await storage.ensureContainer('testcontainer')

// Some providers, like AWS S3, require specifying a region
await storage.ensureContainer('testcontainer', {region: 'us-east-1'})

storage.listContainers()

The method storage.listContainers() returns a Promise that resolves with the list of names of the containers that the user owns on the storage server.

// List all containers the user owns
const containers = await storage.listContainers()
// Result is an array of strings, like: ['testcontainer', 'testcontainer2']

storage.deleteContainer(container)

The method storage.deleteContainer(container) deletes a container from the storage server. It returns a Promise that resolves with no value on success.

// Delete a container
await storage.deleteContainer('testcontainer')

storage.putObject(container, path, data, [options])

storage.putObject(container, path, data, [options]) is the method to put (upload) an object to the storage server.

Arguments are:

  • container: name of the destination container.
  • path: full path inside the container where to store the object.
  • data: the data to be uploaded. This could be a Readable Stream, or a string or Buffer containing the full data. Streams are preferred when dealing with larger amounts of data.
  • options: dictionary with options. Options are primarily provider-dependent, so please refer to the documentation for each provider for more information. The list below includes only the common ones:
    • options.metadata: object containing custom metadata and properties. An important key in the metadata object is Content-Type, which sets the Content-Type header for the file. Some providers might have special treatment for other keys too.

The method returns a Promise that resolves with no value when the upload is complete.

// Upload a stream
const data = require('fs').createReadStream('someimage.jpg')
const options = {
    metadata: {
        'Content-Type': 'image/jpeg'
    }
}
await storage.putObject('testcontainer', 'directory/someimage.jpg', data, options)

// Upload the content of a string or Buffer
// (new variable names, since data and options were declared in the example above)
const text = 'Nel mezzo del cammin di nostra vita mi ritrovai per una selva oscura ché la diritta via era smarrita'
const textOptions = {
    metadata: {
        'Content-Type': 'text/plain'
    }
}
await storage.putObject('testcontainer', 'directory/dante.txt', text, textOptions)

storage.getObject(container, path)

storage.getObject(container, path) allows getting (downloading) an object from the storage server.

Arguments are:

  • container: name of the container with the object.
  • path: full path of the object to retrieve, inside the container.

The method returns a Promise that resolves with a Readable Stream.

// Retrieve a file
const stream = await storage.getObject('testcontainer', 'directory/someimage.jpg')

// The method returns a Readable Stream that can be processed as you wish
// For example, to write the stream to file:
stream.pipe(require('fs').createWriteStream('write/to/someimage.jpg'))

storage.getObjectAsBuffer(container, path)

storage.getObjectAsBuffer(container, path) behaves similarly to storage.getObject(), accepting the same arguments, but returns the data in a Buffer object loaded in memory.

// Retrieve a file as buffer
const buffer = await storage.getObjectAsBuffer('testcontainer', 'directory/someimage.jpg')

// Print the last 100 bytes from the Buffer
console.log(buffer.slice(-100))

storage.getObjectAsString(container, path)

storage.getObjectAsString(container, path) behaves similarly to storage.getObject(), accepting the same arguments, but returns the data as a utf8-encoded string.

// Retrieve a file as string
const string = await storage.getObjectAsString('testcontainer', 'textfile.txt')

// Print the string
console.log(string)

storage.listObjects(container, [prefix])

storage.listObjects(container, [prefix]) returns a list of all the objects on the storage server at the specified path. This method does not recursively walk into directories (real or virtual, separated by a slash character). If prefix is not specified, the method will list the root "folder".

The method returns a Promise that resolves with an array of objects of type ListItemObject, containing information for an object on the server, or ListItemPrefix, containing information for a prefix (folder).

// Retrieve the list of objects
const list = await storage.listObjects('testcontainer', '/')

// List is an array of elements representing objects or prefixes

// Prefixes have the structure
{
    prefix: '/path'
}

// Objects have the structure
{
    path: '/path/to/file.jpg',
    size: 123, // Size in bytes
    lastModified: new Date() // Date object
    // Some providers might return more data, such as contentType, contentMD5, contentSHA1, and creationTime. Please refer to the documentation for details.
}

storage.deleteObject(container, path)

The method storage.deleteObject(container, path) deletes an object from a container in the storage server. It returns a Promise that resolves with no value on success.

// Delete an object
await storage.deleteObject('testcontainer', 'path/to/file.jpg')

storage.presignedGetUrl(container, path, [ttl])

The method storage.presignedGetUrl(container, path, [ttl]) returns a Promise that resolves with a pre-signed URL that can be used to download an object using any client (e.g. a web browser).

The ttl parameter determines how long the signed URL will be valid for, in seconds from present time.

// Get the pre-signed URL
const url = await storage.presignedGetUrl('testcontainer', 'path/to/file.jpg', 60)
// Result is a full URL, like "https://storage.provider/path?token=..."

The URL can be used to retrieve the object. For example, using curl:

curl "https://storage.provider/path?token=..." > /path/to/destination/file

Note that some providers do not support this method (Backblaze B2), and will throw an exception every time it's called.

storage.presignedPutUrl(container, path, [options], [ttl])

The method storage.presignedPutUrl(container, path, [options], [ttl]) returns a Promise that resolves with a pre-signed URL that can be used to upload an object to the server, using a PUT request from any client.

The options argument can be used by the provider to specify certain requirements for clients that will upload files. For example, it can be used to enforce a certain Content-Type header. Its values are the same as for the putObject method. Some providers ignore this value; please refer to the documentation for each provider for more details.

The ttl parameter determines how long the signed URL will be valid for, in seconds from present time.

// Get the pre-signed URL
const url = await storage.presignedPutUrl('testcontainer', 'path/to/file.jpg', {}, 60)
// Result is a full URL, like "https://storage.provider/path?token=..."

The URL can be used to upload an object. For example, using curl:

curl --request PUT --upload-file "/path/to/file" "https://storage.provider/path?token=..." 

Note that some providers do not support this method (Backblaze B2), and will throw an exception every time it's called. Other providers, like Azure Storage, require clients to specify certain options when sending the PUT request; please refer to each provider's documentation.

storage.client()

storage.client() is a getter that exposes the underlying client for the storage provider, allowing for interaction with the provider directly. This is useful for advanced scenarios, such as needing to invoke provider-specific methods that aren't available in SMCloudStore.

storage.provider()

storage.provider() returns the identifier (name) of the storage provider currently in use, for example generic-s3.