azure-blob-storage

This library wraps an Azure Blob Storage container which stores objects in JSON format.

Creating an instance of DataContainer

DataContainer is a wrapper over an Azure Blob Storage container that stores only objects in JSON format. Every object stored is validated against the schema provided when the container is created.

The constructor of the DataContainer takes the following options:

{
  // Azure connection details for use with SAS from auth.taskcluster.net
  account:           '...',                  // Azure storage account name
  container:         'AzureContainerName',   // Azure container name
  // TaskCluster credentials
  credentials: {
    clientId:        '...',                  // TaskCluster clientId
    accessToken:     '...',                  // TaskCluster accessToken
  },
  accessLevel:       'read-write',           // The access level of the container: read-only/read-write (optional)
  authBaseUrl:       '...',                  // baseUrl for auth (optional)
  schema:            '...',                  // JSON schema object

  // Max number of update blob request retries
  updateRetries:              10,
  // Multiplier for computation of retry delay: 2 ^ retry * delayFactor
  updateDelayFactor:          100,

  // Randomization factor added as:
  // delay = delay * random([1 - randomizationFactor; 1 + randomizationFactor])
  updateRandomizationFactor:  0.25,

  // Maximum retry delay in ms (defaults to 30 seconds)
  updateMaxDelay:             30 * 1000,
}

Using the options format provided above, a shared-access signature will be fetched from auth.taskcluster.net. Fetching the shared-access signature requires the following scope:
auth:azure-blob:<level>:<account>/<container>

If you have the Azure credentials directly, the options are:

{
   // Azure credentials
   credentials: {
     accountName: '...',         // Azure account name
     accountKey: '...',          // Azure account key
   }
}
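
For example, a container for simple state objects could be created as follows. This is a minimal sketch: it assumes the package exports a DataContainer class that is constructed with new (the import path may differ for this fork), and it uses a hypothetical schema with a single numeric value field.

let {DataContainer} = require('azure-blob-storage');

// Hypothetical JSON schema that every stored object must satisfy
let schema = {
  type: 'object',
  properties: {
    value: {type: 'number'},
  },
  required: ['value'],
};

let container = new DataContainer({
  account:   'mystorageaccount',   // assumed Azure storage account name
  container: 'state-objects',      // assumed Azure container name
  credentials: {
    clientId:    '...',            // TaskCluster clientId
    accessToken: '...',            // TaskCluster accessToken
  },
  schema:    schema,
});

// Make sure the underlying Azure container exists (see the operations below)
await container.ensureContainer();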

DataContainer operations

  • ensureContainer() This method will ensure that the underlying Azure container actually exists. It is an idempotent operation, and is often called at service start-up. If you've used taskcluster-auth to get credentials rather than Azure credentials, do not use this, as taskcluster-auth has already ensured the container exists for you.
    await container.ensureContainer();
  • removeContainer() Deletes the underlying Azure container. This method will not work if you are authenticated with SAS. Note that when the container is deleted, a container with the same name cannot be created for at least 30 seconds.
    await container.removeContainer();
  • listBlobs(options) Returns a paginated list of blobs contained by the underlying container.
    let blobs = await container.listBlobs({
      prefix: 'state',
      maxResults: 1000,
    });
  • scanDataBlockBlob(handler, options) Executes the provided function on each data block blob from the container, while handling pagination.
    let handler = async (blob) => {
        await blob.modify((content) => {
          content.version += 1;
        });
      };
    let options = {
      prefix: 'state',
    };
    await container.scanDataBlockBlob(handler, options);
  • createDataBlockBlob(options, content) Creates an instance of DataBlockBlob. Using this blob instance, a JSON object can be stored in Azure storage. The content will be validated against the schema defined at the container level.
    let options = {
        name: 'state-blob',
        cacheContent: true,
    };
    let content = {
      value: 30,
    };
    let dataBlob = await container.createDataBlockBlob(options, content);
  • createAppendDataBlob(options, content) Creates an instance of AppendDataBlob. Each appended object must be in JSON format and must match the schema defined at the container level. Updating and deleting existing content is not supported.
    let options = {
        name: 'auth-log',
    };
    let content = {
      user: 'test',
    }; 
    let appendBlob = await container.createAppendDataBlob(options, content);
  • load(blobName, cacheContent) This method returns an instance of DataBlockBlob or AppendDataBlob that was previously created in Azure storage. It makes sense to set cacheContent to true only for DataBlockBlob, because AppendDataBlob blobs do not keep their content on the instance. It will throw an error if the blob does not exist.
    let blob = await container.load('state-blob', false);
  • remove(blob, ignoreIfNotExists) Removes a blob from Azure storage without loading it. Set ignoreIfNotExists to true to ignore the error thrown if the blob does not exist. Returns true if the blob was deleted; the return value is only meaningful when ignoreIfNotExists is set.
    await container.remove('state-blob', true);
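
Putting these operations together, a typical flow might look like the following sketch. It reuses the container instance and the hypothetical state schema from the constructor example above.

// Create a data block blob whose content satisfies the container schema
let stateBlob = await container.createDataBlockBlob({name: 'state-blob'}, {value: 30});

// Load the blob back later; the content is validated against the schema when read
let loadedBlob = await container.load('state-blob', false);
let state = await loadedBlob.load();

// Bump the stored value on every matching blob, handling pagination
await container.scanDataBlockBlob(async (blob) => {
  await blob.modify((content) => {
    content.value += 1;
  });
}, {prefix: 'state'});

// Clean up, ignoring the error if the blob is already gone
await container.remove('state-blob', true);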

DataBlockBlob and AppendDataBlob

DataBlockBlob is a wrapper over an Azure block blob which stores JSON data that conforms to the schema defined at the container level.

AppendDataBlob is a wrapper over an Azure append blob. This type is optimized for fast append operations, and all writes happen at the end of the blob. Updating and deleting existing content is not supported. This type of blob can be used, for example, for logging or auditing.

The constructor of the blob takes the following options:

{
   name:                '...',        // The name of the blob (required)
   container:           '...',        // An instance of DataContainer (required)
   contentEncoding:     '...',        // The content encoding of the blob
   contentLanguage:     '...',        // The content language of the blob
   cacheControl:        '...',        // The cache control of the blob
   contentDisposition:  '...',        // The content disposition of the blob
   cacheContent:        true|false,   // This can be set true in order to keep a reference of the blob content.
                                      // Default value is false
}

The cacheContent option can be set to true only for DataBlockBlob, because AppendDataBlob does not support caching its content.
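
As an illustration, a blob could be constructed directly with these options as shown below. This is only a sketch: it assumes DataBlockBlob is exported by the package, and in practice you would normally obtain instances through the container's createDataBlockBlob() or load() methods rather than constructing them yourself.

let {DataBlockBlob} = require('azure-blob-storage');   // export name assumed

let blob = new DataBlockBlob({
  name: 'state-blob',
  container: container,          // the DataContainer instance from above
  contentLanguage: 'en-US',
  cacheControl: 'no-cache',
  cacheContent: true,            // keep a local copy of the JSON content
});

// The blob still has to be created in Azure storage (see the operations below)
await blob.create({value: 30});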

DataBlockBlob operations

  • create(content) Creates the blob in Azure storage with the specified content, which will be validated against the container schema.
    let content = {
      value: 40,
    };
    await dataBlob.create(content);
  • load() This method returns the content of the underlying blob. After the content is loaded, it is validated, and also cached if cacheContent was set.
    let content = await dataBlob.load();
  • modify(modifier, options) This method modifies the content of the blob. The modifier is a function that is called with a clone of the blob content as its first argument; it should apply its changes to that object.
    let modifier = (data) => {
      data.value = 'new value';
    };
    let options = {
      prefix: 'state',
    };
    await dataBlob.modify(modifier, options);

AppendDataBlob operations

  • create() Creates the blob in Azure storage without initial content.
    await logBlob.create();
  • append(content, options) Appends JSON content that must conform to the container schema.
    let content = {
      user: 'test2',
    };
    await logBlob.append(content);
  • load() Load the content of the underlying blob.
    let content = await logBlob.load();
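
Taken together, an audit-log style flow might look like the sketch below. It assumes a container instance whose schema allows objects with a user field, and that the blob still has to be created in Azure via create() after the wrapper is constructed.

// Construct the append blob wrapper and create the blob in Azure storage
let logBlob = await container.createAppendDataBlob({name: 'auth-log'});
await logBlob.create();

// Append entries over time; each entry is validated against the container schema
await logBlob.append({user: 'test'});
await logBlob.append({user: 'test2'});

// Read back everything appended so far
let entries = await logBlob.load();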