
Dynameh

DynamoDB on Node more easier

Dynameh makes the official DynamoDB JavaScript API easier to use. It makes it easier to build request objects, unwrap response objects, page through query and scan results, and work with optimistic locking, conditions, filters and projections.

See the documentation for API specifics, or read on for a general overview.

Installation

Dynameh is your typical NPM package.

npm install --save dynameh

aws-sdk is a peer dependency. There are no other runtime dependencies.
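The examples below use the v2 aws-sdk client, so install the peer dependency alongside Dynameh if your project doesn't already include it:

npm install --save aws-sdk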

Usage

import * as dynameh from "dynameh";
// or
const dynameh = require("dynameh");

See the documentation for details on each module and its methods.
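For orientation, these are the Dynameh modules the examples in this readme rely on (a sketch, not the full API):

import * as dynameh from "dynameh";

// Modules used throughout this readme (not the full API):
const {
    requestBuilder,     // builds get/put/query/scan/batch inputs plus conditions, filters and projections
    responseUnwrapper,  // unwraps DynamoDB responses into plain objects
    queryHelper,        // queryAll(), queryByCallback()
    scanHelper,         // scanAll(), scanByCallback()
    batchHelper         // batchWriteAll()
} = dynameh;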

A Simple Example

This example is written using async/await, which is available in TypeScript and ES2017.

import * as aws from "aws-sdk";
import * as dynameh from "dynameh";

// Initialize the DynamoDB client.
const dynamodb = new aws.DynamoDB({
    apiVersion: "2012-08-10",
    region: "us-west-1"
});

// Set up the table schema.
const tableSchema = {
    tableName: "motorcycles",
    partitionKeyField: "id",
    partitionKeyType: "string"
};

async function updateMotorcycleHorsePower(motorcycleId, bhp) {
    // Fetch the item from the database.
    const getRequest = dynameh.requestBuilder.buildGetInput(tableSchema, motorcycleId);
    const getResult = await dynamodb.getItem(getRequest).promise();
    let motorcycle = dynameh.responseUnwrapper.unwrapGetOutput(getResult);
    
    if (!motorcycle) {
        // Item not found, create it.
        motorcycle = {
            id: motorcycleId
        };
    }
    
    // Update the horse power stat.
    motorcycle.bhp = bhp;
    
    // Put the updated object in the database.
    const putRequest = dynameh.requestBuilder.buildPutInput(tableSchema, motorcycle);
    await dynamodb.putItem(putRequest).promise();
}

updateMotorcycleHorsePower("sv-650", 73.4);

TableSchema

The key to building requests easily with Dynameh is the TableSchema. This simple object defines all the extra information Dynameh needs to build requests.

For a table called MyTable with a partition key id that is a number...

{
  "tableName": "MyTable",
  "partitionKeyField": "id",
  "partitionKeyType": "number"
}

For a table called MyAdvancedTable with a partition key id that is a string, a sort key date that is a number, and a version field version...

{
  "tableName": "MyAdvancedTable",
  "partitionKeyField": "id",
  "partitionKeyType": "string",
  "sortKeyField": "date",
  "sortKeyType": "number",
  "versionKeyField": "version"
}
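As a sketch of how the composite key is supplied when reading from such a table (passing the sort key value as a third argument to buildGetInput() is an assumption here, not something documented above):

import * as dynameh from "dynameh";

const advancedTableSchema = {
    tableName: "MyAdvancedTable",
    partitionKeyField: "id",
    partitionKeyType: "string",
    sortKeyField: "date",
    sortKeyType: "number",
    versionKeyField: "version"
};

// Assumption: the sort key value follows the partition key value.
const getRequest = dynameh.requestBuilder.buildGetInput(advancedTableSchema, "abc-123", 1528411605000);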

Optimistic Locking

Optimistic locking is a strategy for preventing concurrent changes from clobbering each other. For example, two processes read from the database and make unrelated changes; both then write back, and the second write overwrites (clobbers) the first.

Enable optimistic locking by setting versionKeyField on your TableSchema. In the second TableSchema example above that field is version. The versionKeyField is automatically incremented on the server side during a put request. If the versionKeyField value sent does not match the current value in the database, then the contents have changed since the last get and the optimistic lock has failed. In that case you should get the latest version from the database and replay the update against it.

import * as aws from "aws-sdk";
import * as dynameh from "dynameh";

// Initialize the DynamoDB client.
const dynamodb = new aws.DynamoDB({
    apiVersion: "2012-08-10",
    region: "us-west-1"
});

// Set up the table schema.
const tableSchema = {
    tableName: "motorcycles",
    partitionKeyField: "id",
    partitionKeyType: "string",
    versionKeyField: "version"
};

async function updateMotorcycleHorsePower(motorcycleId, bhp) {
    // Fetch the item from the database.
    const getRequest = dynameh.requestBuilder.buildGetInput(tableSchema, motorcycleId);
    const getResult = await dynamodb.getItem(getRequest).promise();
    let motorcycle = dynameh.responseUnwrapper.unwrapGetOutput(getResult);
    
    if (!motorcycle) {
        // Item not found, create it.
        // Note that we don't need to set the version on create.
        motorcycle = {
            id: motorcycleId
        };
    }
    
    // Update the horse power stat.
    motorcycle.bhp = bhp;
    
    // Put the updated object in the database.
    const putRequest = dynameh.requestBuilder.buildPutInput(tableSchema, motorcycle);
    try {
        await dynamodb.putItem(putRequest).promise();
    } catch (err) {
        if (err.code === "ConditionalCheckFailedException") {
            // If this is the error code then the optimistic locking has failed
            // and we should redo the update operation (done here with recursion).
            await updateMotorcycleHorsePower(motorcycleId, bhp);
        } else {
            throw err;
        }
    }
}

updateMotorcycleHorsePower("sv-650", 73.4);

Query and Scan

Both query and scan are paginated operations: they may return only part of the result set, and you must make repeated calls to get all the results. queryHelper and scanHelper provide utilities that handle the pagination for you.

queryAll() and scanAll() collect all results into a single array. For small result sets this is the easiest to work with, but for large result sets it can require a lot of memory to hold everything at once.

queryByCallback() and scanByCallback() take a callback parameter that processes each page of results. Processing one page at a time reduces the amount of memory required.
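For example, a minimal sketch of queryAll(), assuming the Transactions table schema used in the Projections section below and a DynamoDB client initialized as in the earlier examples:

import * as dynameh from "dynameh";

// tableSchema is the Transactions schema from the Projections section below;
// dynamodb is an aws.DynamoDB client initialized as in the earlier examples.
async function getAllTransactionsForCustomer(customerId) {
    // Build a query for every item under one partition key.
    const queryInput = dynameh.requestBuilder.buildQueryInput(tableSchema, customerId);
    // queryAll() follows the pagination and returns all items in one array.
    return await dynameh.queryHelper.queryAll(dynamodb, queryInput);
}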

This example function scans the table for items and deletes them in batches:

async function deleteAllItems(dynamodbClient, tableSchema) {
    const scanInput = dynameh.requestBuilder.buildScanInput(tableSchema);
    let deleteCount = 0;
    await dynameh.scanHelper.scanByCallback(dynamodbClient, scanInput, async items => {
        const keysToDelete = tableSchema.sortKeyField ?
            items.map(item => [item[tableSchema.partitionKeyField], item[tableSchema.sortKeyField]]) :
            items.map(item => item[tableSchema.partitionKeyField]);

        const batchDeleteInput = dynameh.requestBuilder.buildBatchDeleteInput(tableSchema, keysToDelete);
        await dynameh.batchHelper.batchWriteAll(dynamodbClient, batchDeleteInput);
        return true;
    });
}
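For example, invoked with the client and table schema from the simple example above:

deleteAllItems(dynamodb, tableSchema);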

Conditions

Conditions can be added to a put or delete request to make the operation conditional.

One of the most useful conditions is that the item must not already exist (create but not update). This is done by asserting attribute_not_exists on the primary key. For example...

const tableSchema = {
    tableName: "Boats",
    partitionKeyField: "name",
    partitionKeyType: "string"
};

async function addNewBoat(boat) {
    const putRequest = dynameh.requestBuilder.buildPutInput(tableSchema, boat);
    dynameh.requestBuilder.addCondition(tableSchema, putRequest, {attribute: "name", operator: "attribute_not_exists"});
    
    try {
        await dynamodb.putItem(putRequest).promise();
    } catch (err) {
        if (err.code === "ConditionalCheckFailedException") {
            throw new Error("This boat already exists!");
        } else {
            throw err;
        }
    }
}

addNewBoat({
    name: "Boaty McBoatface",
    type: "submarine",
    autonomous: true,
    commissioned: 2016
});

The following conditions are available...

| condition            | # of value parameters | description |
|----------------------|-----------------------|-------------|
| =                    | 1                     | the attribute's value equals the supplied value |
| <>                   | 1                     | the attribute's value does not equal the supplied value |
| <                    | 1                     | the attribute's value is less than the supplied value |
| <=                   | 1                     | the attribute's value is less than or equal to the supplied value |
| >                    | 1                     | the attribute's value is greater than the supplied value |
| >=                   | 1                     | the attribute's value is greater than or equal to the supplied value |
| BETWEEN              | 2                     | the attribute's value is between the supplied values |
| IN                   | at least 1            | the attribute's value is in the list of supplied values |
| attribute_exists     | 0                     | the attribute has a value |
| attribute_not_exists | 0                     | the attribute does not have a value |
| attribute_type       | 1                     | the attribute's value is of the supplied type (S, SS, N, NS, B, BS, BOOL, NULL, L, M) |
| begins_with          | 1                     | the attribute's value begins with the supplied value |
| contains             | 1                     | the attribute's value is a string that contains the supplied substring or a set that contains the supplied element |

See the official documentation for more info.
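As a hedged sketch of an operator that takes value parameters (supplying them via a values array mirrors the addFilter() example in the next section and is an assumption, not documented above), a put could be restricted to boats commissioned before the year 2000:

async function updateVintageBoat(boat) {
    // Reuses the Boats tableSchema and dynamodb client from the example above.
    const putRequest = dynameh.requestBuilder.buildPutInput(tableSchema, boat);
    // Assumption: value parameters are passed in a values array,
    // mirroring the filter example in the next section.
    dynameh.requestBuilder.addCondition(tableSchema, putRequest, {attribute: "commissioned", operator: "<", values: [2000]});
    await dynamodb.putItem(putRequest).promise();
}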

Filters

Filters can be added to query and scan operations to refine what objects are returned. Query conditions operate on indexed fields (partition and sort), while filters operate on non-indexed fields. Because filters work on non-indexed fields, each object must still be read from the database, consuming read capacity. This makes them slower and more expensive than a well-chosen index, but a filter may still be a good idea to save bandwidth on an infrequent operation.

The filter object and the operators available are the same as conditions above.

This example scans for boats commissioned exactly 20 years ago for an anniversary announcement (an infrequent operation).

const tableSchema = {
    tableName: "Boats",
    partitionKeyField: "name",
    partitionKeyType: "string"
};

async function getVigentennialBoats() {
    const anniversaryYear = new Date().getFullYear() - 20;
    const scanRequest = dynameh.requestBuilder.buildScanInput(tableSchema);
    dynameh.requestBuilder.addFilter(tableSchema, scanRequest, {attribute: "commissioned", operator: "=", values: [anniversaryYear]});
    
    return await dynameh.scanHelper.scanAll(dynamodb, scanRequest);
}

Projections

Projections can be added to a get, batch get or query request to control which attributes are returned, which saves bandwidth.

For example...

const tableSchema = {
    tableName: "Transactions",
    partitionKeyField: "customerId",
    partitionKeyType: "string",
    sortKeyField: "transactionId",
    sortKeyType: "string",
};

async function getTransactionDates(customerId) {
    const queryRequest = dynameh.requestBuilder.buildQueryInput(tableSchema, customerId);
    const projectedQueryRequest = dynameh.requestBuilder.addProjection(tableSchema, queryRequest, ["date"]);
    // Note that addProjection() does not change the original object.
    // queryRequest != projectedQueryRequest
    
    const queryResponse = await dynamodb.query(projectedQueryRequest).promise();
    const transactions = dynameh.responseUnwrapper.unwrapQueryOutput(queryResponse);
    return transactions.map(t => t.date);
}

getTransactionDates("BusinessFactory");

Date Serialization

Date serialization can be configured by setting dateSerializationFunction on your TableSchema. It's a function that takes in a Date and returns a string or number. By default Dates are serialized as ISO-8601 strings (the only correct date format).

For example...

const tableSchema = {
    tableName: "MyTable",
    partitionKeyField: "id",
    partitionKeyType: "number",
    dateSerializationFunction: date => date.toISOString()   // same as default
};
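Any function that returns a number works as well; for example, a sketch that stores dates as epoch milliseconds instead of the default ISO-8601 strings:

const epochTableSchema = {
    tableName: "MyTable",
    partitionKeyField: "id",
    partitionKeyType: "number",
    // Store dates as epoch milliseconds rather than ISO-8601 strings.
    dateSerializationFunction: date => date.getTime()
};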