
dynamodb-datamodel (v0.2.7)

DynamoDB single table design based model

DynamoDB DataModel


NOTE: This project is in BETA (not yet 1.0.0) and is constantly being updated. Feedback and bug reports are very much welcome (issues/feedback). I am currently working to integrate this back into my personal project to fully validate all aspects of this library; once that is finished and all issues I find are fixed, I'll update the library to a stable 1.0.0 version.

The DynamoDB DataModel is a JavaScript and TypeScript library to simplify working with single table designs on Amazon DynamoDB. This library builds off of the aws-sdk DocumentClient and makes it easy to work with many different item types and access patterns in a single DynamoDB table with secondary indexes.

If you're wondering what single table design is and why you should use it then I recommend you check out the Resources section below. Then come back and check out how DynamoDB-DataModel can help.

Why

The aws-sdk DocumentClient is a great library that allows applications to use native JavaScript types when reading and writing data to a DynamoDB table. But as you work more with single table designs, you'll find yourself frequently mapping your application data to secondary index key attributes to support your various access patterns. Additionally, the DocumentClient provides very little help building key condition, write condition, filter and update expressions.

This is where DynamoDB-DataModel comes into play: it makes it easy to map data between your application models and how that data is stored in the DynamoDB table, so you can leverage single table design and support all of your application's access patterns. It also makes it easy to build key condition, write condition, filter and update expressions, even using your model property names.

Problems this library focuses on

  • Bidirectional data mapping - In single table design the primary key attributes for tables and indexes are named in a generic way so that each item type can use the attributes for different properties, enabling multiple access patterns with limited secondary indexes.
  • Table item updates - Even simple updates to a table item are not easy, especially in the context of data mappings.
  • Table condition and filter expressions - DynamoDB supports conditional writes and filtered queries/scans, but building the ConditionExpression/FilterExpression can be tricky, especially in the context of data mappings.
  • Focus on NoSQL capabilities - Make it easy to use DynamoDB's unique capabilities, and avoid SQL concepts that don't make sense in the NoSQL world (don't be an ORM).
  • Simple to use and open for extension - Though this library tries to capture the current capabilities of DynamoDB with a simple API surface, it also exposes ways to extend its capabilities to support more complex use cases or additional update, filter or condition expression features AWS may add.
  • Good TypeScript support - Not all libraries have good, up to date and well documented TypeScript support.
  • Small and lightweight - The main use case is to run in AWS Lambda, so it needs to be small, lightweight and have little to no dependencies.

Non goals

  • Not a validator or coercer - There are already many very good open source data validators and coercers, like joi, yup, ajv and others. This library does not attempt to do any validation or coercion, but it is highly recommended that you validate and coerce data before writing it to DynamoDB with DynamoDB-DataModel.
  • Not a table manager - This library is just for reading and writing data to DynamoDB; for almost all single table usage the table will be managed through CloudFormation.
  • Not an Object-Relational Mapper (ORM) - DynamoDB-DataModel doesn't use any SQL concepts. Many SQL concepts don't directly apply to NoSQL databases like DynamoDB, so this tool focuses on the unique capabilities of DynamoDB. If you need an ORM, there are several existing libraries: @aws/dynamodb-data-mapper, the @AwsPilot/dynamodb project or @BasePrime's dynamodb.

Note: As I was factoring this library out of a personal project, I came across the v0.1 version of Jeremy Daly's dynamodb-toolbox and played around with it, but it didn't have TypeScript support and didn't quite fit my personal project needs. I also felt that DynamoDB-DataModel had a unique and extensible take on building key condition, condition, filter and update expressions. And the ability to have custom fields for model schemas has made it easy to extend the library's core capabilities.

Installation

Install the DynamoDB-DataModel with npm:

npm i dynamodb-datamodel

or yarn:

yarn add dynamodb-datamodel

Dependencies:

  • aws-sdk >= 2.585.0 (peerDependency)
  • node.js >= 8.0.0

Basic usage

Import or require Table, Model and Fields from dynamodb-datamodel:

import { Table, Model, Fields } from 'dynamodb-datamodel';

General usage flow:

  1. Import or require Table, Model and Fields from dynamodb-datamodel
  2. Create DynamoDB DocumentClient
  3. (TypeScript) Define Table's primary key interface
  4. Create Table and define key attributes and schema
  5. (TypeScript) Define each Model key and data interface
  6. Create each Model and define data schema
  7. Use the model to read and write data

Basic usage example

From: examples/Readme.BasicUsage.ts

import { DocumentClient } from 'aws-sdk/clients/dynamodb';
// 1. Import or require `Table`, `Model` and `Fields` from `dynamodb-datamodel`
import { Table, Model, Fields } from 'dynamodb-datamodel';

// 2. Create DynamoDB DocumentClient
const client = new DocumentClient({ convertEmptyValues: true });

// 3. (TypeScript) Define Table's primary key
interface TableKey {
  P: Table.PrimaryKey.PartitionString;
  S?: Table.PrimaryKey.SortString;
}

// 4. Create Table and define key attributes and schema
const table = Table.createTable<TableKey, TableKey>({
  client,
  name: 'SimpleTable',
  keyAttributes: {
    P: Table.PrimaryKey.StringType,
    S: Table.PrimaryKey.StringType,
  },
  keySchema: {
    P: Table.PrimaryKey.PartitionKeyType,
    S: Table.PrimaryKey.SortKeyType,
  },
});

// 5. (TypeScript) Define each Model key and data interface
interface ModelKey {
  id: string;
}
// Define model data that derives from the key
interface ModelItem extends ModelKey {
  name: string;
}

// 6. Create each Model and define data schema
const model = Model.createModel<ModelKey, ModelItem>({
  schema: {
    id: Fields.split({ aliases: ['P', 'S'] }),
    name: Fields.string(),
  },
  table: table as Table, // 'as Table' needed for TypeScript
});

// Additional models can also be defined

// 7. Use the model to read and write data
export async function handler(): Promise<void> {
  // Write item
  await model.put({ id: 'P-GUID.S-0', name: 'user name' });
  // Update item
  await model.update({ id: 'P-GUID.S-0', name: 'new user name' });
  // Get item
  await model.get({ id: 'P-GUID.S-0' });
  // Delete item
  await model.delete({ id: 'P-GUID.S-0' });
}

Components

DynamoDB-DataModel is composed of several components that can be used on their own and that are also used by the higher level components like Fields and Model.

DynamoDB-DataModel consists of four core components:

  • Table - The Table object is the first object you'll need to create and has a one-to-one correlation with a provisioned DynamoDB table. Table is essentially a wrapper around the DynamoDB DocumentClient and is used by the Model objects to read and write data to the table. Following a single table design, you'll only need a single Table object.
  • Model - The Model object is the second component you'll need to create and has a one-to-one correlation with each of the data item types you are storing in the DynamoDB table. You will create multiple Models, one for each data item type, and each will reference the same Table object. The Model object contains a schema that defines how the model data will be represented in the DynamoDB table. Models are the main objects you will be using to read and write data to the DynamoDB table.
  • Field - Field objects are created when declaring the Model schema, and each item data property on the model is associated with a Field object. There are separate Field classes for each of the native DynamoDB data types (string, number, boolean, binary, null, list, map, string set, number set and binary set), in addition to more advanced fields (like composite, date, created date, type and others). You can also create custom fields for your own data types. Each field's main purpose is to map data between model properties and table attributes (bidirectionally), but fields can also add update, filter and condition expressions to support more complex behavior. Since there are many types of fields, they are all contained within the Fields namespace.
  • Index - Index objects are created alongside the Table for each global or local secondary index that is associated with the DynamoDB table.

Expressions components:

  • Condition - Contains functions for building complex condition and filter expressions.
  • ConditionExpression - Object passed to the resolvers returned by the Condition functions.
  • ExpressionAttributes - Object that maps and saves expression attribute names and values to placeholders; used by ConditionExpression, KeyConditionExpression and UpdateExpression.
  • KeyCondition - Contains functions for building sort key conditions used in the Table.query and Index.query methods.
  • KeyConditionExpression - Object passed to the resolvers returned by the KeyCondition functions.
  • Update - Contains functions for building complex update expressions used in the Table.update and Model.update methods.
  • UpdateExpression - Object passed to the resolvers returned by the Update functions.

Table

Table is the first object that you will need to create when working with dynamodb-datamodel. It contains the configuration data for a provisioned DynamoDB table and uses the DynamoDB DocumentClient to read and write to the DynamoDB table.

You can either create a simple JavaScript based Table using new Table(), or if you want additional TypeScript type checking and code editor autocomplete, you can use Table.createTable<KEY, ATTRIBUTES> to create a Table.

Creating a table is simple; only three things are needed: 1) the name of the DynamoDB table, 2) a map of key attribute types, and 3) a map of primary key types.

Table Example

From: examples/Table.Simple.ts

import { DocumentClient } from 'aws-sdk/clients/dynamodb';
import { Table } from 'dynamodb-datamodel';

export const client = new DocumentClient({ convertEmptyValues: true });

// Define the table primary key interface.
export interface TableKey {
  P: Table.PrimaryKey.PartitionString;
  S?: Table.PrimaryKey.SortString;
}

// Create the table object for the primary key and secondary indexes.
export const table = Table.createTable<TableKey>({
  client,
  name: 'SimpleTable',
  keyAttributes: {
    P: Table.PrimaryKey.StringType,
    S: Table.PrimaryKey.StringType,
  },
  keySchema: {
    P: Table.PrimaryKey.PartitionKeyType,
    S: Table.PrimaryKey.SortKeyType,
  },
});

// Generate params to pass to DocumentClient or call the action method
const params = table.getParams({ P: 'p1', S: 's1' });

Index

DynamoDB supports two types of secondary indexes: local secondary indexes (LSI) and global secondary indexes (GSI). Just like Table, an Index object has a one-to-one correlation with a provisioned secondary index. Like Model, Index uses Table to query and scan the secondary indexes of the DynamoDB Table.

Also like Table, you can either create a JavaScript based index using new Index(), or if you want additional TypeScript type checking and code editor autocomplete, you can use Index.createIndex<KEY> to create an Index.

Creating an index is simple; only three things are needed: 1) the name of the secondary index, 2) a map of the primary key types, and 3) the projection type.

Index Example

From: examples/Index.ts

import { Index, Table } from 'dynamodb-datamodel';
import { table } from './Table';

// Define a Global Secondary Index (GSI) key interface for GSI0.
export interface GSI0Key {
  G0P: Table.PrimaryKey.PartitionString;
  G0S?: Table.PrimaryKey.SortString;
}

// Create an Index object for GSI0 based on GSI0Key, and project all attributes.
export const gsi0 = Index.createIndex<GSI0Key>({
  name: 'GSI0',
  // Defines the key type ('HASH' or 'RANGE') for the GSI primary keys.
  keySchema: {
    G0P: Table.PrimaryKey.PartitionKeyType,
    G0S: Table.PrimaryKey.SortKeyType,
  },
  projection: { type: 'ALL' },
  table: table as Table,
  type: 'GLOBAL',
});

// Define a Local Secondary Index (LSI) key interface for LSI0, partition key must be same as the table's
export interface LSI0Key {
  P: Table.PrimaryKey.PartitionString;
  L0S?: Table.PrimaryKey.SortNumber;
}

// Create an Index object for LSI0 based on LSI0Key, and project all attributes.
export const lsi0 = Index.createIndex<LSI0Key>({
  name: 'LSI0',
  // Defines the key type ('HASH' or 'RANGE') for the LSI primary keys.
  keySchema: {
    P: Table.PrimaryKey.PartitionKeyType,
    L0S: Table.PrimaryKey.SortKeyType,
  },
  projection: { type: 'ALL' },
  table: table as Table,
  type: 'LOCAL',
});

Model

The Model object is the main object you'll be interacting with to read and write data to DynamoDB. There are four main methods that can be used to read and write data to the table: get, delete, put and update. At the core these methods do three things:

  1. Transform the input model data into DynamoDB compatible table data using the schema fields.
  2. Call the associated Table methods to read and write the data to DynamoDB.
  3. Transform the output table data into model data.

These read and write methods allow additional DynamoDB parameters to be added to the input parameters just before calling the DocumentClient method, allowing you to override any defaults set by the Model or Table object.

A model will need to be created for each item type that is stored in DynamoDB. If your application has a lot of item types, you'll probably want to create and cache each model on demand to avoid allocating a lot of objects when your code starts up, especially in AWS Lambda where cold start times can have an impact.
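The on-demand caching described above can be sketched with a small memoizing factory. This is a hedged, illustrative helper (getModel and modelCache are not part of the library's API); the factory callback would wrap the Model.createModel call shown in the examples.

```typescript
// Hedged sketch (not part of the library's API): lazily create and cache one
// Model object per item type, so models are only allocated the first time
// they are actually used.
const modelCache = new Map<string, unknown>();

export function getModel<T>(itemType: string, create: () => T): T {
  let model = modelCache.get(itemType) as T | undefined;
  if (model === undefined) {
    // First use of this item type: run the factory and remember the result.
    model = create();
    modelCache.set(itemType, model);
  }
  return model;
}

// Usage (illustrative): const userModel = getModel('User', () =>
//   Model.createModel<UserKey, UserItem>({ /* schema, table */ }));
```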

A model only needs three things when created:

  1. (typescript only) Key and Model type definition to support type checking when calling read/write methods.
  2. Table object to use when reading and writing to DynamoDB.
  3. The model schema based on the model type to support the bi-directional mapping between model and table data.

Model Example

From: examples/Model.ts (imports: examples/Table.ts)

import { Fields, Model, Table, Update } from 'dynamodb-datamodel';
import { table } from './Table';

// (TypeScript) Define model key and item interface.
export interface ModelKey {
  id: string;
}
// Use Update.* types to support type checking when using Model.update.
export interface ModelItem extends ModelKey {
  name: Update.String;
  age?: Update.Number;
  children?: Update.List<{ name: string; age: number }>;
  sports?: Update.StringSet;
}

// Define the schema using Fields
export const model = Model.createModel<ModelKey, ModelItem>({
  schema: {
    id: Fields.split({ aliases: ['P', 'S'] }),
    name: Fields.string(),
    age: Fields.number(),
    children: Fields.list(),
    sports: Fields.stringSet(),
  },
  table: table as Table,
});

// Generate params to pass to DocumentClient or call the action method
const params = model.getParams({ id: 'P-1.S-1' });

Fields

Fields encapsulate the logic that bi-directionally maps model property value(s) to the table attribute value(s) by implementing the Field interface contract.

A Field object needs to implement three functions, and has an additional optional fourth function. The TypeScript Field interface is as follows:

export interface Field {
  init(name: string, model: Model): void;

  toModel(name: string, tableData: Table.AttributeValuesMap, modelData: Model.ModelData, context: ModelContext): void;

  toTable(name: string, modelData: Model.ModelData, tableData: Table.AttributeValuesMap, context: TableContext): void;

  toTableUpdate?(
    name: string,
    modelData: Model.ModelUpdate,
    tableData: Update.ResolverMap,
    context: TableContext,
  ): void;
}

The init method is called during the constructor of Model for each Field and is passed the name of the model property the Field is associated with along with the Model that the Field belongs to.

The toModel method is called after reading and writing to DynamoDB, with four arguments. The implementation of this method will generally map data from the tableData to the modelData object. The name argument is the name of the model property the field is associated with, and should be the same name passed to init. The tableData argument contains the data returned from the table that will be mapped to modelData. The modelData argument is the model data that this method writes to. The context argument has additional contextual data used by the more advanced field objects.

The toTable and toTableUpdate methods are called before reading and writing data to DynamoDB, with four arguments. The implementation of these methods will generally map data from the modelData to the tableData object, in some sense the opposite of toModel. As in init and toModel, the name argument is the name of the model property the field is associated with. modelData is the input model data that will be mapped to the tableData argument. The context argument is the same as in toModel.

There are several built-in Fields that provide a base set of capabilities for mapping data along with some core logic.

Core Fields

Below are the fields that map to the native DynamoDB types and in most cases map to the native JavaScript types.

| Name      | Table Type | Description               |
| :-------- | :--------- | :------------------------ |
| string    | S          | JavaScript string         |
| number    | N          | JavaScript number         |
| binary    | B          | JavaScript binary         |
| boolean   | BOOL       | JavaScript boolean        |
| stringSet | SS         | DocumentClient string set |
| numberSet | NS         | DocumentClient number set |
| binarySet | BS         | DocumentClient binary set |
| list      | L          | JavaScript list           |
| map       | M          | JavaScript map            |

Extended Fields

| Name           | Table Type | Description |
| :------------- | :--------- | :---------- |
| model          | M          | A map based field that contains a schema in the same format as Model, to allow for nested types. |
| modelList      | L          | List of a single model type that follows a schema, to allow for a list of nested types. |
| modelMap       | M          | Similar to modelList, except instead of an array of models this is a map of models with string based keys. |
| date           | N          | Maps a JavaScript Date object to a table attribute number. |
| hidden         | -          | Hide field from getting written to DynamoDB. |
| split          | S          | Split the field property into multiple DynamoDB attributes. Commonly used as the model id field. |
| composite      | S          | Compose multiple model properties into a single table attribute to allow for more complex queries. |
| compositeNamed | S          | Similar to composite, but uses names instead of indexes to identify the slots. |
| type           | S          | Writes the Model name to a table attribute. |
| createdDate    | N          | Sets an attribute to the date the item was put or created. |
| updatedDate    | N          | Updates an attribute each time the item is updated. |
| revision       | N          | Increments a table attribute by one for each update. Can also prevent writes if the input revision doesn't match. |

Create your own custom fields
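A custom field just implements the Field interface shown above. The sketch below is illustrative only: UpperCaseField is a hypothetical field, and AttrMap is a simplified stand-in for the library's Table.AttributeValuesMap and Model.ModelData types. It maps a model string to an upper-cased table attribute and back.

```typescript
// Simplified stand-in for the library's attribute/model data map types.
type AttrMap = Record<string, unknown>;

// Hedged sketch of a custom Field: stores the value upper-cased in the table,
// lower-cased in the model. Not part of the library; for illustration only.
export class UpperCaseField {
  // Called by Model's constructor with the model property name.
  init(_name: string): void {}

  // Table -> model: lower-case the stored value.
  toModel(name: string, tableData: AttrMap, modelData: AttrMap): void {
    const value = tableData[name];
    if (typeof value === 'string') modelData[name] = value.toLowerCase();
  }

  // Model -> table: upper-case the value before writing.
  toTable(name: string, modelData: AttrMap, tableData: AttrMap): void {
    const value = modelData[name];
    if (typeof value === 'string') tableData[name] = value.toUpperCase();
  }
}
```

A field like this would be used in a Model schema the same way as the built-in Fields, e.g. `name: new UpperCaseField()`.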

Fields Examples

From: examples/Fields.ts

import { Fields, Model, Table, Update } from 'dynamodb-datamodel';
import { table } from './Table';

// (TypeScript) Define model key and item interface.
export interface ModelKey {
  id: string;
}
// Use Update types so model.update will have some type safety.
export interface ModelItem extends ModelKey {
  name: Update.String;
  age?: Update.Number;
  children?: Update.List<{ name: string; age: number }>;
  sports?: Update.StringSet;
}

// Define the schema using Fields
export const model = Model.createModel<ModelKey, ModelItem>({
  schema: {
    id: Fields.split({ aliases: ['P', 'S'] }),
    name: Fields.string(),
    age: Fields.number(),
    children: Fields.list(),
    sports: Fields.stringSet(),
  },
  table: table as Table,
});

// Generate params to pass to DocumentClient or call the action method
const params = model.updateParams({ id: 'P-1.S-1', age: Update.inc(1) });

// (jest) validate output of updateParams
expect(params).toEqual({
  ExpressionAttributeNames: { '#n0': 'age' },
  ExpressionAttributeValues: { ':v0': 1 },
  Key: { P: 'P-1', S: 'S-1' },
  TableName: 'ExampleTable',
  UpdateExpression: 'SET #n0 = #n0 + :v0',
});

Condition and Filter Expressions

The Condition component in DynamoDB-DataModel is a collection of functions that makes building condition and filter expressions very easy, and it leverages TypeScript to ensure the correct types are used for each supported operation.

All of the condition functions return a resolver function with the signature (exp: Condition.Expression, type: 'BOOL') => string. This simple mechanism of returning a function enables two things:

  1. Multiple conditions can be composed together in a simple and recursive way, supporting deeply nested 'and' and 'or' conditions.
  2. The expression can be resolved recursively as a single unit.

The use of a resolver function also allows for custom Condition functions to be written that provide higher level concepts.

Condition is a lower level component and can be used on its own. It is also used by Table, Model and Fields.

To get more details on condition expressions, see the AWS Condition Expression Reference.

Condition functions

| Name       | Supported Types        | Description |
| :--------- | :--------------------- | :---------- |
| path       | all                    | Returns the placeholder for an attribute path. |
| size       | B, BS, NS, S, SS, M, L | Gets the size of an attribute; can be used in the compare functions below to check for a certain size. |
| compare    | all                    | Compares the value of an attribute for a path against a value with a given operator: '=', '<>', '>', '>=', '<=' and '<'. |
| eq         | all                    | True if the value of an attribute for a path is equal to the passed in value. |
| ne         | all                    | True if the value of an attribute for a path is not equal to the passed in value. |
| ge         | all                    | True if the value of an attribute for a path is greater than or equal to the passed in value. |
| gt         | all                    | True if the value of an attribute for a path is greater than the passed in value. |
| le         | all                    | True if the value of an attribute for a path is less than or equal to the passed in value. |
| lt         | all                    | True if the value of an attribute for a path is less than the passed in value. |
| between    | all                    | True if the value of an attribute for a path is between two passed in values. |
| beginsWith | S                      | True if the value of an attribute for a path begins with the passed in string value. |
| contains   | S, BS, NS, SS          | True if the value of an attribute for a path contains the passed in value. |
| in         | all                    | True if the value of an attribute for a path equals one of the values in the passed in array. |
| exists     | all                    | True if the value of an attribute for a path exists in the table item. |
| notExists  | all                    | True if the value of an attribute for a path does not exist in the table item. |
| type       | all                    | True if the value of an attribute is the passed in type. |
| and        | conditions             | True if the results of all conditions are also true. |
| or         | conditions             | True if the result of any condition is true. |
| not        | conditions             | True if the result of the condition is false. |

Create your own custom conditions

It is easy to create your own custom conditions. As stated above, all Condition functions just return a resolver function with the signature (exp: Condition.Expression, type: 'BOOL') => string, so to create a custom condition you just need a function that returns a resolver function. In the implementation of the resolver function, you can use the Condition.Expression object to add and get the placeholders for the attribute names and values in your custom expression, then return the condition or filter expression string you want.

See the AWS Condition Expression Reference for details on the syntax and operations supported.
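The resolver pattern above can be demonstrated with a self-contained toy. This is a hedged sketch, not the library's code: youngerThan is a hypothetical custom condition, makeExpression is a stand-in for Condition.Expression, and the addPath/addValue method names are illustrative placeholder helpers.

```typescript
// Simplified stand-ins for the library's Condition.Expression and resolver types.
type Expression = { addPath(path: string): string; addValue(value: unknown): string };
type Resolver = (exp: Expression) => string;

// A hypothetical custom condition: true when the item's age is below `age`.
export const youngerThan = (age: number): Resolver => (exp) =>
  `${exp.addPath('age')} < ${exp.addValue(age)}`;

// Toy Expression that allocates '#n0' / ':v0' style placeholders.
export function makeExpression(): Expression & { names: string[]; values: unknown[] } {
  const names: string[] = [];
  const values: unknown[] = [];
  return {
    names,
    values,
    addPath: (p) => `#n${names.push(p) - 1}`,
    addValue: (v) => `:v${values.push(v) - 1}`,
  };
}
```

For example, `youngerThan(21)(makeExpression())` resolves to the string `'#n0 < :v0'`, with 'age' and 21 recorded as the placeholder name and value.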

Condition Examples

Condition where age > 21 OR ((region = 'US' AND size(interests) > 10) AND interests contains 'nodejs', 'dynamodb', or 'serverless'):

From: examples/Readme.Condition.ts

import { Condition, Table } from 'dynamodb-datamodel';

// Destructuring Condition to make it easier to write filter expression.
const { and, or, eq, gt, contains, size } = Condition;

const filter = or(
  gt('age', 21),
  and(
    eq('region', 'US'),
    gt(size('interests'), 10),
    or(contains('interests', 'nodejs'), contains('interests', 'dynamodb'), contains('interests', 'serverless')),
  ),
);

const params = Table.addParams({}, { conditions: [filter] }, 'filter');
expect(params.FilterExpression).toEqual(
  '(#n0 > :v0 OR (#n1 = :v1 AND size(#n2) > :v2 AND (contains(#n2, :v3) OR contains(#n2, :v4) OR contains(#n2, :v5))))',
);

Using Field methods to ensure attribute paths are correct.

From: examples/Readme.Condition.Fields.ts

import { Condition, Fields, Model, Table } from 'dynamodb-datamodel';
import { table } from './Table';

const schema = {
  age: Fields.number(),
  region: Fields.string(),
  interests: Fields.string(),
};

// Assigning the schema to a model will initialize the schema fields to use below.
new Model({ name: 'TestModel', schema, table: table as Table });

// Destructuring schema and Condition to make it easier to write filter expression.
const { age, region, interests } = schema;
const { and, or, gt } = Condition;

const filter = or(
  age.gt(21),
  and(
    region.eq('US'),
    gt(interests.size(), 10),
    or(interests.contains('nodejs'), interests.contains('dynamodb'), interests.contains('serverless')),
  ),
);

// build and validate expression
const params = Table.addParams({}, { conditions: [filter] }, 'filter');
expect(params.FilterExpression).toEqual(
  '(#n0 > :v0 OR (#n1 = :v1 AND size(#n2) > :v2 AND (contains(#n2, :v3) OR contains(#n2, :v4) OR contains(#n2, :v5))))',
);

KeyCondition Expressions

KeyCondition, like Condition, is a lower level component that is used by Table and Index to build key condition expressions for query based DynamoDB reads. Any query has at most two conditions, with the partition key only supporting a simple equal condition via '=' and the sort key supporting a single 'range' based condition.

To get more details see AWS Key Condition Expression Resource.

Sort key functions

The partition key only supports a single condition, which is equal ('='). Sort keys support several 'range' based conditions to query for a continuous range of items. You may notice that not-equal ('<>') isn't supported, and the reason is simply that a not-equal query would not return a continuous range of items.

| Name       | Supported Types | Description |
| :--------- | :-------------- | :---------- |
| beginsWith | S               | Returns all items with a sort key that begins with the value. Note: case sensitive. Example: KeyCondition. |
| between    | B, N, S (all)   | Returns all items with a sort key that is between and inclusive of the lower and upper values passed in. |
| compare    | B, N, S (all)   | Returns all items with a sort key that resolves to true for the operation and value passed in. |
| eq         | B, N, S (all)   | Returns all items with a sort key that is equal to the value passed in. |
| ge         | B, N, S (all)   | Returns all items with a sort key that is greater than or equal to the value passed in. |
| gt         | B, N, S (all)   | Returns all items with a sort key that is greater than the value passed in. |
| le         | B, N, S (all)   | Returns all items with a sort key that is less than or equal to the value passed in. |
| lt         | B, N, S (all)   | Returns all items with a sort key that is less than the value passed in. |

Create your own custom key conditions

All of the above KeyCondition functions return a resolver function, allowing for custom key conditions. Though given the limited syntax supported by DynamoDB there really isn't a need for custom key conditions.

KeyCondition Examples

From: examples/KeyCondition.ts

import { KeyCondition } from 'dynamodb-datamodel';
import { table } from './Table';

// Use KeyCondition to query the table with primary key of 'P-GUID' and sort key between (and including) 'a' and 'z'
const key = {
  P: 'P-GUID',
  S: KeyCondition.between('a', 'z'),
};
const params = table.queryParams(key);

expect(params).toEqual({
  ExpressionAttributeNames: { '#n0': 'P', '#n1': 'S' },
  ExpressionAttributeValues: { ':v0': 'P-GUID', ':v1': 'a', ':v2': 'z' },
  KeyConditionExpression: '#n0 = :v0 AND #n1 BETWEEN :v1 AND :v2',
  TableName: 'ExampleTable',
});

Update Expressions

Update functions

For more details on what you can do with update expressions, see DynamoDB's Update Expression guide.

Supported Update functions:

| Name | Supported Types | Description |
| :--- | :-------------- | :---------- |
| `path` | all | Ensures that a value is treated as a path. Example: `{ name: Update.path('fullname') }`. |
| `pathWithDefault` | all | Ensures that a value is either a path or a default value if the path doesn't exist. Example: `{ name: Update.pathWithDefault('fullname', 'John Smith') }`. |
| `default` | all | Ensures an attribute has a default value. Example: `{ name: Update.default('John Smith') }` sets name to 'John Smith' if name doesn't exist. |
| `del` | all | Deletes an attribute from the item; assigning `null` does the same. Example: `{ name: Update.del() }` or `{ name: null }`. |
| `set` | all | Sets an attribute to a value; assigning the value directly does the same. Example: `{ name: Update.set('John Smith') }` or `{ name: 'John Smith' }`. |
| `arithmetic` | N | Performs basic arithmetic on an attribute; used by `inc`, `dec`, `add` and `sub` below. Example: `{ total: Update.arithmetic('total', '+', 1) }` adds 1 to the total attribute. |
| `inc` | N | Increments an attribute by a value. Example: `{ total: Update.inc(2) }` increments total by 2. |
| `dec` | N | Decrements an attribute by a value. Example: `{ total: Update.dec(3) }` decrements total by 3. |
| `add` | N | Sets the receiving attribute to the result of adding a value to the value of another attribute. Example: `{ total: Update.add('count', 1) }` sets total to the value of count plus 1. |
| `sub` | N | Sets the receiving attribute to the result of subtracting a value from the value of another attribute. Example: `{ total: Update.sub('count', 2) }` sets total to the value of count minus 2. |
| `join` | L | Sets the receiving attribute to the result of joining two lists. Example: `{ relatives: Update.join('children', 'parents') }` sets relatives to the result of joining children and parents. |
| `append` | L | Appends a list to the end of a list attribute. Example: `{ relatives: Update.append('children') }` appends the value of the children attribute to relatives. |
| `prepend` | L | Prepends a list to the beginning of a list attribute. Example: `{ relatives: Update.prepend('parents') }` prepends the value of the parents attribute to relatives. |
| `delIndexes` | L | Deletes values from a list attribute using an array of indexes. Example: `{ relatives: Update.delIndexes([1, 3]) }` deletes the values at indexes 1 and 3 from the relatives list attribute. |
| `setIndexes` | L | Sets values in a list attribute using a map of indexes to values. Example: `{ relatives: Update.setIndexes({ 1: 'bob', 3: 'lucy' }) }` sets the value at index 1 to 'bob' and at index 3 to 'lucy' in the relatives list attribute. |
| `addToSet` | SS, NS, BS | Adds values to a DynamoDB set attribute. Example: `{ color: Update.addToSet(table.createStringSet(['yellow', 'red'])) }` adds yellow and red to the color set attribute. |
| `removeFromSet` | SS, NS, BS | Removes values from a DynamoDB set attribute. Example: `{ color: Update.removeFromSet(table.createStringSet(['blue', 'green'])) }` removes blue and green from the color set attribute. |
| `map` | M | Sets the receiving map attribute to the result of resolving a map of Update functions. Also supports '.' delimited path values for setting deep values. Example: `{ address: Update.map({ street: 'One Infinity Loop' }) }` sets the street attribute within address to 'One Infinity Loop'. |
| `model` | M | Type-based wrapper around `map` to ensure the input matches an interface. Example: see `Update.model`. |
| `modelMap` | M | String based key to model map that enforces the key strings are only used as paths. Example: see `Update.modelMap`. |

### Create your own custom update resolvers

All of the above Update functions return an `Update.Resolver` arrow function with the signature `(name: string, exp: Update.Expression, type?: T) => void`. When `Model.update` or `Table.update` executes, each object property value is resolved, and values with a type of 'function' are called with the property name, an `Update.Expression` object and an optional type param.

It is in the implementation of the `Update.Resolver` functions that the update expression is constructed. This is done by calling methods on the `Update.Expression` object to add and get placeholders for attribute paths and values, and to add expressions to one of the four supported clauses: SET, REMOVE, ADD or DELETE.

After all `Update.Resolver` functions are called and all other expressions are resolved, the expressions for the update are generated and set on the input params passed to the DynamoDB DocumentClient update method.

To create a custom Update function, return an arrow function that, when called, adds the necessary names, values and update expressions to support the functionality you need.
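As a sketch of the mechanics above, here is a hypothetical custom resolver that initializes an attribute to a start value if it doesn't exist and otherwise increments it. The `Expression` method names used here (`addPath`, `addValue`, `addSet`) are stand-ins for illustration and may not match the library's actual `Update.Expression` API; a minimal mock interface is included so the sketch is self-contained.

```typescript
// Minimal stand-in for Update.Expression. The real API's method names
// may differ; these three are assumptions for illustration only.
interface Expression {
  addPath(name: string): string; // returns a name placeholder, e.g. '#n0'
  addValue(value: unknown): string; // returns a value placeholder, e.g. ':v0'
  addSet(expression: string): void; // appends an expression to the SET clause
}

// Custom resolver: SET <name> = if_not_exists(<name>, <start>) + <step>
const incrementWithDefault =
  (start: number, step: number) =>
  (name: string, exp: Expression): void => {
    const path = exp.addPath(name);
    const startPlaceholder = exp.addValue(start);
    const stepPlaceholder = exp.addValue(step);
    exp.addSet(`${path} = if_not_exists(${path}, ${startPlaceholder}) + ${stepPlaceholder}`);
  };
```

When resolved for a property named `total`, this would contribute an expression along the lines of `#n0 = if_not_exists(#n0, :v0) + :v1` to the SET clause.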

Note: In future versions of this library I am looking to add additional context (like Model or Field) to the Update.Expression, to enable more advanced scenarios. Let me know if you need this so I can prioritize it appropriately.

### Update Examples

From: examples/Update.Model.ts

```typescript
import { Fields, Model, Table, Update } from 'dynamodb-datamodel';
import { table } from './Table';

interface ModelKey {
  id: string;
}
interface ModelItem extends ModelKey {
  name: Update.String;
  revision: Update.Number;
  nickName: Update.String;
}

const model = Model.createModel<ModelKey, ModelItem>({
  schema: {
    id: Fields.split({ aliases: ['P', 'S'] }),
    name: Fields.string(),
    nickName: Fields.string(),
    revision: Fields.number(),
  },
  table: table as Table,
});

// update will: set name attribute to 'new name', delete nickName attribute and increment revision attribute by 2.
const params = model.updateParams({
  id: 'P-1.S-1',
  name: 'new name',
  nickName: Update.del(),
  revision: Update.inc(2),
});

// (jest) output of updateParams
expect(params).toEqual({
  ExpressionAttributeNames: { '#n0': 'name', '#n1': 'nickName', '#n2': 'revision' },
  ExpressionAttributeValues: { ':v0': 'new name', ':v1': 2 },
  Key: { P: 'P-1', S: 'S-1' },
  TableName: 'ExampleTable',
  UpdateExpression: 'SET #n0 = :v0, #n2 = #n2 + :v1 REMOVE #n1',
});
```

## Best practices

  • Use generic primary key names for Tables and Indexes. Examples:
    • Table: 'P' = partition key and 'S' = sort key.
    • Index: 'G0P' = partition key and 'G0S' = sort key for Global Secondary Index #0.
  • Use generic secondary index names since they will be used for many different access patterns. Example: GSI0 or LSI0.
  • Normalize values used in index primary keys, like lower case strings or use generated values.
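To illustrate the normalization tip, here is a small helper that builds a composite key part from a user-provided value. It is hypothetical and not part of dynamodb-datamodel; the `U` prefix and `-` separator are arbitrary choices for the sketch.

```typescript
// Hypothetical helper: normalize a value before embedding it in an index
// primary key, so lookups are case- and whitespace-insensitive.
const toKeyPart = (prefix: string, value: string): string =>
  `${prefix}-${value.trim().toLowerCase()}`;

// e.g. a sort key built from a user name:
const sortKey = toKeyPart('U', '  Jason Smith '); // 'U-jason smith'
```

Writing every key through a helper like this keeps queries deterministic regardless of how callers capitalize or pad the input.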

## Resources

There are a lot of DynamoDB resources out there, with more being added every day. But I would start with the following:

## Contributions and Feedback

Any and all contributions are greatly appreciated! For suggestions, feedback and bugs, please create a new issue. For contributions, you can start a pull request. Feel free to contact me on Twitter: @JasonCraftsCode