DynamoDB Toolbox
Single Table Designs have never been this easy!
The DynamoDB Toolbox is a set of tools that makes it easy to work with Amazon DynamoDB and the DocumentClient. It's designed with Single Tables in mind, but works just as well with multiple tables. It lets you define your Entities (with typings and aliases) and map them to your DynamoDB tables. You can then generate the API parameters to `put`, `get`, `delete`, `update`, `query`, `scan`, `batchGet`, and `batchWrite` data by passing in JavaScript objects. The DynamoDB Toolbox will map aliases, validate and coerce types, and even write complex `UpdateExpression`s for you. 😉
Installation and Basic Usage
Install the DynamoDB Toolbox with npm:
npm i dynamodb-toolbox
Require or import `Table` and `Entity` from `dynamodb-toolbox`:
const { Table, Entity } = require('dynamodb-toolbox')
Create a Table (with the DocumentClient):
// Require AWS SDK and instantiate DocumentClient
const DynamoDB = require('aws-sdk/clients/dynamodb')
const DocumentClient = new DynamoDB.DocumentClient()
// Instantiate a table
const MyTable = new Table({
// Specify table name (used by DynamoDB)
name: 'my-table',
// Define partition and sort keys
partitionKey: 'pk',
sortKey: 'sk',
// Add the DocumentClient
DocumentClient
})
Create an Entity:
const Customer = new Entity({
// Specify entity name
name: 'Customer',
// Define attributes
attributes: {
id: { partitionKey: true }, // flag as partitionKey
sk: { hidden: true, sortKey: true }, // flag as sortKey and mark hidden
name: { map: 'data' }, // map 'name' to table attribute 'data'
co: { alias: 'company' }, // alias table attribute 'co' to 'company'
age: { type: 'number' }, // set the attribute type
status: ['sk',0], // composite key mapping
date_added: ['sk',1] // composite key mapping
},
// Assign it to our table
table: MyTable
})
Put an item:
// Create my item (using table attribute names or aliases)
let item = {
id: 123,
name: 'Jane Smith',
company: 'ACME',
age: 35,
status: 'active',
date_added: '2020-04-24'
}
// Use the 'put' method of Customer
let result = await Customer.put(item)
The item will be saved to DynamoDB like this:
{
"pk": 123,
"sk": "active#2020-04-24",
"data": "Jane Smith",
"co": "ACME",
"age": 35
}
You can then get the data:
// Specify my item
let item = {
id: 123,
status: 'active',
date_added: '2020-04-24'
}
// Use the 'get' method of Customer
let response = await Customer.get(item)
This will return the object mapped to your aliases and composite key mappings:
{
Item: {
id: 123,
name: 'Jane Smith',
company: 'ACME',
age: 35,
status: 'active',
date_added: '2020-04-24'
}
}
This is NOT an ORM (at least it's not trying to be)
There are several really good Object-Relational Mapping tools (ORMs) out there for DynamoDB. There's the Amazon DynamoDB DataMapper For JavaScript, @Awspilot's DynamoDB project, @baseprime's dynamodb package, and many more.
If you like working with ORMs, that's great, and you should definitely give these projects a look. But personally, I really dislike ORMs (especially ones for relational databases). I typically find them cumbersome and likely to generate terribly inefficient queries (you know who you are). So this project is not an ORM, or at least it's not trying to be. This library helps you generate the parameters needed to interact with the DynamoDB API by giving you a consistent interface and handling all the heavy lifting. For convenience, this library will call the DynamoDB API for you and automatically parse the results, but you're welcome to just let it generate all (or just some) of the parameters for you. Hopefully this library will make the vast majority of your DynamoDB interactions super simple, and maybe even a little bit fun! 😎
Features
- Table Schemas and DynamoDB Typings: Define your Table and Entity data models using a simple JavaScript object structure, assign DynamoDB data types, and optionally set defaults.
- Magic UpdateExpressions: Writing complex `UpdateExpression` strings is a major pain, especially if the input data changes the underlying clauses or requires dynamic (or nested) attributes. This library handles everything from simple `SET` clauses, to complex `list` and `set` manipulations, to defaulting values with smartly applied `if_not_exists()` to avoid overwriting data.
- Bidirectional Mapping and Aliasing: When building a single table design, you can define multiple entities that map to the same table. Each entity can reuse fields (like `pk` and `sk`) and map them to different aliases depending on the item type. Your data is automatically mapped correctly when reading and writing data.
- Composite Key Generation and Field Mapping: Doing some fancy data modeling with composite keys? Like setting your `sortKey` to `[country]#[region]#[state]#[county]#[city]#[neighborhood]` to model hierarchies? DynamoDB Toolbox lets you map data to these composite keys, which will both autogenerate the value and parse it back into separate fields for you.
- Type Coercion and Validation: Automatically coerce values to strings, numbers and booleans to ensure consistent data types in your DynamoDB tables. Validate `list`, `map`, and `set` types against your data. Oh yeah, and `set`s are automatically handled for you. 😉
- Powerful Query Builder: Specify a `partitionKey`, and then easily configure your sortKey conditions, filters, and attribute projections to query your primary or secondary indexes. This library can even handle pagination with a simple `.next()` method.
- Simple Table Scans: Scan through your table or secondary indexes and add filters, projections, parallel scans and more. And don't forget the pagination support with `.next()`.
- Filter and Condition Expression Builder: Build complex Filter and Condition expressions using a standardized `array` and `object` notation. No more appending strings!
- Projection Builder: Specify which attributes and paths should be returned for each entity type, and automatically filter the results.
- Secondary Index Support: Map your secondary indexes (GSIs and LSIs) to your table, and dynamically link your entity attributes.
- Batch Operations: Full support for batch operations with a simpler interface to work with multiple entities and tables.
- Transactions: Full support for transactions with a simpler interface to work with multiple entities and tables.
- Default Value Dependency Graphs: Create dynamic attribute defaults by chaining other dynamic attribute defaults together.
- TypeScript Support: v0.3 of this library was rewritten in TypeScript to provide strong typing support. Additional work is still required to support schema typing.
Table of Contents
- DynamoDB Toolbox
- Single Table Designs have never been this easy!
- Installation and Basic Usage
- Features
- Table of Contents
- Conventions, Motivations, and Migrations from v0.1
- Tables
- Entities
- Table Properties
- Table Methods
- query(partitionKey [,options] [,parameters])
- scan([options] [,parameters])
- batchGet(items [,options] [,parameters])
- batchWrite(items [,options] [,parameters])
- transactGet(items [,options] [,parameters])
- transactWrite(items [,options] [,parameters])
- parse(entity, input [,include])
- get(entity, key [,options] [,parameters])
- delete(entity, key [,options] [,parameters])
- put(entity, item [,options] [,parameters])
- update(entity, key [,options] [,parameters])
- Entity Properties
- Entity Methods
- Filters and Conditions
- Projection Expressions
- Adding Custom Parameters and Clauses
- Additional References
- Sponsors
- Contributions and Feedback
Conventions, Motivations, and Migrations from v0.1
One of the most important goals of this library is to be as unopinionated as possible, giving you the flexibility to bend it to your will and build amazing applications. But another important goal is developer efficiency and ease of use. In order to balance these two goals, some assumptions had to be made. These include the "default" behavior of the library (all of which, btw, can be disabled with a simple configuration change). If you are using v0.1, you'll notice a lot of changes.
- `autoExecute` and `autoParse` are enabled by default. The original version of this library only handled limited "parameter generation", so it was necessary for you to pass the payloads to the `DocumentClient`. The library now provides support for all API options for each supported method, so by default, it will make the DynamoDB API call and parse the results, saving you redundant code. If you'd rather it didn't do this, you can disable it.
- It assumes a Single Table DynamoDB design. Watch the Rick Houlihan videos and read Alex DeBrie's book. The jury is no longer out on this: Single Table designs are what all the cool kids are doing. This library assumes that you will have multiple "Entities" associated with a single "Table", so this requires you to instantiate a `Table` and add at least one `Entity` to it. If you have multiple `Table`s and just one `Entity` type per `Table`, that's fine, it'll still make your life much easier. Also, `batchGet` and `batchWrite` support multiple tables, so we've got you covered.
- Entity Types are added to all items. Since this library assumes a Single Table design, it needs a way to reliably distinguish between Entity types. It does this by adding an "Entity Type" field to each item in your table. v0.1 used `__model`, but this has been changed to `_et` (short for "Entity Type"). Don't like this? Well, you can either disable it completely (but the library won't be able to parse entities into their aliases for you), or change the attribute name to something more snappy. It is purposefully short to minimize table storage (because item storage size includes the attribute names). Also, by default, Entities will alias this field to `entity` (but you can change that too).
- Created and modified timestamps are enabled by default. I can't think of many instances where created and modified timestamps aren't used in database records, so the library now automatically adds `_ct` and `_md` attributes when items are `put` or `update`d. Again, these are kept purposefully short. You can disable them, change them, or even implement them yourself if you really want. By default, Entities will alias these attributes to `created` and `modified` (customizable, of course), and will automatically apply an `if_not_exists()` on updates so that the `created` date isn't overwritten.
- Option names have been shortened using camelCase. Nothing against long and descriptive names, but typing `ReturnConsumedCapacity` over and over again just seems like extra work. For simplification purposes, all API request parameters have been shortened to things like `capacity`, `consistent` and `metrics`. The documentation shows which parameter they map to, but they should be intuitive enough to guess.
- All configurations and options are plain JavaScript `object`s. There are lots of JS libraries that use function chaining (like `table.query('some pk value').condition('some condition').limit(50)`). I really like this style for lots of use cases, but it just feels wrong to me when using DynamoDB. DynamoDB is the OG of cloud native databases. It's configured using IaC and its API is HTTP-based and uses structured JSON, so writing queries and other interactions using its native format just seems like the right thing to do. IMO, this makes your code more explicit and easier to reason about. Your `options` could actually be stored as JSON and (unless you're using functions to define defaults on Entity attributes) your Table and Entity configurations could be too.
- API responses match the DynamoDB API responses. Something else I felt strongly about was the response signature returned by the library's methods. The DynamoDB Toolbox is a tool to help you interact with the DynamoDB API, NOT a replacement for it. ORMs typically trade ease of use for a tremendous amount of lock-in. But at the end of the day, it's just generating queries (and probably bad ones at that). DynamoDB Toolbox provides a number of helpful features to make constructing your API calls easier and more consistent, but the exact payload is always available to you. You can rip out this library whenever you want and just use the raw payloads if you really wanted to. This brings us to the responses. Other than aliasing the `Items` and `Attributes` returned from DynamoDB, the structure and format of the responses is exactly the same (including any other metadata returned). This not only makes the library (kind of) future proof, but also allows you to reuse or repurpose any code or tools you've already written to deal with API responses.
- Attributes with NULL values are removed (by default). This was a hard one. I actually ran a Twitter poll to see how people felt about this, and although the reactions were mixed, "Remove the attributes" came out on top. I can understand the use cases for `NULL`s, but since NoSQL database attribute names are part of the storage considerations, it seems more logical to simply check for the absence of an attribute, rather than a `NULL` value. You may disagree with me, and that's cool. I've provided a `removeNullAttributes` table setting that allows you to disable this and save `NULL` attributes to your heart's content. I wouldn't, but the choice is yours.
Hopefully these all make sense and will make working with the library easier.
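For example, most of these defaults can be adjusted when you define your Table and Entities. A minimal sketch using options documented later in this README (the table and attribute names here are purely illustrative):

const { Table, Entity } = require('dynamodb-toolbox')
const DynamoDB = require('aws-sdk/clients/dynamodb')
const DocumentClient = new DynamoDB.DocumentClient()

// Table that keeps NULL attributes, renames the entity tracking field,
// and only generates parameters instead of calling DynamoDB
const LegacyTable = new Table({
  name: 'legacy-table',
  partitionKey: 'pk',
  sortKey: 'sk',
  entityField: 'type',         // use 'type' instead of the default '_et'
  removeNullAttributes: false, // keep NULL attributes instead of removing them
  autoExecute: false,          // don't call the DynamoDB API automatically
  DocumentClient
})

// Entity that disables automatic created/modified timestamps
const LegacyUser = new Entity({
  name: 'User',
  timestamps: false, // no _ct/_md attributes will be added
  attributes: {
    pk: { partitionKey: true },
    sk: { sortKey: true },
    name: 'string'
  },
  table: LegacyTable
})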
Tables
Tables represent one-to-one mappings to your DynamoDB tables. They contain information about your table's name, primary keys, indexes, and more. They are also used to organize and coordinate operations between entities. Tables support a number of methods that allow you to interact with your entities including performing queries, scans, batch gets and batch writes.
To define a new table, import it into your script:
const { Table } = require('dynamodb-toolbox')
Then create a new `Table` instance by passing in a valid `Table` definition.
const MyTable = new Table({
... table definition...
})
Specifying Table Definitions
`Table` takes a single parameter of type `object` that accepts the following properties:

| Property | Type | Required | Description |
| -------- | :--: | :--: | ----------- |
| name | `string` | yes | The name of your DynamoDB table (this will be used as the `TableName` property) |
| alias | `string` | no | An optional alias to reference your table when using "batch" features |
| partitionKey | `string` | yes | The attribute name of your table's partitionKey |
| sortKey | `string` | no | The attribute name of your table's sortKey |
| entityField | `boolean` or `string` | no | Disables or overrides the entity tracking field name (default: `_et`) |
| attributes | `object` | no | Complex type that optionally specifies the name and type of each attribute (see below) |
| indexes | `object` | no | Complex type that optionally specifies the names and keys of your secondary indexes (see below) |
| autoExecute | `boolean` | no | Enables automatic execution of the DocumentClient method (default: `true`) |
| autoParse | `boolean` | no | Enables automatic parsing of returned data when `autoExecute` is `true` (default: `true`) |
| removeNullAttributes | `boolean` | no | Removes null attributes instead of setting them to `null` (default: `true`) |
| DocumentClient | `DocumentClient` | * | A valid instance of the AWS DocumentClient |
* A Table can be instantiated without a DocumentClient, but most methods require it before execution
Table Attributes
The Table `attributes` property is an `object` that specifies the names and types of attributes associated with your DynamoDB table. This is an optional input that allows you to control attribute types. If an `Entity` object contains an attribute with the same name, but a different type, an error will be thrown. Each key in the object represents the attribute name and the value represents its DynamoDB type.
attributes: {
pk: 'string',
sk: 'number',
attr1: 'list',
attr2: 'map',
attr3: 'boolean',
...
}
Valid DynamoDB types are: `string`, `boolean`, `number`, `list`, `map`, `binary`, or `set`.
Table Indexes
The `indexes` property is an `object` that specifies the names and keys of the secondary indexes on your DynamoDB table. Each key represents the index name and its value must contain an object with a `partitionKey` AND/OR a `sortKey`. `partitionKey`s and `sortKey`s require a value of type `string` that references a table attribute. If you use the same `partitionKey` as the table's `partitionKey`, or you only specify a `sortKey`, the library will recognize them as Local Secondary Indexes (LSIs). Otherwise, they will be Global Secondary Indexes (GSIs).
indexes: {
GSI1: { partitionKey: 'GSI1pk', sortKey: 'GSI1sk' },
GSI2: { partitionKey: 'test' },
LSI1: { partitionKey: 'pk', sortKey: 'other_sk' },
LSI2: { sortKey: 'data' }
}
NOTE: The index name must match the index name on your table as it will be used in queries and other operations. The index must include the table's `entityField` attribute for automatic parsing of returned data.
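As a quick sketch, the `indexes` property plugs directly into your Table definition alongside the other properties described above (the table, index, and attribute names below are illustrative, not prescriptive):

const MyIndexedTable = new Table({
  name: 'my-indexed-table',
  partitionKey: 'pk',
  sortKey: 'sk',
  indexes: {
    // GSI with its own partition and sort keys
    GSI1: { partitionKey: 'GSI1pk', sortKey: 'GSI1sk' },
    // LSI reusing the table's partition key with a different sort key
    LSI1: { sortKey: 'other_sk' }
  },
  DocumentClient
})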
Entities
An Entity represents a well-defined schema for a DynamoDB item. An Entity can represent things like a User, an Order, an Invoice Line Item, a Configuration Object, or whatever else you want. Each `Entity` defined with the DynamoDB Toolbox must be attached to a `Table`. An `Entity` defines its own attributes, but can share these attributes with other entities on the same table (either explicitly or coincidentally). Entities must flag an attribute as a `partitionKey` and, if enabled on the table, a `sortKey` as well.
Note that a `Table` can have multiple Entities, but an `Entity` can only have one `Table`.
To define a new entity, import it into your script:
const { Entity } = require('dynamodb-toolbox')
Then create a new `Entity` instance by passing in a valid `Entity` definition.
const MyEntity = new Entity({
... entity definition...
})
Specifying Entity Definitions
`Entity` takes a single parameter of type `object` that accepts the following properties:

| Property | Type | Required | Description |
| -------- | :--: | :--: | ----------- |
| name | `string` | yes | The name of your entity (must be unique to its associated `Table`) |
| timestamps | `boolean` | no | Automatically add and manage created and modified attributes |
| created | `string` | no | Override default created attribute name (default: `_ct`) |
| modified | `string` | no | Override default modified attribute name (default: `_md`) |
| createdAlias | `string` | no | Override default created alias name (default: `created`) |
| modifiedAlias | `string` | no | Override default modified alias name (default: `modified`) |
| typeAlias | `string` | no | Override default entity type alias name (default: `entity`) |
| attributes | `object` | yes | Complex type that specifies the schema for the entity (see below) |
| autoExecute | `boolean` | no | Enables automatic execution of the DocumentClient method (default: inherited from Table) |
| autoParse | `boolean` | no | Enables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Table) |
| table | `Table` | * | A valid `Table` instance |

* An Entity can be instantiated without a `table`, but most methods require one before execution
Entity Attributes
The `attributes` property is an `object` that represents the attribute names, types, and other properties related to each attribute. Each key in the object represents the attribute name and the value represents its properties. The value can be a `string` that represents the DynamoDB type, an `object` that allows for additional configurations, or an `array` that maps to composite keys.
Using a string
Attributes can be defined using only a `string` value that corresponds to a DynamoDB type.
attributes: {
attr1: 'string',
attr2: 'number',
attr3: 'list',
attr4: 'map',
...
}
Valid types are: `string`, `boolean`, `number`, `list`, `map`, `binary`, or `set`.
Using an object
For more control over an attribute's behavior, you can specify an object as the attribute's value. Some options are specific to certain types. The following properties and options are available, all of which are optional:
| Property | Type | For Types | Description |
| -------- | :--: | :--: | ----------- |
| type | `string` | all | The DynamoDB type for this attribute. Valid values are `string`, `boolean`, `number`, `list`, `map`, `binary`, or `set`. Defaults to `string`. |
| coerce | `boolean` | `string`, `boolean`, `number`, `list` | Coerce values to the specified type. Enabled by default on `string`, `boolean`, and `number`. If enabled on `list` types, the interpreter will try to split a string by commas. |
| default | same as `type` or `function` | all | Specifies a default value (if none provided) when using `put` or `update`. This also supports functions for creating custom defaults. See more below. |
| dependsOn | `string` or `array` of `string`s | all | Creates a dependency graph for default values. For example, if the attribute uses a default value that requires another attribute's default value, this will ensure dependent attributes' default values are calculated first. |
| onUpdate | `boolean` | all | Forces `default` values to be passed on every `update`. |
| save | `boolean` | all | Specifies whether this attribute should be saved to the table. Defaults to `true`. |
| hidden | `boolean` | all | Hides the attribute from the returned JavaScript object when auto-parsing is enabled or when using the `parse` method. |
| required | `boolean` or "always" | all | Specifies whether an attribute is required. A value of `true` requires the attribute for all `put` operations. A `string` value of "always" requires the attribute for `put` and `update` operations. |
| alias | `string` | all | Adds a bidirectional alias to the attribute. All input methods can use either the attribute name or the alias when passing in data. Auto-parsing and the `parse` method will map attributes to their alias. |
| map | `string` | all | The inverse of the `alias` option, allowing you to specify your alias as the key and map it to an attribute name. |
| setType | `string` | `set` | Specifies the type for `set` attributes. Allowed values are `string`, `number`, `binary` |
| delimiter | `string` | composite keys | Specifies the delimiter to use if this attribute stores a composite key (see Using an array for composite keys) |
| prefix | `string` | `string` | A prefix to be added to an attribute when saved to DynamoDB. This prefix will be removed when parsing the data. |
| suffix | `string` | `string` | A suffix to be added to an attribute when saved to DynamoDB. This suffix will be removed when parsing the data. |
| transform | `function` | all | A function that transforms the input before sending it to DynamoDB. This accepts two arguments: the value passed and an object containing the data from other attributes. |
| partitionKey | `boolean` or `string` | all | Flags an attribute as the 'partitionKey' for this Entity. If set to `true`, it will be mapped to the Table's `partitionKey`. If set to the name of an index defined on the Table, it will be mapped to the secondary index's `partitionKey` |
| sortKey | `boolean` or `string` | all | Flags an attribute as the 'sortKey' for this Entity. If set to `true`, it will be mapped to the Table's `sortKey`. If set to the name of an index defined on the Table, it will be mapped to the secondary index's `sortKey` |

NOTE: One attribute must be set as the `partitionKey`. If the table defines a `sortKey`, one attribute must be set as the `sortKey`. Assignment of secondary indexes is optional. If an attribute is used across multiple indexes, an `array` can be used to specify multiple values.
Example:
attributes: {
user_id: { partitionKey: true },
sk: { type: 'number', hidden: true, sortKey: true },
data: { coerce: false, required: true, alias: 'name' },
departments: { type: 'set', setType: 'string', map: 'dept' },
...
}
Using an array for composite keys
NOTE: The interface for composite keys may be changing in v0.2 to make it easier to customize.
Composite keys in DynamoDB are incredibly useful for creating hierarchies, one-to-many relationships, and other powerful querying capabilities (see here). The DynamoDB Toolbox lets you easily work with composite keys in a number of ways. In some cases, there is no need to store the data in the same record twice if you are already combining it into a single attribute. By using composite key mappings, you can store data together in a single field, but still be able to structure input data and parse the output into separate attributes.
The basic syntax is to specify an `array` with the mapped attribute name as the first element, and the index in the composite key as the second element. For example:
attributes: {
user_id: { partitionKey: true },
sk: { hidden: true, sortKey: true },
status: ['sk',0],
date: ['sk',1],
...
}
This maps the `status` and `date` attributes to the `sk` attribute. If a `status` and `date` are supplied, they will be combined into the `sk` attribute as `[status]#[date]`. When the data is retrieved, the `parse` method will automatically split the `sk` attribute and return the values with `status` and `date` keys. By default, the values of composite keys are stored as separate attributes, but that can be changed by adding in an option configuration as the third array element.
Passing in a configuration
Composite key mappings are `string`s by default, but can be overridden by specifying either `string`, `number`, or `boolean` as the third element in the array. Composite keys are automatically coerced into `string`s, so only the aforementioned types are allowed. You can also pass in a configuration `object` as the third element. This uses the same configuration properties as above. In addition to these properties, you can also specify a `boolean` property of `save`. This will write the value to the mapped composite key, but also add a separate attribute that stores the value.
attributes: {
user_id: { partitionKey: true },
sk: { hidden: true, sortKey: true },
status: ['sk',0, { type: 'boolean', save: false, default: true }],
date: ['sk',1, { required: true }],
...
}
Customize defaults with a function
In simple situations, defaults can be static values. However, for advanced use cases, you can specify an anonymous function to dynamically calculate the value. The function takes a single argument that contains an object of the input data (including aliases). This opens up a number of really powerful use cases:
Generate the current date and time:
attributes: {
user_id: { partitionKey: true },
created: { default: () => new Date().toISOString() },
...
}
Generate a custom composite key:
attributes: {
user_id: { partitionKey: true },
sk: { sortKey: true, default: (data) => `sort-${data.status}|${data.date_added}` },
status: 'boolean',
date_added: 'string'
...
}
Create conditional defaults:
attributes: {
user_id: { partitionKey: true },
sk: {
sortKey: true,
default: (data) => {
if (data.status && data.date_added) {
return data.date_added
} else {
return null // field will not be defaulted
}
}
},
status: 'boolean',
date_added: 'string'
...
}
Table Properties
get/set DocumentClient
The `DocumentClient` property allows you to get a reference to the table's assigned `DocumentClient`, or to add/update the table's `DocumentClient`. When setting this property, it must be a valid instance of the AWS DocumentClient.
get/set entities
The `entities` property is used to add entities to the table. When adding entities, the property accepts either an `array` of `Entity` instances, or a single `Entity` instance. This will add the entities to the table and create a table property with the same name as each entity's `name`. For example, if an entity with the name `User` is assigned: `MyTable.entities = User`, then the Entity and its properties and methods will be accessible via `MyTable.User`.

When read, the `entities` property will retrieve an `array` of `string`s containing all entity names attached to the table.
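For example, a minimal sketch (assuming `User` and `Order` are `Entity` instances defined elsewhere, with keys matching their own definitions):

// Attach one or more entities to the table
MyTable.entities = [User, Order]

// Each attached entity is now accessible as a property of the table
let response = await MyTable.User.get({ id: 123 })

// Reading the property returns the attached entity names
console.log(MyTable.entities) // e.g. ['User', 'Order']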
get/set autoExecute
This property will retrieve a `boolean` indicating the current `autoExecute` setting on the table. You can change this setting by supplying a `boolean` value.
get/set autoParse
This property will retrieve a `boolean` indicating the current `autoParse` setting on the table. You can change this setting by supplying a `boolean` value.
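A short sketch of how these settings relate to the synchronous `*Params` methods described below, using the `Customer` entity from the basic usage example (the exact return shape of a non-executed call is an assumption here; treat this as illustrative):

// Turn off automatic execution for the whole table
MyTable.autoExecute = false

// Methods now resolve to generated parameters instead of calling DynamoDB
let params = await Customer.get({ id: 123, status: 'active', date_added: '2020-04-24' })

// Alternatively, leave autoExecute on and use the synchronous *Params variant
let sameParams = Customer.getParams({ id: 123, status: 'active', date_added: '2020-04-24' })

// Restore automatic execution and parsing
MyTable.autoExecute = true
MyTable.autoParse = true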
Table Methods
query(partitionKey [,options] [,parameters])
The Query operation finds items based on primary key values. You can query any table or secondary index that has a composite primary key (a partition key and a sort key).
The `query` method is a wrapper for the DynamoDB Query API. The DynamoDB Toolbox `query` method supports all Query API operations. The `query` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `queryParams` can be used, but will only retrieve the generated parameters.
The `query()` method accepts three arguments. The first argument is used to specify the `partitionKey` you wish to query against (KeyConditionExpression). The value must match the type of your table's partition key.

The second argument is an `options` object that specifies the details of your query. The following options are all optional (corresponding Query API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| index | `string` | Name of the secondary index to query. If not specified, the query executes on the primary index. The index must include the table's `entityField` attribute for automatic parsing of returned data. (IndexName) |
| limit | `number` | The maximum number of items to retrieve per query. (Limit) |
| reverse | `boolean` | Reverse the order of returned items. (ScanIndexForward) |
| consistent | `boolean` | Enable a consistent read of the items (ConsistentRead) |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| select | `string` | The attributes to be returned in the result. One of either `all_attributes`, `all_projected_attributes`, `specific_attributes`, or `count` (Select) |
| eq | same as `sortKey` | Specifies `sortKey` condition to be equal to supplied value. (KeyConditionExpression) |
| lt | same as `sortKey` | Specifies `sortKey` condition to be less than supplied value. (KeyConditionExpression) |
| lte | same as `sortKey` | Specifies `sortKey` condition to be less than or equal to supplied value. (KeyConditionExpression) |
| gt | same as `sortKey` | Specifies `sortKey` condition to be greater than supplied value. (KeyConditionExpression) |
| gte | same as `sortKey` | Specifies `sortKey` condition to be greater than or equal to supplied value. (KeyConditionExpression) |
| between | `array` | Specifies `sortKey` condition to be between the supplied values. Array should have two values matching the `sortKey` type. (KeyConditionExpression) |
| beginsWith | same as `sortKey` | Specifies `sortKey` condition to begin with the supplied value. (KeyConditionExpression) |
| filters | `array` or `object` | A complex `object` or `array` of objects that specifies the query's filter condition. See Filters and Conditions. (FilterExpression) |
| attributes | `array` or `object` | An `array` or array of complex `object`s that specify which attributes should be returned. See Projection Expression below (ProjectionExpression) |
| startKey | `object` | An object that contains the `partitionKey` and `sortKey` of the first item that this operation will evaluate. (ExclusiveStartKey) |
| entity | `string` | The name of a table Entity to evaluate `filters` and `attributes` against. |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Entity) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Entity) |
If you prefer to specify your own parameters, the optional third argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
let result = await MyTable.query(
'user#12345', // partition key
{
limit: 50, // limit to 50 items
beginsWith: 'order#', // select items where sort key begins with value
reverse: true, // return items in descending order (newest first)
capacity: 'indexes', // return the total capacity consumed by the indexes
filters: { attr: 'total', gt: 100 }, // only show orders above $100
index: 'GSI1' // query the GSI1 secondary index
}
)
Return Data
The data is returned with the same response syntax as the DynamoDB Query API. If `autoExecute` and `autoParse` are enabled, any `Items` data returned will be parsed into its corresponding Entity's aliases. Otherwise, the DocumentClient will return the unmarshalled data. If the response is parsed by the library, a `.next()` method will be available on the returned object. Calling this function will call the `query` method again using the same parameters and passing the `LastEvaluatedKey` in as the `ExclusiveStartKey`. This is a convenience method for paginating the results.
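For example, a sketch of paginating through all matching items with `.next()` (this assumes automatic execution and parsing are enabled so the parsed response exposes `.next()`):

// Query orders for a user, 50 items per page
let response = await MyTable.query('user#12345', { limit: 50, beginsWith: 'order#' })
let items = response.Items

// Keep fetching pages while DynamoDB reports more data
while (response.LastEvaluatedKey) {
  response = await response.next()
  items = items.concat(response.Items)
}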
scan([options] [,parameters])
The Scan operation returns one or more items and item attributes by accessing every item in a table or a secondary index.
The `scan` method is a wrapper for the DynamoDB Scan API. The DynamoDB Toolbox `scan` method supports all Scan API operations. The `scan` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `scanParams` can be used, but will only retrieve the generated parameters.

The `scan()` method accepts two arguments. The first argument is an `options` object that specifies the details of your scan. The following options are all optional (corresponding Scan API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| index | `string` | Name of the secondary index to scan. If not specified, the scan executes on the primary index. The index must include the table's `entityField` attribute for automatic parsing of returned data. (IndexName) |
| limit | `number` | The maximum number of items to retrieve per scan. (Limit) |
| consistent | `boolean` | Enable a consistent read of the items (ConsistentRead) |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| select | `string` | The attributes to be returned in the result. One of either `all_attributes`, `all_projected_attributes`, `specific_attributes`, or `count` (Select) |
| filters | `array` or `object` | A complex `object` or `array` of objects that specifies the scan's filter condition. See Filters and Conditions. (FilterExpression) |
| attributes | `array` or `object` | An `array` or array of complex `object`s that specify which attributes should be returned. See Projection Expression below (ProjectionExpression) |
| startKey | `object` | An object that contains the `partitionKey` and `sortKey` of the first item that this operation will evaluate. (ExclusiveStartKey) |
| segments | `number` | For a parallel `scan` request, `segments` represents the total number of segments into which the `scan` operation will be divided. (TotalSegments) |
| segment | `number` | For a parallel `scan` request, `segment` identifies an individual segment to be scanned by an application worker. (Segment) |
| entity | `string` | The name of a table Entity to evaluate `filters` and `attributes` against. |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Entity) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Entity) |
If you prefer to specify your own parameters, the optional second argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
let result = await MyTable.scan(
{
limit: 100, // limit to 100 items
capacity: 'indexes', // return the total capacity consumed by the indexes
filters: { attr: 'total', between: [100,500] }, // only return orders between $100 and $500
index: 'GSI1' // scan the GSI1 secondary index
}
)
Return Data
The data is returned with the same response syntax as the DynamoDB Scan API. If autoExecute
and autoParse
are enabled, any Items
data returned will be parsed into its corresponding Entity's aliases. Otherwise, the DocumentClient will return the unmarshalled data. If the response is parsed by the library, a .next()
method will be available on the returned object. Calling this function will call the scan
method again using the same parameters and passing the LastEvaluatedKey
in as the ExclusiveStartKey
. This is a convenience method for paginating the results.
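The `segments` and `segment` options enable parallel scans, for example one scan call per worker. A rough sketch (the segment count and other options are illustrative):

// Run a 4-segment parallel scan and merge the results
const totalSegments = 4
const pages = await Promise.all(
  [...Array(totalSegments).keys()].map(segment =>
    MyTable.scan({ segments: totalSegments, segment, limit: 100 })
  )
)
// Each entry follows the standard Scan response syntax
const allItems = pages.flatMap(page => page.Items)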
batchGet(items [,options] [,parameters])
The BatchGetItem operation returns the attributes of one or more items from one or more tables. You identify requested items by primary key.
The `batchGet` method is a wrapper for the DynamoDB BatchGetItem API. The DynamoDB Toolbox `batchGet` method supports all BatchGetItem API operations. The `batchGet` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `batchGetParams` can be used, but will only retrieve the generated parameters.

The `batchGet` method accepts three arguments. The first is an `array` of item keys to get. The DynamoDB Toolbox provides the `getBatch` method on your entities to help you generate the proper key configuration. You can specify different entity types as well as entities from different tables, and this library will handle the proper payload construction.

The optional second argument accepts an `options` object. The following options are all optional (corresponding BatchGetItem API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| consistent | `boolean` or `object` (see below) | Enable a consistent read of the items (ConsistentRead) |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| attributes | `array` or `object` (see below) | An `array` or array of complex `object`s that specify which attributes should be returned. See Projection Expression below (ProjectionExpression) |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Entity) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Entity) |
Specifying options for multiple tables
The library is built to make working with single table designs easier, but it is possible that you may need to retrieve data from multiple tables within the same batch get. If your `items` contain references to multiple tables, the `consistent` option will accept objects that use either the table `name` or `alias` as the key, and the setting as the value. For example, to specify different `consistent` settings on two tables, you would use something like the following:
consistent: {
'my-table-name': true,
'my-other-table-name': false
}
Setting either value without the `object` structure will set the option for all referenced tables. If you are referencing multiple tables and using the `attributes` option, then you must use the same `object` method to specify the table `name` or `alias`. The value should follow the standard Projection Expression formatting.
const results = await MyTable.batchGet(
[
MyTable.User.getBatch({ family: 'Brady', name: 'Mike' }),
MyTable.User.getBatch({ family: 'Brady', name: 'Carol' }),
MyTable.Pet.getBatch({ family: 'Brady', name: 'Tiger' })
],
{
capacity: 'total',
attributes: [
'name', 'family',
{ User: ['dob', 'age'] },
{ Pet: ['petType','lastVetCheck'] }
]
}
)
If you prefer to specify your own parameters, the optional third argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
Return Data
The data is returned with the same response syntax as the DynamoDB BatchGetItem API. If autoExecute
and autoParse
are enabled, any Responses
data returned will be parsed into its corresponding Entity's aliases. Otherwise, the DocumentClient will return the unmarshalled data. If the response is parsed by the library, a .next()
method will be available on the returned object. Calling this function will call the batchGet
method again using the same options and passing any UnprocessedKeys
in as the RequestItems
. This is a convenience method for retrying unprocessed keys.
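For example, a sketch of retrying unprocessed keys with `.next()` (assuming automatic execution and parsing are enabled):

// Initial batch get across two entities
let batch = await MyTable.batchGet([
  MyTable.User.getBatch({ family: 'Brady', name: 'Mike' }),
  MyTable.Pet.getBatch({ family: 'Brady', name: 'Tiger' })
])

// Retry while DynamoDB reports unprocessed keys
while (batch.UnprocessedKeys && Object.keys(batch.UnprocessedKeys).length > 0) {
  batch = await batch.next()
}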
batchWrite(items [,options] [,parameters])
The BatchWriteItem operation puts or deletes multiple items in one or more tables. A single call to BatchWriteItem can write up to 16 MB of data, which can comprise as many as 25 put or delete requests.
The `batchWrite` method is a wrapper for the DynamoDB BatchWriteItem API. The DynamoDB Toolbox `batchWrite` method supports all BatchWriteItem API operations. The `batchWrite` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `batchWriteParams` can be used, but will only retrieve the generated parameters.

The `batchWrite` method accepts three arguments. The first is an `array` of item keys to either `put` or `delete`. The DynamoDB Toolbox provides a `putBatch` and `deleteBatch` method on your entities to help you generate the proper key configuration for each item. You can specify different entity types as well as entities from different tables, and this library will handle the proper payload construction.

The optional second argument accepts an `options` object. The following options are all optional (corresponding BatchWriteItem API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| capacity | `string` or `object` (see below) | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| metrics | `string` | Return item collection metrics. If set to `size`, the response includes statistics about item collections, if any, that were modified during the operation. One of either `none` or `size` (ReturnItemCollectionMetrics) |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Entity) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Entity) |
NOTE: The `BatchWriteItem` operation does not support conditions or return deleted items. "BatchWriteItem does not behave in the same way as individual PutItem and DeleteItem calls would. For example, you cannot specify conditions on individual put and delete requests, and BatchWriteItem does not return deleted items in the response." ~ DynamoDB BatchWriteItem API
const result = await MyTable.batchWrite(
[
MyTable.User.putBatch({ family: 'Brady', name: 'Carol', age: 40, roles: ['mother','wife'] }),
MyTable.User.putBatch({ family: 'Brady', name: 'Mike', age: 42, roles: ['father','husband'] }),
MyTable.Pet.deleteBatch({ family: 'Brady', name: 'Tiger' })
],{
capacity: 'total',
metrics: 'size',
}
)
If you prefer to specify your own parameters, the optional third argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
Return Data
The data is returned with the same response syntax as the DynamoDB BatchWriteItem API. If autoExecute
and autoParse
are enabled, a .next()
method will be available on the returned object. Calling this function will call the batchWrite
method again using the same options and passing any UnprocessedItems
in as the RequestItems
. This is a convenience method for retrying unprocessed keys.
transactGet(items [,options] [,parameters])
TransactGetItems is a synchronous operation that atomically retrieves multiple items from one or more tables (but not from indexes) in a single account and Region.
The `transactGet` method is a wrapper for the DynamoDB TransactGetItems API. The DynamoDB Toolbox `transactGet` method supports all TransactGetItems API operations. The `transactGet` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `transactGetParams` can be used, but will only retrieve the generated parameters.

The `transactGet` method accepts three arguments. The first is an `array` of item keys to get. The DynamoDB Toolbox provides the `getTransaction` method on your entities to help you generate the proper key configuration. You can specify different entity types as well as entities from different tables, and this library will handle the proper payload construction.

The optional second argument accepts an `options` object. The following options are all optional (corresponding TransactGetItems API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Table) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Table) |
Accessing items from multiple tables
Transaction items are atomic, so each `Get` contains the table name and key necessary to retrieve the item. The library will automatically handle adding the necessary information and will parse each entity automatically for you.
const results = await MyTable.transactGet(
[
User.getTransaction({ family: 'Brady', name: 'Mike' }),
User.getTransaction({ family: 'Brady', name: 'Carol' }),
Pet.getTransaction({ family: 'Brady', name: 'Tiger' })
],
{ capacity: 'total' }
)
If you prefer to specify your own parameters, the optional third argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
Return Data
The data is returned with the same response syntax as the DynamoDB TransactGetItems API. If autoExecute
and autoParse
are enabled, any Responses
data returned will be parsed into its corresponding Entity's aliases. Otherwise, the DocumentClient will return the unmarshalled data.
transactWrite(items [,options] [,parameters])
TransactWriteItems is a synchronous write operation that groups up to 25 action requests. The actions are completed atomically so that either all of them succeed, or all of them fail.
The `transactWrite` method is a wrapper for the DynamoDB TransactWriteItems API. The DynamoDB Toolbox `transactWrite` method supports all TransactWriteItems API operations. The `transactWrite` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `transactWriteParams` can be used, but will only retrieve the generated parameters.

The `transactWrite` method accepts three arguments. The first is an `array` of item keys to either `put`, `delete`, `update` or `conditionCheck`. The DynamoDB Toolbox provides `putTransaction`, `deleteTransaction`, `updateTransaction`, and `conditionCheck` methods on your entities to help you generate the proper configuration for each item. You can specify different entity types as well as entities from different tables, and this library will handle the proper payload construction.

The optional second argument accepts an `options` object. The following options are all optional (corresponding TransactWriteItems API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| metrics | `string` | Return item collection metrics. If set to `size`, the response includes statistics about item collections, if any, that were modified during the operation. One of either `none` or `size` (ReturnItemCollectionMetrics) |
| token | `string` | Optional token to make the call idempotent, meaning that multiple identical calls have the same effect as one single call. (ClientRequestToken) |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Entity) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Entity) |
const result = await MyTable.transactWrite(
[
Pet.conditionCheck({ family: 'Brady', name: 'Tiger' }, { conditions: { attr: 'alive', eq: false } }),
Pet.deleteTransaction({ family: 'Brady', name: 'Tiger' }),
User.putTransaction({ family: 'Brady', name: 'Carol', age: 40, roles: ['mother','wife'] }),
User.putTransaction({ family: 'Brady', name: 'Mike', age: 42, roles: ['father','husband'] })
],{
capacity: 'total',
metrics: 'size',
}
)
If you prefer to specify your own parameters, the optional third argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
Return Data
The data is returned with the same response syntax as the DynamoDB TransactWriteItems API.
parse(entity, input [,include])
Executes the `parse` method of the supplied `entity`. The `entity` must be a `string` that references the name of an Entity associated with the table. See the Entity `parse` method for additional parameters and behavior.
get(entity, key [,options] [,parameters])
Executes the `get` method of the supplied `entity`. The `entity` must be a `string` that references the name of an Entity associated with the table. See the Entity `get` method for additional parameters and behavior.
delete(entity, key [,options] [,parameters])
Executes the `delete` method of the supplied `entity`. The `entity` must be a `string` that references the name of an Entity associated with the table. See the Entity `delete` method for additional parameters and behavior.
put(entity, item [,options] [,parameters])
Executes the `put` method of the supplied `entity`. The `entity` must be a `string` that references the name of an Entity associated with the table. See the Entity `put` method for additional parameters and behavior.
update(entity, key [,options] [,parameters])
Executes the `update` method of the supplied `entity`. The `entity` must be a `string` that references the name of an Entity associated with the table. See the Entity `update` method for additional parameters and behavior.
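In other words, these table methods are thin pass-throughs to the corresponding Entity methods, keyed by entity name. A sketch using the Customer entity from the basic usage example (the update payload here is illustrative):

// Equivalent to calling Customer.get(...) directly
let response = await MyTable.get(
  'Customer',
  { id: 123, status: 'active', date_added: '2020-04-24' }
)

// The same pattern applies to delete, put, and update
await MyTable.update(
  'Customer',
  { id: 123, status: 'active', date_added: '2020-04-24', age: 36 }
)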
Entity Properties
get/set table
Retrieves a reference to the Table instance that the Entity is attached to. You can use this property to add the Entity to a Table by assigning it a valid Table instance. Note that you cannot change a table once it has been assigned.
get DocumentClient
The `DocumentClient` property retrieves a reference to the table's assigned `DocumentClient`. This value cannot be updated by the Entity.
get/set autoExecute
This property will retrieve a `boolean` indicating the current `autoExecute` setting on the entity. If no value is set, it will return the inherited value from the attached table. You can change this setting for the current entity by supplying a `boolean` value.
get/set autoParse
This property will retrieve a `boolean` indicating the current `autoParse` setting on the entity. If no value is set, it will return the inherited value from the attached table. You can change this setting for the current entity by supplying a `boolean` value.
get partitionKey
Returns the Entity's assigned `partitionKey`.
get sortKey
Returns the Entity's assigned `sortKey`.
Entity Methods
attribute(attribute)
Returns the Table's attribute name for the supplied `attribute`. The `attribute` must be a `string` and can be either a valid attribute name or alias.
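For example, with the Customer entity from the basic usage section (which maps `name` to the table attribute `data` and aliases `co` as `company`), a quick sketch:

Customer.attribute('company') // returns 'co'
Customer.attribute('name')    // returns 'data'
Customer.attribute('co')      // returns 'co'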
parse(input [,include])
Parses attributes returned from a DynamoDB action and unmarshalls them into entity aliases. The `input` argument accepts an `object` with attributes as keys, an `array` of `object`s with attributes as keys, or an `object` with either an `Item` or `Items` property. This method will return a result of the same type as the `input`. For example, if you supply an `array` of objects, an `array` will be returned. If you supply an object with an `Item` property, an `object` will be returned.

You can also pass in an `array` of strings as the second argument. The unmarshalling will only return the attributes (or aliases) specified in this `include` array.
If auto execute and auto parsing are enabled, data returned from a DynamoDB action will automatically be parsed.
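A sketch of manually parsing a raw item into entity aliases (the raw attribute values mirror the storage example from the basic usage section; the `_et` field is shown under the library's default entity tracking behavior):

// Raw data as stored in the table (e.g. from a DocumentClient call with parsing disabled)
let raw = {
  Item: {
    pk: 123,
    sk: 'active#2020-04-24',
    data: 'Jane Smith',
    co: 'ACME',
    age: 35,
    _et: 'Customer'
  }
}

// Unmarshall into aliases and composite key fields
let customer = Customer.parse(raw)

// Limit the returned attributes/aliases with the optional 'include' array
let nameAndCompany = Customer.parse(raw, ['name', 'company'])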
get(key [,options] [,parameters])
The GetItem operation returns a set of attributes for the item with the given primary key.
The `get` method is a wrapper for the DynamoDB GetItem API. The DynamoDB Toolbox `get` method supports all GetItem API operations. The `get` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `getParams` can be used, but will only retrieve the generated parameters.
The `get` method accepts three arguments. The first argument accepts an `object` that is used to specify the primary key of the item you wish to "get" (Key). The `object` must contain keys for the attributes that represent your `partitionKey` and `sortKey` (if a compound key) with their values as the key values. For example, if `user_id` represents your `partitionKey`, and `status` represents your `sortKey`, to retrieve user_id "123" with a status of "active", you would specify `{ user_id: 123, status: 'active' }` as your `key`.

The optional second argument accepts an `options` object. The following options are all optional (corresponding GetItem API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| consistent | `boolean` | Enable a consistent read of the items (ConsistentRead) |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| attributes | `array` or `object` | An `array` or array of complex `object`s that specify which attributes should be returned. See Projection Expression below (ProjectionExpression) |
| execute | `boolean` | Enables/disables automatic execution of the DocumentClient method (default: inherited from Entity) |
| parse | `boolean` | Enables/disables automatic parsing of returned data when `autoExecute` evaluates to `true` (default: inherited from Entity) |
If you prefer to specify your own parameters, the optional third argument allows you to add custom parameters. See Adding custom parameters and clauses for more information.
// Specify my key
let key = {
id: 123,
status: 'active',
date_added: '2020-04-24'
}
// Use the 'get' method of MyEntity to retrieve the item from DynamoDB
let result = await MyEntity.get(
key,
{ consistent: true }
)
delete(key [,options] [,parameters])
Deletes a single item in a table by primary key.
The `delete` method is a wrapper for the DynamoDB DeleteItem API. The DynamoDB Toolbox `delete` method supports all DeleteItem API operations. The `delete` method returns a `Promise` and you must use `await` or `.then()` to retrieve the results. An alternative, synchronous method named `deleteParams` can be used, but will only retrieve the generated parameters.

The `delete` method accepts three arguments. The first argument accepts an `object` that is used to specify the primary key of the item you wish to "delete" (Key). For example: `{ user_id: 123, status: 'active' }`

The optional second argument accepts an `options` object. The following options are all optional (corresponding DeleteItem API references in parentheses):
| Option | Type | Description |
| -------- | :--: | ----------- |
| conditions | `array` or `object` | A complex `object` or `array` of objects that specifies the conditions that must be met to delete the item. See Filters and Conditions. (ConditionExpression) |
| capacity | `string` | Return the amount of consumed capacity. One of either `none`, `total`, or `indexes` (ReturnConsumedCapacity) |
| metrics | `string` | Return item collection metrics. If set to `size`, the response includes statistics about item collections, if any, that were modified during the operation. One of either `none` or `size` (ReturnItemCollectionMetrics) |