dynongo


MongoDB like syntax for DynamoDB

Installation

npm install --save dynongo

Usage

Connect

First of all, we have to connect to the database.

const db = require('dynongo');

db.connect();

Credentials

Please use IAM roles or environment variables to connect to the DynamoDB database. This way, no keys have to be embedded in your code. You can find more information on the SDK page.
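
As a minimal sketch, assuming the standard AWS SDK environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_REGION) are already set, or that the code runs under an IAM role, no credentials need to be passed at all.

const db = require('dynongo');

// The underlying AWS SDK resolves the credentials and region from the
// environment (or from the IAM role), so nothing is embedded in the code.
db.connect();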

If you still want to use embedded credentials, you can do so by providing an accessKeyId, a secretAccessKey and an optional region property.

db.connect({
	accessKeyId: 'AKIAI44QH8DHBEXAMPLE',
	secretAccessKey: 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
	region: 'us-west-1'
});

Or, if you'd rather work with temporary security credentials, you can do that as well.

db.connect({
	accessKeyId: 'AKIAI44QH8DHBEXAMPLE',
	secretAccessKey: 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
	sessionToken: 'AQoDYXdzEJr...<remainder of security token>',
	region: 'us-west-1'
});

Retry

The retry configuration can be passed during initialisation, or per individual query. The mechanism is based on p-retry and accepts the same options. Configuring retry allows the DynamoDB operation to be retried automatically when it fails with a retryable error.

db.connect({
	retries: {
		retries: 3,
		factor: 1,
		randomize: false
	}
})

You can also simply pass a number when you don't want to configure a full retry strategy.

db.connect({
	retries: 3
})

DynamoDB Local

It is possible to connect to a local DynamoDB database by setting the local property to true. It will use port 8000 by default, but if you want to change that port, you can provide a localPort property.

db.connect({
	local: true,
	host: '192.168.5.5',            // localhost if not provided
	localPort: 4444                 // 8000 if not provided
});

Prefixing tables

It's good practice to prefix the tables with the name of the project and maybe the environment, like production or staging. Instead of repeating those names every time you want to query a table, you can provide the prefix and prefix delimiter once. The default delimiter is a dot (.).

db.connect({
	prefix: 'myapp-development',
	prefixDelimiter: '-'            // . if not provided
});

Tables

Before you can execute methods on a table, you have to retrieve the table object from the database.

const Employee = db.table('Employee');

The table name will be automatically prefixed by the prefix provided in the connection object.
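
For example, with the prefix configuration shown earlier, the following call would refer to the physical table myapp-development-Employee (the resolved name here is illustrative).

// Assuming db.connect({prefix: 'myapp-development', prefixDelimiter: '-'})
// was called, this resolves to the physical table 'myapp-development-Employee'.
const Employee = db.table('Employee');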

If you provided a prefix in the connection object but don't want it for a specific table, you can ask for a raw table. A raw table is a regular table without the prefix.

const Employee = db.rawTable('Employee');

Methods

Every method can override the retry options passed to the .connect() method and customise the retry configuration for that specific call.

Employee
	.find({Organisation: 'Amazon'})
	.where({Salary: {$gt: 3000}})
	.select('FirstName Name')
	.retry({retries: 3, factor: 1, randomize: false})
	.exec()
	.then(employees => {
		// => [{FirstName: 'Foo', Name: 'Bar'}]
	});

If you don't want to configure the retry strategy, you can simply pass the number of retries.

Employee
	.find({Organisation: 'Amazon'})
	.where({Salary: {$gt: 3000}})
	.select('FirstName Name')
	.retry(2)
	.exec()
	.then(employees => {
		// => [{FirstName: 'Foo', Name: 'Bar'}]
	});

find

Employee.find({Organisation: 'Amazon'}).where({Salary: {$gt: 3000}}).select('FirstName Name').exec()
	.then(employees => {
		// => [{FirstName: 'Foo', Name: 'Bar'}]
	});

findOne

Employee.findOne({Organisation: 'Amazon'}).where({Salary: {$between: [3000, 4000]}}).select('FirstName Name').exec()
	.then(employee => {
		// => {FirstName: 'Foo', Name: 'Bar'}
	});

count

Employee.find({Organisation: 'Amazon'}).where({Salary: {$gt: 3000}}).count().exec()
	.then(count => {
		// => 8
	});

insert

Employee.insert({Organisation: 'Amazon', Email: '[email protected]'}, {Title: 'CFO', HiredAt: 'last year', FirstName: 'Foo', Name: 'Bar', Salary: 4500}).exec()
	.then(employee => {
		// => {FirstName: 'Foo', Name: 'Bar', Salary: 4500, Title: 'CFO', HiredAt: 'last year', Organisation: 'Amazon', Email: '[email protected]'}
	});

update

The first parameter of the update method is the primary key (hash + range) and the second parameter is a query that defines the updates to the fields.

You can use $set: { field: { $ifNotExists: value } } to only set the value if the field does not exist on the record.

Employee.update({Organisation: 'Amazon', Email: '[email protected]'}, {$set: {Title: 'CTO', HiredAt: {$ifNotExists: 'today'}}, $inc: {Salary: 150}, $push: {Hobby: {$each: ['swimming', 'walking']}}}).exec()
	.then(employee => {
		// => {FirstName: 'Foo', Name: 'Bar', Salary: 4650, Title: 'CTO', HiredAt: 'last year', Organisation: 'Amazon', Email: '[email protected]', Hobby: ['cycling', 'swimming', 'walking']}
	});

Or, when working with sets, you can use $addToSet to add unique values to a set. It supports a single value, arrays and the $each operator.

Employee.update({Organisation: 'Amazon', Email: '[email protected]'}, {$addToSet: {Departments: ['IT', 'IT', 'HR']}}).exec()
	.then(employee => {
		// => {FirstName: 'Foo', Name: 'Bar', Salary: 4650, Title: 'CTO', Organisation: 'Amazon', Email: '[email protected]', Hobby: ['cycling', 'swimming', 'walking'], Departments: ['IT', 'HR']}
	});

You can use $removeFromSet to remove one or more elements from a set.

Employee.update({Organisation: 'Amazon', Email: '[email protected]'}, {$removeFromSet: {Departments: ['IT']}}).exec()
	.then(employee => {
		// => {FirstName: 'Foo', Name: 'Bar', Salary: 4650, Title: 'CTO', Organisation: 'Amazon', Email: '[email protected]', Hobby: ['cycling', 'swimming', 'walking'], Departments: ['HR']}
	});

You can use $unshift to prepend one or more values to a list.

Employee.update({Organisation: 'Amazon', Email: '[email protected]'}, {$unshift: {Hobby: 'programming'}}).exec()
	.then(employee => {
		// => {FirstName: 'Foo', Name: 'Bar', Salary: 4650, Title: 'CTO', Organisation: 'Amazon', Email: '[email protected]', Hobby: ['programming', 'cycling', 'swimming', 'walking'], Departments: ['IT']}
	});

If no Amazon employee with that email address exists, the method will fail.

You can also add extra conditions, for instance if we want to increase the salary by $150 only if the current salary is less than $4500.

Employee.update({Organisation: 'Amazon', Email: '[email protected]'}, {$inc: {Salary: 150}}).where({Salary: {$lt: 4500}}).exec()
	.catch(err => {
		// ConditionalCheckFailedException: The conditional request failed
	});

remove

The remove method expects the primary key (hash + range).

Employee.remove({Organisation: 'Amazon', Email: '[email protected]'}).exec()
	.then(() => {
		// => removed
	});

findOneAndRemove

This method is the same as the remove method, except that it will return the removed record.

Employee.findOneAndRemove({Organisation: 'Amazon', Email: '[email protected]'}).exec()
	.then(result => {
		// => {Organisation: 'Amazon', Email: '[email protected]'}
	});

Read consistency

By default, all reads are eventually consistent, which means the response might include some stale data.

When you request a strongly consistent read, DynamoDB returns a response with the most up-to-date data, reflecting the updates from all prior write operations that were successful.

Dynongo supports strongly consistent reads by adding the .consistent() chaining operator.

Employee
	.find({Organisation: 'Amazon'})
	.where({Salary: {$gt: 3000}})
	.select('FirstName Name')
	.consistent()
	.exec()
	.then(employees => {
		// => [{FirstName: 'Foo', Name: 'Bar'}]
	});

More information can be found in the AWS documentation.

Batch Write

The BatchWriteItem operation puts or deletes multiple items in one or more tables. A single call to BatchWrite can write up to 16 MB of data, which can comprise as many as 25 put or delete requests. Individual items to be written can be as large as 400 KB.

You can create Put and Delete requests by calling the corresponding methods on the table with the correct parameters.

const result = await db.batchWrite(
	Table1.createBatchPutItem(
		{partitionKey: 'PK', sortKey: 'SK'},
		{name: 'Sander', lastname: 'Machado'}
	),
	Table1.createBatchPutItem(
		{partitionKey: 'PK', sortKey: 'SK23'},
		{name: 'Sander', lastname: 'Doe'}
	),
	Table2.createBatchDeleteItem(
		{partitionKey: '123', sortKey: '456'}
	),
	Table2.createBatchDeleteItem(
		{partitionKey: 'PK2', sortKey: 'SK3'}
	),
	Table2.createBatchPutItem(
		{partitionKey: 'PK', sortKey: 'SK'},
		{name: 'name', lastname: 'lastname'}
	)
).exec();

Transactions

The library also supports transactions. Transactions simplify the developer experience of making coordinated, all-or-nothing changes to multiple items both within and across tables. You can only provide up to 10 transaction requests per transaction.

Read Transactions

import dynongo from 'dynongo';

const result = await dynongo
	.transactRead(
		dynongo.table('User')
			.find({Id: '1234', Key: 'BankRoll'}),
		dynongo.table('BankAccount')
			.find({Key: 'Salary'})
	)
	.exec();

//=> [{Id: '1234', Key: 'BankRoll', Value: 100}, {Key: 'Salary', Value: 1500}]

Write Transactions

For instance, what if we want to increment the bankroll of a user, but only if we still have enough money in our own bank account?

import dynongo from 'dynongo';

await dynongo
	.transactWrite(
		dynongo.table('User')
			.update({Id: '1234', Key: 'BankRoll'}, {$inc: {Amount: 150}})
	)
	.withConditions(
		dynongo.table('BankAccount')
			.find({Key: 'Salary'})
			.where({value: {$gte: 150}})
	)
	.exec();

List all the tables

You can retrieve a list of all the tables.

db.listTables().exec().then(tables => {
	console.log(tables);
	//=> ['foo', 'bar', 'baz']
});

If you passed in a prefix property in the connection object, only the tables with that prefix will be returned.
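
As a sketch, assuming the 'myapp-development' prefix from the earlier connection example, the same call would then only list that project's tables.

// Connect with a prefix so that listTables() is scoped to this project.
db.connect({prefix: 'myapp-development'});

db.listTables().exec().then(tables => {
	// Tables belonging to other projects in the same AWS account are filtered out.
	console.log(tables);
});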

Paging

You can implement paging by using the startFrom() method together with the LastEvaluatedKey property returned when using the raw() method.

const result = Employee.find({Organisation: 'Amazon'}).where({Salary: {$gt: 3000}}).limit(1).raw().exec()
	.then(result => {
		/**
		 * {
		 *     "Items": [
		 *         { UserId: '1', FirstName: 'Foo', Name: 'Bar' }
		 *     ],
		 *     "Count": 1,
		 *     "ScannedCount": 1,
		 *     "LastEvaluatedKey": {
		 *         Organisation: 'Amazon',
		 *         UserId: '1'
		 *     }
		 * }
		 */

		// Retrieve the next page
		return Employee.find({Organisation: 'Amazon'}).where({Salary: {$gt: 3000}}).startFrom(result.LastEvaluatedKey).limit(1).raw().exec()
	})
	.then(result => {
		/**
		 * {
		 *     "Items": [
		 *         { UserId: '2', FirstName: 'Unicorn', Name: 'Rainbow' }
		 *     ],
		 *     "Count": 1,
		 *     "ScannedCount": 1,
		 *     "LastEvaluatedKey": {
		 *         Organisation: 'Amazon',
		 *         UserId: '2'
		 *     }
		 * }
		 */
	});

You can also use dynongo-pager to make paging even easier.

Create a table

A table can be created by either calling create() on a table instance or by calling createTable() on the database instance.

The first way is by calling the create() method.

const Employee = db.table('Employee');

const schema = {
	TableName: 'Employee',
	AttributeDefinitions: [
		{ AttributeName: 'id', AttributeType: 'S' }
	],
	KeySchema: [
		{ AttributeName: 'id', KeyType: 'HASH' }
	],
	ProvisionedThroughput: {
		ReadCapacityUnits: 1,
		WriteCapacityUnits: 1
	}
};

Employee.create(schema).exec()
	.then(() => {
		// => Table is being created
	});

The second way is by calling the createTable() method.

db.createTable(schema).exec()
	.then(() => {
		// Table is being created
	});

This is shorthand for the first method.

Awaiting the result

Creating a table can take a while. The previous examples do not wait for the action to be completed. But there might be use cases where you have to wait until the table is created entirely before continuing. This can be done with the wait() method.

db.createTable(schema).wait().exec()
	.then(() => {
		// Table is created
	});

This will make sure the table is polled every 1000 milliseconds until the status of the table is active. If you want to poll at another speed, you can do so by providing the number of milliseconds in the wait() method.

db.createTable(schema).wait(5000).exec();

This will poll the status of the table every 5 seconds instead of every second.

Drop a table

A table can be dropped by either calling drop() on a table instance or by calling dropTable() on the database instance.

The first way is by calling the drop() method.

const Employee = db.table('Employee');

Employee.drop().exec()
	.then(() => {
		// => Table is being dropped
	});

The second way is by calling the dropTable() method.

db.dropTable('Employee').exec()
	.then(() => {
		// => Table is being dropped
	});

This is just shorthand for the first example.

Awaiting the result

Dropping a table can take a while, especially when the table has a lot of data. The previous examples do not wait for the action to be completed. But there might be use cases where you have to wait until the table is removed entirely before continuing. This can be done with the wait() method.

db.dropTable('Employee').wait().exec()
	.then(() => {
		// => Table is dropped
	});

This will make sure the table is polled every 1000 milliseconds until the table no longer exists. If you want to poll at another speed, you can do so by providing the number of milliseconds in the wait() method.

db.dropTable('Employee').wait(5000).exec();

This will poll the status of the table every 5 seconds instead of every second.

Related

License

MIT © Sam Verschueren