
ddb-simple-migrate

v1.1.2

change every item in a dynamodb table

installation

yarn

The integration tests use docker-compose to manage local dynamo tables, so install it first. If you would like to run the integration tests, run this before anything else:

docker-compose build

testing

unit tests:

yarn test

integration tests that serve as examples:

docker-compose run test

usage

important! Before migrating a table, it's critical that the table is placed in On-Demand billing mode (see the AWS docs on billing modes). Trying to balance provisioned capacity and throughput for a table in Provisioned billing mode during a migration like this is painful, and I don't recommend it.

// CommonJS
const { migrate } = require('ddb-simple-migrate')

// ES modules
import { migrate } from 'ddb-simple-migrate'

Basic usage is covered by the examples in test/examples.ts, which demonstrate one or multiple tables, stream or batch mode, and forcing a migration on a provisioned table.

Most of the time, this library will be used in a node script; test/exampleScript.js is a good example of what those scripts generally look like.
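As a hedged sketch of such a script (the table name, attribute names, and endpoint below are placeholders, and the exact call shape should be checked against test/examples.ts), a stream-mode migration might look like:

```javascript
// Hedged sketch of a stream-mode migration script. The table name,
// attribute names, and endpoint are placeholders.

// Only migrate items that still carry the legacy attribute.
const filterCb = (Item) => Item.legacyName !== undefined

// Rename the attribute; returning the changed item writes it back.
const cb = async (Item, counts, batchLog) => {
  Item.name = Item.legacyName
  delete Item.legacyName
  return Item
}

const options = {
  TableName: 'my-table', // placeholder
  filterCb,
  cb,
  mode: 'stream', // the default: cb is called once per item
  dynamoOptions: { region: 'us-east-1', endpoint: 'http://localhost:8000' },
}

// const { migrate } = require('ddb-simple-migrate')
// const dlq = await migrate(options) // the dlq is returned from the operation
```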

parameter information

  • TableName the name of the dynamo table
  • filterCb callback to filter out unneeded items. return true to migrate item
  • cb async callback to change item before writing back to table. is passed the Item, the counts object, and the batchLog function. not called in batch mode. return the changed item.
  • batchCb async callback to change a batch of items. is passed the Items array, the counts object, the batchLog function, the batchWrite function, and the dlq array. only called in batch mode. does not return, is responsible for writing changed Items to the table.
  • scanDelay ms to wait between scan batches, defaults to 0.
  • writeDelay ms to wait between write batches, defaults to 0.
  • mode either "batch" or "stream" (default). "stream" calls "cb" for each item in the table, while "batch" calls "batchCb" for each scan batch, and expects that "batchWrite" is explicitly called.
  • dynamoOptions options that are passed to the dynamo client, consists of:
    • region the AWS region. defaults to 'us-east-1'
    • endpoint for dynamo tables. if not provided, defaults to the AWS default endpoint for "region"
    • accessKeyId the AWS access key id, part of AWS credentials
    • secretAccessKey the AWS secret access key, part of AWS credentials
  • customCounts only valid in "batch" mode. adds a counter to the counts object for each string provided, for keeping track of custom values. they are printed at the end.
  • saveDlq defaults to true. saves dlq to a json file in the current directory, including table name, batch requests, and dynamo error. the dlq is also returned from the operation.
  • quiet defaults to false. when true, silences all log output.
  • force defaults to false. when true, allows migration on provisioned-mode table.
  • asScript defaults to true. most of the time, this will be run as part of a node script, and needs to listen for Ctrl-C to quit.
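In batch mode the library does not write changed items for you: batchCb receives the whole scan batch plus the helpers listed above and must call batchWrite itself. A minimal sketch (the rename logic and the "renamed" counter, set up via customCounts, are illustrative assumptions):

```javascript
// Hedged sketch of a batch-mode callback. The rename logic and the
// 'renamed' counter (initialized via customCounts: ['renamed']) are
// assumptions, not part of the library.
const batchCb = async (Items, counts, batchLog, batchWrite, dlq) => {
  const changed = Items.map((Item) => {
    if (Item.legacyName !== undefined) {
      Item.name = Item.legacyName
      delete Item.legacyName
      counts.renamed += 1
    }
    return Item
  })
  // batch mode does not return items — writes must be made explicitly
  await batchWrite(changed)
}
```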

tips

This library tries to migrate your dynamo table at a single-partition scale, to keep things simple (Partition splits occur at ~1000 WCU or ~3000 RCU, for more information on partitions see the docs).

It scans in batches of 25 items at a time to keep read throughput down. If for whatever reason the read throughput during the migration gets too close to 3000 RCU, stop the migration and introduce a scanDelay, generally between 0 and 50 ms. This is pretty unlikely!

If the write throughput during the migration gets too close to 1000 WCU (which is more likely), stop the migration and start it over with a writeDelay, generally between 0 and 50 ms, experimenting with higher delay times until you find one that keeps the throughput low.

keeping the wcu low

  1. Filter out as many items as possible, migrating only the items that absolutely need it.
  2. Introduce a writeDelay of a few ms, upping that value until the WCU goes down.
  3. If writing to two or more tables, delay the writes to the second table for a few ms.
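Put together, the first two tactics land in the options object; the values below are starting points to experiment from, not recommendations baked into the library (the third tactic, staggering writes to a second table, would presumably live inside your cb or batchCb):

```javascript
// Hedged sketch of throttling options; the delay values, table name,
// and filter attribute are illustrative, not library defaults.
const options = {
  TableName: 'my-table', // placeholder
  // 1. migrate only items that absolutely need it
  filterCb: (Item) => Item.needsMigration === true,
  cb: async (Item) => {
    Item.needsMigration = false
    return Item
  },
  // 2. space out write batches; raise this until the WCU stays low
  writeDelay: 5,
  // reads rarely approach 3000 RCU, so scanDelay can usually stay at 0
  scanDelay: 0,
}
```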