
dynabridge

v0.1.6


Simple and light-weight TypeScript entity-focused wrapper for DynamoDB

Install via npm: npm install dynabridge

Who is this intended for?

You have a full-stack web application or some other Node server written in TypeScript, and you're kind of abusing DynamoDB as your relational database? Are you storing your entities in multiple tables, even though Alex DeBrie has told you over and over and over again not to do it?

After attending multiple re:Invents, watching every YouTube video on Single Table Design, and surviving a two-week bootcamp on how to properly overload Global Secondary Indexes, you might finally be able to implement a simple to-do application using the DynamoDB way. But the moment you need to add a feature with a new access pattern or explain it all to a colleague, it feels like you’re on the verge of a nervous breakdown.

In the end, most of us know that DynamoDB is just not the right tool for our use case (especially when requirements and access patterns change), but it's just so simple and dirt cheap - especially when building a serverless application using Lambda.

This library is here to ease the pain of abusing DynamoDB as a "relational database". It won’t make it right, but it might make it a bit less painful by bridging the gap.

What's the difference from other DynamoDB wrappers or ORMs?

There are plenty of other ORMs and wrappers for DynamoDB out there. Many of them seem abandoned, lack traction, or just feel overly complex. There are definitely useful libraries out there, but they have a different goal in mind. The aim of this library is to keep things as simple as possible, with a minimal footprint, while still providing all the essential features you need for your CRUD operations.

Key features and selling points

  • No use of decorators - keeping your types and interfaces clean
  • Extremely easy to use
  • Type-safe CRUD for your data model
  • On-the-fly migrations

Developing with DynaBridge

A typical data model may look like this

// src/domain/types.ts
export interface Company {
  id: string;
  name: string;
}

export interface Employee {
  companyId: string;
  employeeNumber: number;
  firstName: string;
  lastName: string;
}

Setting up DynaBridge should be straightforward

// src/repository/index.ts
import { DynaBridge, DynaBridgeEntity } from 'dynabridge';
import { Company, Employee } from '../domain/types';

const companyEntity: DynaBridgeEntity<Company> = {
  tableName: 'company',
  id: 'id'
};

const employeeEntity: DynaBridgeEntity<Employee> = {
  tableName: 'employee',
  id: ['companyId', 'employeeNumber']
};

export const db = new DynaBridge({
  company: companyEntity,
  employee: employeeEntity
});

Note: The ID of Company is id, which means the DynamoDB table needs to be configured with a hash key named id of type string. An Employee, on the other hand, is identified by the combination of its companyId and employeeNumber, so its DynamoDB table needs a hash key (companyId, type string) and a range key (employeeNumber, type number). Setting up and deploying the DynamoDB tables is outside DynaBridge's scope and is preferably done with IaC tooling.

Using the client in your application code

// ./src/index.ts
import { db } from './repository';
import { Company, Employee } from './domain/types';

const someCompany1: Company = {
  id: 'c1',
  name: 'Test company 1'
};

const someCompany2: Company = {
  id: 'c2',
  name: 'Test company 2'
};

const someEmployee1: Employee = {
  companyId: 'c1',
  employeeNumber: 1,
  firstName: 'John',
  lastName: 'Doe'
};

const someEmployee2: Employee = {
  companyId: 'c1',
  employeeNumber: 2,
  firstName: 'Foo',
  lastName: 'Bar'
};

Write single entity

await db.entities.company.save(someCompany1);
await db.entities.employee.save(someEmployee1);

Write multiple entities

await db.entities.company.saveBatch([someCompany1, someCompany2]);
await db.entities.employee.saveBatch([someEmployee1, someEmployee2]);

Fetch entity by id

const company: Company | undefined = await db.entities.company.findById('c1');
const employee: Employee | undefined = await db.entities.employee.findById(['c1', 1]);

Fetch multiple entities by id

const companies: Company[] = await db.entities.company.findByIds(['c1', 'c2']);
const employees: Employee[] = await db.entities.employee.findByIds([['c1', 1], ['c1', 2]]);

Fetch all entities

const allCompanies: Company[] = await db.entities.company.findAll();
const allEmployees: Employee[] = await db.entities.employee.findAll();

Delete entity

await db.entities.company.delete(someCompany1);
await db.entities.employee.delete(someEmployee1);

Delete multiple entities

await db.entities.company.deleteBatch([someCompany1, someCompany2]);
await db.entities.employee.deleteBatch([someEmployee1, someEmployee2]);

Delete entity by id

await db.entities.company.deleteById('c1');
await db.entities.employee.deleteById(['c1', 1]);

Delete multiple entities by id

await db.entities.company.deleteByIds(['c1', 'c2']);
await db.entities.employee.deleteByIds([['c1', 1], ['c1', 2]]);

Transaction

await db.transaction([
  { action: 'Put', type: 'company', entity: someCompany1 },
  { action: 'Put', type: 'employee', entity: someEmployee1 },
  {
    action: 'Update',
    type: 'employee',
    entity: someEmployee2,
    updateExpression: 'SET #firstName = :newName',
    expressionAttributeValues: { ':newName': 'Charlie' }
  },
  { action: 'Delete', type: 'company', entity: someCompany2 }
]);

Schema migrations

One major pain point with DynamoDB and with NoSQL in general is schema versioning and migration.

An Employee entity is written to the table today. A few days later, new feature requirements mandate that employees must have a mandatory role field. To accommodate this, the Employee type and the application are updated accordingly. However, when loading an existing employee record written before this change, the role field will be missing. Attempting to access this field without proper handling can lead to unexpected behavior, such as application crashes, inconsistent data processing or inaccurate presentation.
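A minimal sketch of that failure mode (the types and values here are illustrative, not part of DynaBridge):

```typescript
// The application's current Employee type mandates a role field.
interface Employee {
  companyId: string;
  employeeNumber: number;
  firstName: string;
  lastName: string;
  role: string; // added after some records were already written
}

// An old item loaded from DynamoDB predates the role field.
// The cast compiles fine but hides the missing attribute.
const oldRecord = {
  companyId: 'c1',
  employeeNumber: 1,
  firstName: 'John',
  lastName: 'Doe'
} as Employee;

// TypeScript believes role is a string, but at runtime it is undefined:
// oldRecord.role.toUpperCase() would throw.
console.log(oldRecord.role); // undefined
```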

In relational databases, schema migrations are commonly used to handle such changes - there are plenty of tools for it (Liquibase, Flyway) that migrate all existing data, ensuring that every entity adheres to the latest schema. With NoSQL, by design, schema migrations are hard.

On-the-fly migrations

DynaBridge addresses this challenge by applying on-the-fly migrations.

When entities are written to the database, they are stored with their current version (starting at 1). When reading these entities, DynaBridge compares them to the latest schema version and, if necessary, applies the corresponding migration functions. This ensures that the application always works with entities that conform to the current schema, maintaining consistency and preventing the issues mentioned above. When a migrated entity is saved back to the database, its version is updated to the latest version. This guarantees that changes to the entity are properly stored and ensures that migration functions will not need to be applied again when the entity is loaded at a later time.

On-the-fly migrations are simple, resource-efficient, and ideal when there are no downstream processes that depend on the database always containing the latest schema.
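Conceptually, the read path can be pictured like this - a hypothetical sketch, where applyMigrations and its internals are illustrative and not DynaBridge's actual source:

```typescript
// Each migration function transforms an entity from version n to n + 1.
type Migration = (entity: any) => any;

// A stored item at _version 1 with one registered migration still needs
// migrations[0] applied before the application sees it.
function applyMigrations(item: { _version: number }, migrations: Migration[]): any {
  let entity: any = item;
  for (let v = item._version - 1; v < migrations.length; v++) {
    entity = migrations[v](entity);
  }
  return entity;
}

const stored = { _version: 1, firstName: 'John', lastName: 'Doe' };
const migrations: Migration[] = [(v1) => ({ ...v1, role: 'Other' })];

const migrated = applyMigrations(stored, migrations);
// migrated now conforms to the latest schema and includes role: 'Other'
```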

Example

Current item in DynamoDB table employee

{ "companyId": "c1", "employeeNumber": 1, "firstName": "John", "lastName": "Doe", "_version": 1, "_updated_at": "..." }

Updated Employee type

// src/domain/types.ts
type EmployeeRole = 'Manager' | 'Sales' | 'Developer' | 'HR' | 'Other';

interface Employee {
  companyId: string;
  employeeNumber: number;
  firstName: string;
  lastName: string;
  role: EmployeeRole;
}

Updated DynaBridge Employee entity

// src/repository/index.ts
import { DynaBridgeEntity } from 'dynabridge';
import { Employee } from '../domain/types';

/* It is recommended to keep a "hard" copy of the schema versions for type-safety. 
   One could possibly use things like intersection types, Omit or Partial, but this
   will not always work and makes reasoning about the different schemas harder. */
interface EmployeeV1 {
  companyId: string;
  employeeNumber: number;
  firstName: string;
  lastName: string;
}

export const employeeEntity: DynaBridgeEntity<Employee> = {
  tableName: 'employee',
  id: ['companyId', 'employeeNumber'],
  migrations: [
    (v1: EmployeeV1) => ({ ...v1, role: 'Other' })
  ]
}

When fetching the item using the .findById, .findByIds or .findAll API, the result would be

const employee: Employee = await db.entities.employee.findById(['c1', 1]);
console.log(employee) // { companyId: "c1", employeeNumber: 1, firstName: "John", lastName: "Doe", role: "Other" }

Saving the entity using the .save, .saveBatch or .transaction API will overwrite the existing item in the table with the migrated version

await db.entities.employee.save(employee);

Updated item in DynamoDB table employee

{ "companyId": "c1", "employeeNumber": 1, "firstName": "John", "lastName": "Doe", "role": "Other", "_version": 2, "_updated_at": "..." }

DynaBridge API details

The DynaBridge API uses the following DynamoDB SDK clients and commands

  • .save
    • Uses DynamoDBDocumentClient and PutCommand
  • .saveBatch
    • Uses DynamoDBDocumentClient and BatchWriteCommand
    • UnprocessedItems retries = 3
    • batch_size = 100
  • .findById
    • Uses DynamoDBClient and GetItemCommand
  • .findByIds:
    • Uses DynamoDBClient and BatchGetItemCommand
    • UnprocessedKeys retries = 3
    • batch_size = 100
    • All requested items will be returned (no pagination)
  • .findAll:
    • Uses DynamoDBDocumentClient and ScanCommand
    • Sequential scan (TotalSegments = 1)
    • All items will be returned (no pagination)
  • .delete and .deleteById
    • Uses DynamoDBClient and DeleteItemCommand
  • .deleteBatch and .deleteByIds
    • Uses DynamoDBClient and DeleteItemCommand
    • UnprocessedItems retries = 3
    • batch_size = 100
  • .transaction
    • Uses DynamoDBDocumentClient and TransactWriteCommand
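The batching behavior listed above (chunks of 100, up to 3 retries of unprocessed items) roughly corresponds to the following pattern - a hypothetical sketch, with sendBatch standing in for the actual BatchWriteCommand call:

```typescript
// Split items into chunks and retry whatever DynamoDB reports as unprocessed.
// Not DynaBridge's actual source; just the pattern the list above describes.
const BATCH_SIZE = 100;
const MAX_RETRIES = 3;

function chunk<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// sendBatch models one BatchWriteCommand round-trip: it returns the
// items DynamoDB did not process (UnprocessedItems).
async function writeBatchWithRetries<T>(
  items: T[],
  sendBatch: (batch: T[]) => Promise<T[]>
): Promise<void> {
  for (const batch of chunk(items, BATCH_SIZE)) {
    let pending = batch;
    // One initial attempt plus up to MAX_RETRIES retries per chunk.
    for (let attempt = 0; pending.length > 0 && attempt <= MAX_RETRIES; attempt++) {
      pending = await sendBatch(pending);
    }
    if (pending.length > 0) {
      throw new Error(`${pending.length} items unprocessed after ${MAX_RETRIES} retries`);
    }
  }
}
```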