
@bespoken-api/data-access v1.0.2 (5 downloads)

@bespoken-api/data-access

Overview

This package contains the data access layer for the Bespoken API. It is responsible for all interactions with bespoken data stores.

Data sources

  • MySQL 8 database server: mysql.bespoken.io (refer to Batch Tester MySQL for credentials)
  • Firebase, including libraries for:
    • admin
    • client
  • MongoDB

Usage

This package runs in isolation and requires no configuration beyond setting the environment it runs in.

| Environment | Value | Description |
| ----------- | ----------- | ----------- |
| DA_ENV | dev or prod | Indicates the environment to connect to. Determines whether the encrypted-dev.env or encrypted-prod.env file is used. |

Note: all env vars needed at runtime are stored in the repository (encrypted using SOPS and AGE).

For reference, the env vars and their sources are listed below:

| Environment | Description |
| ----------- | ----------- |
| DA_ENV | Must be set in the project where the package is used. Indicates the environment to connect to; determines whether the encrypted-dev.env or encrypted-prod.env file is used. |
| DA_PRISMA_BESPOKEN_DB | URL to connect to the MySQL database (mysql.bespoken.io). |
| DA_PRISMA_BESPOKEN_DB_SHADOW | URL to connect to the shadow MySQL database. Used only in development, when migrations are calculated. |
| DA_FIREBASE_KEY | Firebase admin API configuration. Grants access to manipulate data and create accounts. |
| DA_FIREBASE_EMAIL | Firebase admin API configuration. Grants access to manipulate data and create accounts. |
| DA_FIREBASE_PROJECT | Firebase admin API configuration. Grants access to manipulate data and create accounts. |
| DA_FIREBASE_URL | Firebase admin API configuration. Grants access to manipulate data and create accounts. |
| DA_FIREBASE_CLIENT_API_KEY | Firebase client API configuration. Used to validate Firebase tokens (JWT authentication when a user logs in to the dashboard). |
| DA_FIREBASE_CLIENT_AUTH_DOMAIN | Firebase client API configuration. Used to validate Firebase tokens. |
| DA_FIREBASE_CLIENT_DATABASE_URL | Firebase client API configuration. Used to validate Firebase tokens. |
| DA_FIREBASE_CLIENT_MESSAGING_SENDER_ID | Firebase client API configuration. Used to validate Firebase tokens. |
| DA_FIREBASE_CLIENT_STORAGE_BUCKET | Firebase client API configuration. Used to validate Firebase tokens. |
| DA_FIREBASE_CLIENT_TOKEN | Firebase client API configuration. Used to validate Firebase tokens. |
| DA_MONGO_URL | Connection URL for MongoDB. |
| DA_GITHUB_ACCESS_TOKEN | Connects to the GitHub API to manage test suite files. |
| DA_GITHUB_ORGANIZATION | Connects to the GitHub API to manage test suite files. |
| DA_JWT_PRIVATE_KEY_BASE64 | JWT key for authentication, encoded in base64. Supports internal-api authentication. |
| DA_INTUIT_CLIENT_ID | QuickBooks API credentials. |
| DA_INTUIT_CLIENT_SECRET | QuickBooks API credentials. |
| DA_INTUIT_ENVIRONMENT | QuickBooks API credentials. |
| DA_INTUIT_REDIRECTURI | QuickBooks API credentials. |
| DA_INTUIT_COMPANY_ID | QuickBooks API credentials. |
| DA_DYNAMODB_REGION | AWS DynamoDB configuration. A single database serves dev and prod; the origin field differentiates environments. |
| DA_DYNAMODB_ACCESS_KEY_ID | AWS DynamoDB configuration. |
| DA_DYNAMODB_SECRET_ACCESS_KEY | AWS DynamoDB configuration. |
| DA_VIRTUALDEVICE_ALEXA_OAUTH_URL | Alexa configuration for creating access tokens when a virtual device is created (for bots; console: https://developer.amazon.com/alexa/console/avs/products/AlexaBot/details/info). |
| DA_VIRTUALDEVICE_ALEXA_AVS_CLIENT_ID | Alexa virtual-device configuration (for bots). |
| DA_VIRTUALDEVICE_ALEXA_AVS_CLIENT_SECRET | Alexa virtual-device configuration (for bots). |
| DA_VIRTUALDEVICE_ALEXA_AVS_PRODUCT_ID | Alexa virtual-device configuration (for bots). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_OAUTH_URL | Alexa configuration for creating access tokens when a virtual device is created (for music devices; console: https://developer.amazon.com/alexa/console/avs/products/VirtualDeviceMusic/details/info). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_AVS_CLIENT_ID | Alexa virtual-device configuration (for music devices). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_AVS_CLIENT_SECRET | Alexa virtual-device configuration (for music devices). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_AVS_PRODUCT_ID | Alexa virtual-device configuration (for music devices). |
| DA_VIRTUALDEVICE_GOOGLE_OAUTH_URL | Google configuration for creating access tokens when a virtual device is created (console: https://console.cloud.google.com/apis/credentials/oauthclient/969501293302-726fr1b001sg3lg1ouk4u3l0m77h3skh.apps.googleusercontent.com?authuser=2&project=silent-echo). |
| DA_VIRTUALDEVICE_GOOGLE_ASSISTANT_CLIENT_ID | Google virtual-device configuration. |
| DA_VIRTUALDEVICE_GOOGLE_ASSISTANT_CLIENT_SECRET | Google virtual-device configuration. |
| DA_VIRTUALDEVICE_CRYPT_KEY_REFESH_TOKEN | Google virtual-device configuration. |
| DA_VIRTUALDEVICE_JWT_STATE_BASE64 | Google virtual-device configuration. |

Example of usage

// 1. Set the environment where it will run
export DA_ENV=dev

// 2. Add the package to your project
pnpm add @bespoken-api/data-access

// 3. Import a DAO class from the package and run it
const { AppSettingsDao } = require("@bespoken-api/data-access")
const dao = new AppSettingsDao()
const dto = await dao.readAppSettings('dashboard')
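Since `await` is not available at the top level of a CommonJS module, the call above would typically be wrapped in an async function. The sketch below shows that pattern; `AppSettingsDao` is stubbed here because `@bespoken-api/data-access` is a private package, so the return value is illustrative only.

```javascript
// Stub standing in for the real DAO, which queries the Bespoken data stores.
class AppSettingsDao {
  async readAppSettings(app) {
    // placeholder result; the real DAO returns the stored settings record
    return { app, name: 'settings', content: {} }
  }
}

// Wrap the async call in an IIFE so it works in CommonJS.
;(async () => {
  const dao = new AppSettingsDao()
  const dto = await dao.readAppSettings('dashboard')
  console.log(dto.app) // prints "dashboard"
})()
```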

Development

Adding new env vars

To add a new env var, decrypt the file into .env, copy the .env content, replace the content of the corresponding encrypted-*.env file with it, and then encrypt that file again.

Use the prefix DA_ for all vars.

:exclamation: BE CAREFUL NOT TO ENCRYPT THE FILE TWICE BY MISTAKE.

This requires SOPS and AGE to be installed on your machine, along with the public and private keys to encrypt/decrypt the files. See: README.md

There is a file for each environment encrypted-dev.env and encrypted-prod.env. To add a new env var, you need to add it to each of the encrypted files.

You can use the following steps to add a new env var:

  1. Decrypt one of the files, for example encrypted-dev.env (cwd: data-access folder)
pnpm -w run sops:decrypt --file=./encrypted-dev.env
  2. Open the .env file and copy its content (it will be used to replace the content of encrypted-dev.env)
## Prisma
DA_ENV="dev"
DA_OTHER_VAR="other value"
(...)
  3. Replace the content of encrypted-dev.env with the content of .env and add the new env var
## Prisma
DA_ENV="dev"
DA_OTHER_VAR="other value"
(...)
## New group
DA_NEW_VAR="new value"
  4. Encrypt the file again
pnpm -w run sops:encrypt --file=./encrypted-dev.env
  5. Repeat steps 1 to 4 for encrypted-prod.env

Reading env vars from code

Env vars are read by the sopsConfig function. It is called once, from the package's index.js file, and sets the values on the process.env object. It loads either the encrypted-dev.env or the encrypted-prod.env file depending on the value of the DA_ENV env var.
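The file-selection rule can be sketched as follows; `envFileFor` is a hypothetical helper for illustration, not part of the package (the real logic lives inside sopsConfig):

```javascript
// DA_ENV chooses which encrypted env file sopsConfig loads.
function envFileFor(daEnv) {
  return daEnv === 'prod' ? 'encrypted-prod.env' : 'encrypted-dev.env'
}

console.log(envFileFor('dev'))  // prints "encrypted-dev.env"
console.log(envFileFor('prod')) // prints "encrypted-prod.env"
```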

Then expose the new variable through the config class in /libs/utils/configuration.js:

(...)
const config = new class {
  _cache = {}

  get DA_ENV() { return process.env.DA_ENV }
  (...)
  
  // Add the new variable
  get DA_NEW_VAR() { return process.env.DA_NEW_VAR }

  // If you need a multi-line env var, encode it in base64 and decode it in this class
  get DA_JWT_PRIVATE_KEY() {
    if (isUndefined(this._cache?.DA_JWT_PRIVATE_KEY)) {
      this._cache.DA_JWT_PRIVATE_KEY = Buffer.from(toString(process?.env?.DA_JWT_PRIVATE_KEY_BASE64), 'base64').toString('utf-8')
    }
    return this._cache?.DA_JWT_PRIVATE_KEY
  }

}
(...)
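The base64 round-trip used for multi-line vars like DA_JWT_PRIVATE_KEY_BASE64 can be illustrated in isolation; the key value below is a placeholder, not a real secret:

```javascript
// A multi-line value that would break a plain env file.
const multiLineKey = '-----BEGIN KEY-----\nabc123\n-----END KEY-----'

// Encode once, when the value is stored in the encrypted env file.
const encoded = Buffer.from(multiLineKey, 'utf-8').toString('base64')

// Decode at read time, as the DA_JWT_PRIVATE_KEY getter does.
const decoded = Buffer.from(encoded, 'base64').toString('utf-8')

console.log(decoded === multiLineKey) // prints true
```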

Then to use it in code, import the config object and use it as follows:

const { config: { DA_NEW_VAR } } = require('../utils/configuration')

console.log(DA_NEW_VAR)

Adding new data sources

New data sources should be added under the /libs/datasources/ folder.

Adding new DAOs

New DAOs should be added under the /libs/daos/ folder.

Develop database schema modifications

Read the Prisma documentation for more information about Prisma Migrations.

Prisma requires the client library to be prebuilt before execution. This happens automatically when the prisma:migrate or prisma:draft commands run, and it can be done manually with prisma:generate. The prebuilt library is specific to the platform it runs on; the target platforms are configured in the prisma/schema.prisma file under client.binaryTargets. Those files should be committed to git.
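For illustration, a binaryTargets setting in schema.prisma might look like the fragment below; the target values are examples, not necessarily what this repository uses:

```prisma
generator client {
  provider      = "prisma-client-js"
  // Prebuild the client for each platform it will run on;
  // "native" covers the machine running the generate command.
  binaryTargets = ["native", "debian-openssl-1.1.x"]
}
```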

The following commands are provided for development and publishing purposes. EACH COMMAND AUTOMATICALLY DECRYPTS THE encrypted-{env}.env FILE REQUIRED FOR ITS EXECUTION.

Create and run a migration for development purposes

pnpm --filter=@bespoken-api/data-access run prisma:migrate

Create a customized migration

pnpm --filter=@bespoken-api/data-access run prisma:draft

Deploy changes in production

pnpm --filter=@bespoken-api/data-access run prisma:draft

Generate client library

pnpm --filter=@bespoken-api/data-access run prisma:generate

Usage examples

Add a new column
  1. Modify the schema.prisma file, adding the new column (set it nullable or give it a default value)
(...)
model AppSetting {
  id      Int    @id @default(autoincrement())
  app     String
  name    String
  content Json
  // new column
  new_column   String?

  @@map("app_settings")
}
(...)
  2. Run the prisma:migrate command

Remember: changes are applied to the bespoken_dev database.

The command will ask for the migration name. Set a descriptive name for it and commit the changes to git.

pnpm --filter=@bespoken-api/data-access run prisma:migrate
Add a new column from values in another column
  1. Modify the schema.prisma file, adding the new column (set it nullable or give it a default value)
(...)
model AppSetting {
  id      Int    @id @default(autoincrement())
  app     String
  name    String
  content Json
  // new column
  new_column   String?

  @@map("app_settings")
}
(...)
  2. Run the prisma:migrate command

Remember: changes are applied to the bespoken_dev database.

The command will ask for the migration name. Set a descriptive name for it.

pnpm --filter=@bespoken-api/data-access run prisma:migrate
  3. Create a custom migration to copy the data

The command will ask for the migration name. Set a descriptive name for it.

pnpm --filter=@bespoken-api/data-access run prisma:draft
  4. Edit the migration file created in the previous step (find the new migration file in the prisma/migrations folder)
-- This is an empty migration.
UPDATE app_settings
SET new_column = name;
  5. Run the migration
pnpm --filter=@bespoken-api/data-access run prisma:migrate
  6. Remove the old column from the schema
(...)
model AppSetting {
  id      Int    @id @default(autoincrement())
  app     String
  content Json
  // new column
  new_column   String?

  @@map("app_settings")
}
(...)
  7. Run the migration
pnpm --filter=@bespoken-api/data-access run prisma:migrate

Other commands for development purposes are provided.

MySql using Prisma

The package uses Prisma to interact with the MySQL database. Prisma is an ORM that generates a client library from the database schema; that client library is used to interact with the database.

NPM Scripts

Generate Client Library

This command generates the native client library based on the Prisma configuration. It runs automatically when the push or migrate commands are executed.

pnpm --filter=@bespoken-api/data-access run prisma:generate