piper-utils
v1.0.45
utility library for piper
Piper-Utils
This repository contains a collection of utilities for working with Cognito, Lambda, Sequelize, and the Serverless Framework. It provides easy-to-use utility functions that speed up API development by bridging gaps between several curated frameworks. These frameworks include:
- Serverless Framework (Cognito, API Gateway, Lambda)
- Sequelize
- Joi
Installation
Install via an npm Bitbucket link in package.json:
"piper-utils": "git+ssh://git@bitbucket.org/kevinlbatchelor/piper-utils.git",
Add an SSH key to the repository so npm install can authenticate.
Publish
npm run build
- Tag and commit
Test
npm run test
accessRightsUtils
The access rights functions provide a simple way to manage row-level multi-tenancy security by following a few conventions. The first convention is a businessId field: by adding this field to every table in a database, the access rights utils ensure each API call filters data down to the businessIds the user has access to, by comparing the incoming JWT claims and applying filters to the queries.
- Requires a custom attribute "AR" (access rights) added to the Cognito User Pool
- Access rights should be stored in the Cognito custom attribute field in this format:
{ businessIds: { '3VLvfHjk9V': 'A' } }
- By convention, three access rights are supported per businessId: "A" (admin), "W" (write), and "R" (read)
- accessRightsUtils takes an API Gateway Lambda event object and returns the intersection of requested businessIds and allowed businessIds
- If Cognito is set up correctly, the custom:AR property is automatically included in the event
- Ensure custom:AR is mutable and that read and write are enabled in the App Client Settings on the User Pool
const businessIds = accessRightsUtils(event);
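The intersection described above can be sketched as a plain function (illustrative only, not the library source; the claim location follows the standard API Gateway Cognito authorizer event shape):

```javascript
// Sketch: intersect the custom:AR claim with requested businessIds.
function intersectBusinessIds(event, requestedIds) {
    // Cognito places custom attributes in the authorizer claims.
    const claims = event.requestContext.authorizer.claims;
    const accessRights = JSON.parse(claims['custom:AR']); // { businessIds: { id: 'A' | 'W' | 'R' } }
    const allowed = Object.keys(accessRights.businessIds);
    return requestedIds.filter((id) => allowed.includes(id));
}

const event = {
    requestContext: {
        authorizer: {
            claims: { 'custom:AR': JSON.stringify({ businessIds: { '3VLvfHjk9V': 'A' } }) }
        }
    }
};
console.log(intersectBusinessIds(event, ['3VLvfHjk9V', 'otherId'])); // [ '3VLvfHjk9V' ]
```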
parseBody
- Takes the incoming Lambda event and returns the post body as a JSON object
const eventBody = parseBody(event);
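A minimal sketch of the behavior (assumption: the body arrives as a JSON string on event.body, per the standard API Gateway proxy event; parseBodySketch is an illustrative name, not the library API):

```javascript
// Sketch only; the real parseBody lives in piper-utils.
function parseBodySketch(event) {
    return typeof event.body === 'string' ? JSON.parse(event.body) : event.body;
}

const eventBody = parseBodySketch({ body: '{"title":"Widget","price":100}' });
console.log(eventBody.title); // Widget
```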
getCurrentUser
- Takes the event and returns an object with the current user's id and email address from Cognito
- Requires a custom Cognito field UID (user ID) that is mutable and set to read and write in App Client Settings
const user = getCurrentUser(event);
// user = { id: '123', email: 'user@example.com' }
checkWriteAccess
- Takes the event, looks for a businessId property in the event body, and returns the businessId if the user has write access to it. If the user does not have write access, it throws an exception.
const businessId = checkWriteAccess(event);
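Because checkWriteAccess throws, the underlying rule can be sketched against the access-rights convention above (assumptions: "A" and "W" both grant write; checkWriteAccessSketch is an illustrative name, not the library API):

```javascript
// Illustrative sketch of the write-access rule, not the library source.
function checkWriteAccessSketch(accessRights, businessId) {
    const right = accessRights.businessIds[businessId];
    if (right !== 'A' && right !== 'W') {
        throw new Error(`No write access to businessId ${businessId}`);
    }
    return businessId;
}

const rights = { businessIds: { '3VLvfHjk9V': 'A', abc123: 'R' } };
console.log(checkWriteAccessSketch(rights, '3VLvfHjk9V')); // 3VLvfHjk9V
// checkWriteAccessSketch(rights, 'abc123') would throw: read-only access
```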
createFilters
createFilters automatically generates a Sequelize where clause from an HTTP query string. It takes the event and a piper-utils filter object.
- Query params are already parsed by Serverless/API Gateway and automatically included in the Lambda event
- Create the piper-utils filter object by passing a Sequelize model to the defaultFilters function
const filterObject = defaultFilters(sequelizeModelDef)
const where = createFilters(event, filterObject);
- Most properties in your Sequelize model are automatically available as filters in your HTTP query string, and a few additional options are added by convention. For example, the following URL query strings (or any combination of them) are all available when using createFilters and createSort:
String, Boolean, and Number fields in your model all get sensible defaults, which can be overridden.
https://someapi.com?title=value&description=value&price=100&active=true
Query for a value on a sub-model of a Sequelize model by including the model name and the field separated by a dot. Note that the sub-model schema must be included in the defaultFilters arguments:
https://someapi.com?customer.name=jim
const filter = defaultFilters(model, customerSchema);
const where = createFilters(event, filter);
If a comma-separated list is the value of a string-type field, the values are filtered using SQL "IN" logic:
https://someapi.com?description=good,bad,ugly
Sequelize automatically adds date fields to tables, so a date range filter is automatically created for the createdAt field:
https://someapi.com?startDate=2024-02-14T07:00:00.000Z&endDate=2024-02-16T06:59:59.999
JSONB field types can be filtered by passing appropriately escaped JSON. Below, the uoms field is a JSONB array and can be filtered in the URL as follows:
https://someapi.com?&uoms=\[{"name":"EA","conversion":1}\]
Sorting is handled by specifying the model field to sort by:
https://someapi.com?&sort=updatedAt
Prefix a dash to specify a descending sort
https://someapi.com?&sort=-updatedAt
Piper-Utils assumes a multi-tenancy field on every model and automatically adds an IN filter for businessIds:
https://someapi.com?businessIds=abc,def,efg
A "searchStrings" query property applies an OR search across all string fields on a model, unless an override is applied via the defaultFilters function:
https://someapi.com?searchStrings=someStringToFind
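The comma-list and dash-prefix conventions above can be sketched as plain functions (illustrative only; the real createFilters builds Sequelize Op expressions, represented here as plain objects):

```javascript
// Sketch of two conventions: comma lists become IN filters,
// and a '-' prefix on sort means descending. Not the library code.
function sketchFilters(queryStringParameters) {
    const where = {};
    for (const [key, value] of Object.entries(queryStringParameters)) {
        if (key === 'sort') continue;
        // Comma-separated values -> SQL IN semantics
        where[key] = value.includes(',') ? { in: value.split(',') } : value;
    }
    return where;
}

function sketchSort({ sort }) {
    if (!sort) return [];
    return sort.startsWith('-') ? [[sort.slice(1), 'DESC']] : [[sort, 'ASC']];
}

console.log(sketchFilters({ description: 'good,bad,ugly', price: '100' }));
// { description: { in: [ 'good', 'bad', 'ugly' ] }, price: '100' }
console.log(sketchSort({ sort: '-updatedAt' })); // [ [ 'updatedAt', 'DESC' ] ]
```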
Here is an example using all of these functions with Sequelize in a Serverless Lambda function:
export const productModel = {
    title: { type: db.STRING, allowNull: false },
    description: { type: db.TEXT('long') },
    price: { type: db.DECIMAL, defaultValue: 0 },
    active: { type: db.BOOLEAN, defaultValue: true }
};

const filter = defaultFilters({
    ...productModel,
    id: { type: db.INTEGER, filterType: db.Op.in }
}, subModel);

export async function getProducts(event) {
    try {
        checkModule(moduleName, event);
        const queryStringParameters = event.queryStringParameters;
        const where = createFilters(event, filter);
        const order = createSort(event, productSort);
        const limit = parseInt(_.get(queryStringParameters, 'limit', '10'));
        const offset = parseInt(_.get(queryStringParameters, 'offset', '0'));
        const product = await Product.findAll({
            where,
            order,
            offset,
            limit
        });
        return success(product, { dbClose });
    } catch (err) {
        return failure(err, { dbClose });
    }
}
createSort
- Takes the event and a sort definition object and creates a Sequelize order with the applied sort
const order = createSort(event, partSort);
createIncludes
- Takes the event and an includes definition object and creates Sequelize includes from query string params
const includes = createIncludes(event, partIncludes);
findAll
- Takes a Sequelize model and options, wraps Sequelize findAll in a try/catch, applies default options and accessRightsUtils, and returns the results with pagination
const part = await findAll(Part, {
    where,
    order,
    offset,
    limit
});
success
- Takes a JSON object and options. Creates the correct HTTP response object and status code, returns it with the JSON body, and closes the database connection if it is open.
const response = success(part, { dbClose });
return success(part, { dbClose });
//response
{
    statusCode: 200,
    headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Credentials': true
    },
    body: // JSON BODY
}
successHtml
- Return HTML instead of JSON
successHtml(`
    <html>
        <script>setTimeout(() => { window.close(); }, 1000)</script>
        <div>Connection Success</div>
    </html>`,
    { dbClose });
failure
- Takes an error object and options. Creates the correct HTTP response object, sets the correct error message and status code, and closes the database connection if it is open. Logs the error to the console and normalizes Sequelize, Axios, and other errors into a readable format.
- failure also logs and returns errors correctly when Joi async validation exceptions are thrown
const response = failure(err, { dbClose });
return failure(error, { dbClose });
//error
{
    statusCode: 500, errorCode: '5XX', message: ''
}
//response
{
    statusCode: 500,
    headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Credentials': true
    },
    body: '{"statusCode":500,"errorCode":"5XX","message":""}'
}
S3 Bucket Watch
- Usage: The watchBucket function listens for events in an S3 bucket and processes the files using a specified transformer function. It supports both SNS and direct S3 event triggers, and handles errors by moving files to specific directories based on failure history.
import { watchBucket } from 's3-bucket-watcher';
async function transformerFunction(fileContent) {
    // Custom processing logic for the file content
    console.log('Processing file:', fileContent);
    const transformedContent = fileContent; // replace with real transformation
    return transformedContent;
}

async function errorHandler(err, filePath) {
    // Custom error handling logic
    console.error('Error processing file:', filePath, err);
    return true; // Return true to continue processing; false to stop
}

const event = {
    // Example event object structure from S3 or SNS
};

const options = {
    dynamoConfigTable: 'ConfigTable',
    dynamoConfigKey: 'configKey',
    s3Bucket: 'your-s3-bucket-name',
    snsTopic: 'your-sns-topic',
    transformer: transformerFunction,
    errorHandlerPerFile: errorHandler,
    shouldSkipFailedFolders: true,
    userImportTypes: {
        type1: 'type1',
        type2: 'type2'
    }
};

try {
    await watchBucket({ event, ...options });
} catch (error) {
    console.error('Failed to process bucket events:', error);
}
Parameters
watchBucket(params)
- params (object): The configuration object containing the following properties:
  - event (object): The Lambda event object, containing either SNS or S3 event details.
  - dynamoConfigTable (string): The DynamoDB table name where configuration data (e.g., chunk size, max message size) is stored.
  - dynamoConfigKey (string): The key to retrieve specific configuration settings from the DynamoDB table.
  - s3Bucket (string): The name of the S3 bucket to watch for file uploads.
  - snsTopic (string): The SNS topic to which events are published.
  - transformer (function): A function to transform the content of each file retrieved from S3.
  - errorHandlerPerFile (function, optional): A function to handle errors on a per-file basis during processing.
  - shouldSkipFailedFolders (boolean, optional): Flag indicating whether to skip retrying files that have already failed twice, moving them directly to the error folder.
  - userImportTypes (object, optional): A mapping of user import subdirectories for categorizing and processing files differently based on their directory.
Event Handling
The watchBucket function distinguishes between different types of AWS events and processes them accordingly:
SNS Events:
- The function processes SNS-triggered events and applies the transformer function to files listed in the SNS message.
S3 Events:
- If the event originates directly from S3 and the object key starts with "DIRECT", it handles the file as a direct S3 event.
- If the file key indicates a failure (e.g., failed-once, failed-twice, error), the function adjusts the path and moves the file to the appropriate error directory.
Cron Job Execution:
- If the function is triggered by a cron job (not from SNS or S3), it publishes events to the specified SNS topic.
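The branching between these event sources can be sketched as follows (event shapes follow the standard AWS Lambda SNS and S3 record payloads; the real watchBucket logic may differ in detail):

```javascript
// Sketch: classify the incoming Lambda event by its source.
function eventSource(event) {
    const record = event.Records && event.Records[0];
    if (record && record.Sns) return 'sns'; // SNS notification record
    if (record && record.s3) return 's3';   // direct S3 object event
    return 'cron';                          // e.g. a scheduled trigger with no Records
}

console.log(eventSource({ Records: [{ Sns: { Message: '{}' } }] })); // sns
console.log(eventSource({ Records: [{ s3: { object: { key: 'DIRECT/file.csv' } } }] })); // s3
console.log(eventSource({})); // cron
```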
Example Use Case
Imagine a system where files are frequently uploaded to an S3 bucket for processing. The watchBucket function allows you to:
- Automatically listen to file upload events in the S3 bucket.
- Apply custom processing logic to the files.
- Handle any errors by moving files to appropriate directories for retries or logging.
Error Handling
- Automatic Retry Logic: The function automatically retries processing by moving files to failed-once and failed-twice directories before finally moving them to an error directory if all attempts fail.
- Custom Error Handling: Provide a custom errorHandlerPerFile function to control error handling behavior on a per-file basis.
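The retry progression described above can be sketched as a small function (folder names are taken from this document; the exact key handling inside watchBucket may differ):

```javascript
// Sketch: where a failed file moves next, per the retry convention.
function nextFolder(key) {
    if (key.includes('failed-twice/')) return 'error';       // third failure: give up
    if (key.includes('failed-once/')) return 'failed-twice'; // second failure
    return 'failed-once';                                    // first failure
}

console.log(nextFolder('imports/file.csv'));      // failed-once
console.log(nextFolder('failed-once/file.csv'));  // failed-twice
console.log(nextFolder('failed-twice/file.csv')); // error
```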