# event-service-tasks
Task and Process code for `event-service` to support tasks, including for customer tenants.
This repository is meant to be turned into a Docker image, and then made available as a Task in AWS ECS/Fargate, Docker, or Kubernetes via `event-service`. That service has Triggers that monitor S3 uploads (from SFTP) and other events and then launch the code in this repository to deal with files as they are uploaded.
The main entry point is expected to be the `sbin/runTask.sh` shell script, to which the task spawned by `event-service` should provide a further parameter to distinguish which file is being processed.
## Adding/Updating New Processes/Tasks/Handlers
- Modify `sbin/runTask.sh` to launch custom code when called with a unique argument (such as a task name).
- Add custom code to handle the processing that can be called from the `runTask.sh` script.
## Environment Variables
The following environment variables will be provided by `event-service` (refer to the `launch` method of the `FeedModel` object in `event-service`):
- `LOG_LEVEL` - Set to the current logging level that `event-service` is at, defaulting to `info`. Possible values are `fatal`, `error`, `warn`, `info`, `debug`, `trace` or `silent`.
- `LOG_TIMESTAMP` - Set to the current timestamp flag within `event-service`, defaulting to `false`. Possible values are `true` and `false` -- where `true` will cause log entries to include a timestamp.
- `AIX_FEED_ID` - String value for the unique Feed identifier under which the currently executing Task is being run.
- `AIX_REQUEST_ID` - String value for the unique request identifier under which the currently executing Task is being run. (Best practice is to include this value in your log statements: `const log = require('@trade-platform/log').child({requestid: process.env.AIX_REQUEST_ID});`)
- `AIX_TRIGGERING_EVENT` - JSON-encoded string for the event object that triggered the Task.
- `AIX_BASE_URL` - The base URL for the `event-service` that is invoking the Task. This can be used with the [event-client](https://www.npmjs.com/package/@trade-platform/event-client) library to communicate information back to `event-service`.
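For example, a new script might start by wiring `AIX_REQUEST_ID` into its logger (as above) and parsing the triggering event. A minimal sketch, assuming nothing beyond the variables listed here:

```typescript
// Minimal sketch of consuming the variables above. Only the logger pattern is
// taken from this README; everything else is plain Node/TypeScript.
const log = require('@trade-platform/log').child({
  requestid: process.env.AIX_REQUEST_ID,
});

// AIX_TRIGGERING_EVENT arrives as a JSON-encoded string.
const triggeringEvent = process.env.AIX_TRIGGERING_EVENT
  ? JSON.parse(process.env.AIX_TRIGGERING_EVENT)
  : undefined;

log.info(
  { feedId: process.env.AIX_FEED_ID, baseUrl: process.env.AIX_BASE_URL },
  'task starting'
);
```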
The `runTask.sh` script will also pull the following values from the AWS SSM Parameter Store and export them as environment variables:

- `AIX_DB_URI` - a URI, of the form `mysql://username:password@host:port/database`, for reaching a dedicated MySQL database.
- `AUTH0_TENANT` - the Auth0 tenant, needed to obtain a token for `event-service`.
- `AUTH0_AUDIENCE` - the Auth0 audience, needed to obtain a token for `event-service`.
- `AUTH0_CLIENT_ID` - the Auth0 client id, needed to obtain a token for `event-service`.
- `AUTH0_CLIENT_SECRET` - the Auth0 client secret, needed to obtain a token for `event-service`.

Note that the above values can differ per environment, since the value of the `NODE_ENV` environment variable is taken into account when doing the AWS SSM Parameter Store lookup. Refer to `sbin/setSecrets.sh` for pushing these values up to the parameter store.
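For reference, the Auth0 values map onto a standard client-credentials token request. The sketch below is only an illustration of that generic flow, not this repository's own token helper, and it assumes `AUTH0_TENANT` holds the full tenant domain:

```typescript
// Generic Auth0 client-credentials flow (illustration only); assumes
// AUTH0_TENANT is the full tenant domain, e.g. "example.us.auth0.com".
async function getEventServiceToken(): Promise<string> {
  const response = await fetch(`https://${process.env.AUTH0_TENANT}/oauth/token`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({
      grant_type: 'client_credentials',
      client_id: process.env.AUTH0_CLIENT_ID,
      client_secret: process.env.AUTH0_CLIENT_SECRET,
      audience: process.env.AUTH0_AUDIENCE,
    }),
  });
  const { access_token } = await response.json();
  return access_token;
}
```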
## Debugging
Things go wrong. The code in this repository is (should be) designed to help make debugging problems a little easier.
- You can check out the code repository, configure a `.env` file in the root directory with necessary environment variables defined (see `sample.env`), and execute any of the scripts. (Note this means any new scripts added should support the `dotenv` convention, as shown below.)
- You can also launch the specific Docker container version (or even the AWS ECS Task), with proper environment variables defined, against a local or testing instance.
- The scripts should use `@trade-platform/log` to log output, and you can adjust the `LOG_LEVEL` environment variable to get more (or less) information.
- The logs generated by the scripts should end up in CloudWatch and then LogDNA, and each entry should include a unique `requestid` and other values to assist with tracing. (Note this means any scripts added should support putting `AIX_REQUEST_ID` from the environment into all log messages.)
- Script output should also end up in the Activity object associated with the Feed in `event-service`, so each invocation of a Task should have script output available for debugging.
- Scripts should be designed to be run multiple times with the same data and not create problems ([idempotent](https://en.wikipedia.org/wiki/Idempotence)).
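Supporting the `dotenv` convention is typically a one-liner at the top of a script's entry file (shown here with the public `dotenv` package; the `Application` base class may already take care of this, so check before duplicating it):

```typescript
// Load a local .env file (if present) before anything reads process.env.
import 'dotenv/config';

// From here on, process.env.LOG_LEVEL, process.env.AIX_REQUEST_ID, etc. look
// the same as they would when the Task is launched by event-service.
```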
## Building the Docker Image

```
docker build -t event-handler-tasks:latest .
```
## I want to...
### ...add a new client
- Figure out a "short name" for the client -- preferably one that is used everywhere that client is referred to.
- Create a directory under `src/tasks/` for that "short name".
- Add the processing script(s) to `src/tasks/<short name>`.
- Update `sbin/runTask.sh` to add a unique string to the `switch` statement. This string should start with the "short name", and the body of the `case` should execute the script you created (along with any command-line parameters).
- Make sure there is a Feed for the client defined in `event-service`. You can use Postman/Insomnia/etc. to access the API, or you can use the `es` CLI from `@trade-platform/event-service-lib`.
- Add a Trigger to the Feed that uses the `latest` tag on the (TBD: Docker image name) image with the unique string as a command-line argument to the `runTask.sh` script.
- Test by crafting an event that will match the Trigger and sending that to the `event-service` API (again, using Postman/etc. or the `es` CLI). You should be able to find output logs in CloudWatch, LogDNA, and/or by using the API to request Activity objects on the Feed.
### ...write a basic processing script
- In `src/tasks/<short name>/`, add a TypeScript file that has a `class` that `extends Application`, and `import { Application } from '@/utils/Application'`.
- Implement the `run` method and have the script do what you need. Be sure to explore the methods available in the `Application` base class for getting a database handle, an `AIXEventClient` object, sending content to SFTP or S3, etc.
- Test by creating a `.env` file with the values you need to have set. You'll need to manually craft all of the environment variables normally set by `event-service` and `runTask.sh`. Then execute your script via `ts-node <script>`.
Examples: see anything in `src/tasks/fsis` or `src/tasks/ifpartners`.
### ...write a script that processes a file uploaded via SFTP line by line
- Extend `ReadlineApplication` instead of `Application`, then implement the `onLine` and `onClose` methods instead of `run`. The `onLine` method will be called once for each line in the file, and then `onClose` called once all of the lines have been processed.
- To test, your `.env` will need an `AIX_TRIGGERING_EVENT` that has a stringified `S3EventModel` (from `@trade-platform/event-service-lib`) that points to a file in S3. Make sure while testing that your general AWS CLI environment is set correctly (via `AWS_DEFAULT_PROFILE` or `AWS_ACCESS_KEY`, etc.).
Example: see `src/tasks/fsis/friends_and_family_generate_accounts.ts`.
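A skeleton of the shape this takes; the import path and method signatures are assumptions, so use the fsis example above as the source of truth:

```typescript
// Sketch only -- import path and method signatures are assumed.
import { ReadlineApplication } from '@/utils/ReadlineApplication';

class ExampleLineProcessor extends ReadlineApplication {
  private lineCount = 0;

  // Called once for each line of the uploaded file.
  onLine(line: string): void {
    this.lineCount += 1;
    // ... parse and handle the line
  }

  // Called once after all lines have been processed.
  onClose(): void {
    // ... flush results, write a summary, etc.
  }
}
```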
### ...write a script that processes a CSV file uploaded via SFTP
- Extend `CsvApplication` instead of `Application`, then implement the `onRecord` and `onClose` methods instead of `run`. The `onRecord` method will be called once for each record in the file, and then `onClose` called once all of the records have been processed.
- To test, your `.env` will need an `AIX_TRIGGERING_EVENT` that has a stringified `S3EventModel` (from `@trade-platform/event-service-lib`) that points to a file in S3. Make sure while testing that your general AWS CLI environment is set correctly (via `AWS_DEFAULT_PROFILE` or `AWS_ACCESS_KEY`, etc.).
Example: see `src/utils/CsvUploaderApplication.ts` and how it implements `onRecord` and `onClose`.
### ...write a script that uploads a CSV file directly into MySQL
- Extend `CsvUploaderApplication` instead of `Application`, then implement the `tableName` and `dataSpec` properties.
- The `tableName` property should be a string that is the name of the table to create/update in MySQL. Stick to a naming convention where the "short name" of the client is the first part of the table name (ie: `fsis_accounts` or `ifpartners_owners`).
- The `dataSpec` property is an array of `DataSpec` objects, one per field in the CSV record (or extra calculated field).
  - The `csv` value is the field name in the CSV record
  - The `db` value is the field name as you want it in the database table
  - The `type` value is the SQL datatype (excluding modifiers like `NULL` or `PRIMARY KEY`)
  - The `key` value is a boolean flag to indicate the (composite) primary key
  - The `virtual` flag indicates if the field appears in the database but not the CSV file (a constant, calculated or derived value, for example)
  - The `value` entry is an optional function used to set a constant, calculate a value, etc.
- To test, your `.env` will need an `AIX_TRIGGERING_EVENT` that has a stringified `S3EventModel` (from `@trade-platform/event-service-lib`) that points to a file in S3. Make sure while testing that your general AWS CLI environment is set correctly (via `AWS_DEFAULT_PROFILE` or `AWS_ACCESS_KEY`, etc.).
Note: The framework is implemented to favor handling large files, rather than being particularly fast.
Examples: see `src/tasks/ifpartners/upload-*.ts`.
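Putting that together, a hedged sketch of what an upload task might look like; the import path and exact `DataSpec` typing are assumptions, so verify against `src/utils/CsvUploaderApplication.ts` and the existing `upload-*.ts` scripts before copying:

```typescript
// Sketch only -- field names follow the description above; verify against
// src/utils/CsvUploaderApplication.ts before copying.
import { CsvUploaderApplication } from '@/utils/CsvUploaderApplication';

class UploadExampleAccounts extends CsvUploaderApplication {
  // Table name starts with the client "short name".
  tableName = 'example_accounts';

  dataSpec = [
    // csv: field in the CSV record; db: column name; type: SQL datatype.
    { csv: 'Account Number', db: 'account_number', type: 'VARCHAR(32)', key: true },
    { csv: 'Account Name', db: 'account_name', type: 'VARCHAR(255)' },
    { csv: 'Balance', db: 'balance', type: 'DECIMAL(18,2)' },
    // virtual: column exists in the table but not in the CSV; value supplies it.
    { db: 'loaded_at', type: 'DATETIME', virtual: true, value: () => new Date() },
  ];
}
```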
### ...split my processing across several scripts
- To have one processing script call another, use the `spawnSubTask` method found in the `Application` base class.
- This method takes an environment override, so you can pass a different Event using the `AIX_TRIGGERING_EVENT` environment variable.
- This method also takes an array of strings to pass to the `runTask.sh` script, which means you may need to modify `runTask.sh` to add a new `case` to the `switch` statement.
Examples: see `src/tasks/ifpartners/upload-account-owner.ts` and `src/tasks/ifpartners/export-contacts.ts`.
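A hedged sketch of chaining one task to another; the argument order and the shape of the environment override passed to `spawnSubTask` are assumptions based on the description above, so verify against the `Application` base class:

```typescript
import { Application } from '@/utils/Application';

// Illustrative parent task that hands work off to a second script.
class ExampleParentTask extends Application {
  async run(): Promise<void> {
    // Event payload to pass along to the child task (shape is illustrative).
    const derivedEvent = { note: 'illustrative payload' };

    // Argument order and override shape are assumptions; check spawnSubTask's
    // actual signature in the Application base class.
    await this.spawnSubTask(
      ['exampleclient-export-contacts'], // arguments handed to runTask.sh
      { AIX_TRIGGERING_EVENT: JSON.stringify(derivedEvent) } // environment override
    );
  }
}
```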
### ...send a file somewhere via SFTP
- Use the `sendToSFTP` method of the `Application` base class.
Examples: see `src/tasks/ifpartners/export-contacts.ts`.
### ...upload a file to an S3 bucket
- Use the `sendToS3` method of the `Application` base class.
Examples: see `src/tasks/fsis/friends_and_family_accounts_report.ts`.
## Tasks

- FSIS / FS Investments
  - Friends and Family
- IFP / IFPartners
> **NOTE**: Make sure your PR titles are semrel formatted when you squash merge.