
ea-aws-service-connectors

v2.0.1

A microservice-related connector for Amazon Web Services using the AWS SDK for JavaScript


EA AWS Service Connectors

SYNOPSIS

The purpose of these Amazon Web Services connectors is to provide an easy and reusable way for Node.js microservices to utilise AWS services using common configuration as far as possible. The following functionality is currently provided:

  • Calling APIs deployed to AWS API Gateway.
  • Basic creation and deletion of DynamoDb tables and global secondary indexes.
      • Creation and deletion of local secondary indexes is not currently supported.
      • Global secondary indexes are currently limited to projecting all attributes.
  • Basic CRUD operations on DynamoDb tables.
  • Basic support for sending messages to Amazon SQS queues.
      • This connector is designed to work with the EA SQS Queue Receiver Service.
  • Basic S3 upload and retrieval.

This module can be used independently or as a dependency of microservices spawned by the EA Services Controller module.

Prerequisites

Node.js v4.x.x or above

NPM based installation

npm i --save ea-aws-service-connectors

Configuration

Introduction

The module uses the following JSON configuration files:

  • aws.json - This file is used to configure details of the AWS account containing resources to be utilised.
  • databases.json - This file is used to configure details of available Amazon DynamoDb instances.
  • queues.json - This file is used to configure the queue receivers and SQS queues.
  • service-endpoints.json - This file is used to configure details of service API endpoints deployed to Amazon API Gateway.

Environment variables

Substitution in configuration files

All configuration file values can be set using environment variable substitution. Environment variable substitution is specified through the use of a double-quote-delimited string containing the environment variable name enclosed in angled brackets; for example, "<<AWS_REGION>>". Multiple environment variable substitutions within the same string are supported; for example, "<<QUEUE_PREFIX>>sample_queue<<QUEUE_SUFFIX>>".
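A minimal sketch of this substitution convention (not the module's actual implementation) could look like the following; the optional whitespace in the pattern also covers samples such as "<< TARGET_ENV_NAME >>" that include spaces inside the brackets:

```javascript
// Illustrative sketch of "<<ENV_VAR>>" substitution; the real module's
// internals may differ.
function substituteEnvVars (value, env) {
  env = env || process.env
  return value.replace(/<<\s*([A-Z0-9_]+)\s*>>/g, function (match, name) {
    // Leave the placeholder untouched if the variable is unset so the
    // failure is visible in the resulting configuration.
    return env[name] !== undefined ? env[name] : match
  })
}

// Example: multiple substitutions within the same string.
var sampleEnv = { QUEUE_PREFIX: 'dev_', QUEUE_SUFFIX: '_v1' }
console.log(substituteEnvVars('<<QUEUE_PREFIX>>sample_queue<<QUEUE_SUFFIX>>', sampleEnv))
// dev_sample_queue_v1
```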

Multi-environment deployments

This module supports multi-environment deployments through the following pair of environment variables:

  • TARGET_ENV_NAME - This environment variable should be set to the name of the environment to which the module is deployed; for example, dev.
  • AVAILABLE_TARGET_ENV_NAMES - This environment variable should be set to a JSON-compatible array of the environment names to which the module can be deployed. When set from a shell, the value MUST be enclosed in single quotes. For example: '["dev","test","prod"]'.

This pair of environment variables must be used together or not at all. If the value of TARGET_ENV_NAME is not contained within the value of AVAILABLE_TARGET_ENV_NAMES, the module will shut down. TARGET_ENV_NAME can be used in configuration files to provide consistent naming conventions between environments. For example, queue names can be prefixed with TARGET_ENV_NAME.
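The startup rule described above can be sketched as follows (illustrative only; the real module shuts the process down on failure, which this sketch signals by returning false instead):

```javascript
// Hedged sketch of the multi-environment startup check described above.
function validateTargetEnv (env) {
  var target = env.TARGET_ENV_NAME
  var availableRaw = env.AVAILABLE_TARGET_ENV_NAMES
  // Both variables must be used together or not at all.
  if (target === undefined && availableRaw === undefined) return true
  if (target === undefined || availableRaw === undefined) return false
  var available
  try {
    // e.g. AVAILABLE_TARGET_ENV_NAMES='["dev","test","prod"]'
    available = JSON.parse(availableRaw)
  } catch (e) {
    return false
  }
  return Array.isArray(available) && available.indexOf(target) !== -1
}
```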

Configuration file location

The JSON configuration files MUST be placed in the same directory as each other.

  • If the module is used as a dependency of microservices spawned by the EA Services Controller module, it is recommended that all configuration files be placed in the config directory of the EA Services Controller module.
      • In this scenario, the environment variable SERVICES_CONFIG_LOCATION MUST be set to the path of the chosen directory.
  • If the module is used independently, the module assumes by default that the JSON configuration files are located in the config directory of this module.
      • In this scenario, if you place the JSON configuration files in a different directory, the environment variable SERVICES_CONFIG_LOCATION MUST be set to the path of the chosen directory.

Configuration file validation

All configuration files used by the module are validated at startup against Joi schemas available in the EA Service Schemas repository. If a configuration file fails validation, the module will shut down.
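Since the Joi schemas live in a separate repository, a dependency-free stand-in for this kind of startup check (the function and the single field it checks are illustrative, not the module's actual schema) might look like:

```javascript
// Hedged sketch of startup validation; the real module uses Joi schemas
// from the EA Service Schemas repository. This plain-JS check stands in
// so the example stays dependency-free.
function validateAwsConfig (config) {
  if (!config || typeof config.aws !== 'object' || config.aws === null) return false
  // A region is the one field every AWS SDK call ultimately needs.
  return typeof config.aws.region === 'string' && config.aws.region.length > 0
}
```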

Configuration file samples

aws.json

This file contains a subset of options used to provide access to AWS services through the AWS SDK for JavaScript.

  aws": {
    "accessKeyId" :"<<AWS_ACCESS_KEY_ID>>",
    "secretAccessKey":"<<AWS_SECRET_ACCESS_KEY>>",
    "accountId": "<<AWS_ACCOUNT_ID>>",
    "sessionToken": "<<AWS_SESSION_TOKEN>>",
    "region": "<<AWS_REGION>>"
    // The maximum amount of retries to perform for a service request.
    "maxRetries": 20,
    "retryDelayOptions": {
      // The base number of milliseconds to use in the exponential backoff for operation retries.
      "base": 100
    }    
  }

databases.json

This file contains details of available Amazon DynamoDb instances.

{
  "databases": {
    // The identifier microservices must use to access the database details.
    "sampleDynamoDbConfig": {
      "dynamoDb": {
        // Optional override of the default DynamoDb endpoint.
        "endpoint": "dynamodb.eu-west-1.amazonaws.com",
        // Optional override of the default DynamoDb region.
        "region": "eu-west-1",
        // The DynamoDb table name.
        "tableName": "<< TARGET_ENV_NAME >>_sample_table",
        // Optional table definition that is only needed if the table needs to be created/deleted at runtime.
        // The definition structure follows the standard AWS structure as closely as possible, but the leading character of each key is lower case rather than upper case.
        // This maintains consistency with the rest of the configuration file.
        "tableDefinition" : {
          "attributeDefinitions": [
            {
              "attributeName": "id",
              "attributeType": "S"
            },
            {
              "attributeName": "sampleAttribute",
              "attributeType": "S"
            }
          ],
          "keySchema": [
            {
              "attributeName": "id",
              "keyType": "HASH"
            }
          ],
          "provisionedThroughput": {
            "readCapacityUnits": 5,
            "writeCapacityUnits": 5
          },
          "globalSecondaryIndexes": [
            {
              "indexName": "sampleAttributeIndex",
              "keySchema": [
                {
                  "attributeName": "sampleAttribute",
                  "keyType": "HASH"
                }
              ],
              "projection": {
                "projectionType": "ALL"
              },
              "provisionedThroughput": {
                "readCapacityUnits": 5,
                "writeCapacityUnits": 5
              }
            }
          ]
        }
      }
    }, ...................... continue block for each DynamoDb instance.
  }
}

service-endpoints.json

This file contains service API endpoint definitions.

  "serviceEndpoints": {
    // The name of the service endpoint definition.  
    "sample": {
          // The service API URL.
          "endpointUrl": "<<SAMPLE_API_URL>>/<<TARGET_ENV_NAME>>/sample",
          // The API Gateway key
          "awsApiGateWayAuthorizationToken": "<<SAMPLE_API_KEY>>",
          // The name of an optional HTTP GET request parameter to be passed to service API endpoint calls. See the *Current limitations* section for further details.
          "endpointParam": "id",
          // Number of times to try calling the service API endpoint before placing the message on the error queue.
          "endpointRetries": 1,
          // Timeout (in milliseconds) to be used when calling the service endpoint API.
          "endpointTimeOut": 10000,
          // The amount of time to wait (in milliseconds) before trying to call the service endpoint API after a failed call.
          "endpointRetryWaitMillisecs":1000
    }, ...................... continue block for each endpoint
  }

queues.json

This file contains an array of queue definitions. It is expected that the configuration file will contain two queue definitions per service endpoint API (one to process messages bound for the service API endpoint and one to receive messages that failed to be processed by the service API endpoint). An example configuration fragment is shown below:

    "queues": [
     {
        // Name of the queue as seen in PM2 - note this isn't the actual queue name
       "name": "sample_queue",
       // Path to the queue receiver implementation to run
       "script": "./lib/services/sqs-queue-receiver.js",
       // The actual queue name to start receiving messages from
       "queueName": "<<TARGET_ENV_NAME>>_sample_queue",
       // The AWS Region name
       "region": "<<AWS_REGION>>",
       // Boolean debug flag for individual queue - console log initialised if true
       "debug": false,
       // Maximum number of messages to read at once - MIN 1 - MAX 4
       "maxNumberOfMessages": 4,
       // Concurrency will spawn cluster instances of the queue receiver
       "concurrency": 1,
       // Integer duration (in seconds) for which the call will wait for a message to arrive in the queue before returning. If a message is available, the call will return sooner than WaitTimeSeconds. Default is 20
       "waitTimeSeconds" : 10,
       // Maximum number of messages to return. Amazon SQS never returns more messages than this value but may return fewer. Default is 1
       "visibilityTimeout" : 30,
       // Integer duration (in seconds) that messages placed on the queue will be unavailable for processing initially.
       "messageDelaySecs": 1
       // PM2 log file location
       "out_file": "../ea-sqs-queue-receiver-service/logs/sample_queue/stndout.log"
       // PM2 error log location
       "error_file": "../ea-sqs-queue-receiver-service//logs/sample_queue/error_out.log"
       // The name of the endpoint service (as defined in service-endpoints.json) to be called when a message is received
       "serviceEndpoint": "sample",
       // A relative URL to be appended to the service API URL to form a service API endpoint.
       "serviceURI": "/pickuptaskfromqueue",
       // The HTTP method used to call the service endpoint API (get and post are supported currently. The default is get if the method is unspecified).
       method: 'post',
       // The max memory restart setting for PM2
       "maxMemoryRestart": "100M",
       // The name of the error queue to send failed messages to for replay.
       "errorQueueName": "sample_queue_replay
     },
     // The error queue definition associated with sample_queue.
     {
       "name": "sample_queue_replay",
       "script": "./lib/services/sqs-queue-receiver.js",
       "queueName": "<<TARGET_ENV_NAME>>_sample_queue_replay",
       "debug": false,
       "maxNumberOfMessages": 1,
       "concurrency": 1,
       "waitTimeSeconds": 3,
       "visibilityTimeout": 60,
       // Replaying of messages placed on the error queue is delayed for a period of time in case the service API endpoint is unable to process the message temporarily.
       "messageDelaySecs": 180,
       "out_file": "../ea-sqs-queue-receiver-service/logs/sample_queue_replay/stndout.log",
       "error_file": "../ea-sqs-queue-receiver-service/logs/sample_queue_replay/error_out.log",
       // Messages on the error queue are not associated with a service API endpoint.
       "serviceEndpoint": null,
       "serviceURI": null,
       "method": "post",
       "maxMemoryRestart": "100M",
       // An error queue should not transfer messages to another error queue.
       "errorQueueName": null,
       // The  queue that messages should be replayed to after the configured delay.
       "replayToQueue": "sample_queue"
     }
     ...................... continue block for each pair of queues
   }

Coding examples for using the connectors

Dynamo DB Database examples

// Initialize the connector
var connectors = require('ea-aws-service-connectors')

// Create a DynamoDb database table based on the given config
var dbDefinition = new connectors.DynamoDbDefinition('sampleDynamoDbConfig')
dbDefinition.create(function (err, result) {...})
// Wait for the table to be created.
...
// Describe the table
dbDefinition.describe(function (err, result) {...})

// Prepare to access a DynamoDb database table based on the given config
var db = new connectors.DynamoDb('sampleDynamoDbConfig')

// Retrieve an item from the database table by hash key.
db.get(id, function (err, result) {...})

// Create a new item in the database table using payload
db.create(payload, function (err, result) {...})

// Update an item within the database table using payload
// @param {Object} payload to update in the db; it should contain the id
db.update(payload, function (err, result) {...})

// Run a query and return an array of json items back from the database based on the query params
// @param {Object} query json object key values to get the items back from the database
db.query(query, function (err, result) {...})

// Perform a full scan of the database table from the specified exclusive start key onwards.
db.fullScan(exclusiveStartKey, function (err, result) {...})

// Query a global index by hash key.
db.queryGlobalIndexByHashKey(indexName, hashKeyValue, function (err, result) {...}) 

// Delete the item identified by the specified hash key from the database table. 
db.delete(id, function (err, result) {...})

// Delete the table.
dbDefinition.delete(function (err, result) {...})

SQS Queue example

// Initialize the connector
var connectors = require('ea-aws-service-connectors')

//Initialize the queue handler
var queue = new connectors.Queue()

// Send a message to the specified queue.
// IMPORTANT - The queue to send messages to MUST be specified using the queue definition name within queues.json.  This is NOT the actual queue name.
// Please see the queues.json sample configuration for further details.
queue.send(message, 'sample_queue', function (err, result) {})

API Gateway example

// Initialize the connector
var connectors = require('ea-aws-service-connectors')

//Initialize the api handler
var api = new connectors.ApiServiceCall()

// Call the API gateway using an HTTP GET request to a defined URI
// IMPORTANT - The API to call MUST be identified using the service endpoint definition name within service-endpoints.json.
// Please see the service-endpoints.json sample configuration for further details.
api.get('sample', uri, function (errs, results) {})

S3 example

// Initialize the connector
var connectors = require('ea-aws-service-connectors')

// Initialize the S3 connector for a specific S3 bucket.
var s3Bucket = new connectors.S3(process.env.S3_BUCKET_NAME)

// Put an item in the bucket using the specified key
s3Bucket.put(key, item, function (errs, results) {})

// Retrieve the item associated with the specified key from the bucket.
s3Bucket.get(key, function (errs, results) {})

// Retrieve the list of items for the specified key from the bucket.
s3Bucket.list(key, function (errs, results) {})

// Move an item from source to target location.
s3Bucket.move(source, target, function (errs, results) {})

Contributing to this project

If you have an idea you'd like to contribute please log an issue. All contributions should be submitted via a pull request.

License

THIS INFORMATION IS LICENSED UNDER THE CONDITIONS OF THE OPEN GOVERNMENT LICENCE found at:

http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3

The following attribution statement MUST be cited in your products and applications when using this information.

Contains public sector information licensed under the Open Government Licence v3.0

About the license

The Open Government Licence (OGL) was developed by the Controller of Her Majesty's Stationery Office (HMSO) to enable information providers in the public sector to license the use and re-use of their information under a common open licence.

It is designed to encourage use and re-use of information freely and flexibly, with only a few conditions.