
llama-cli

v1.0.0


                   V
                  /'>>>
                 /*/
                / /
               /*/
              / /
      -------/*/   _____ _____ _____ _____ _____    __    __    _____ _____ _____
   --/  *  * */   |     |  |  |  _  |     |   __|  |  |  |  |  |  _  |     |  _  |
    /* * *  */    |   --|     |     |  |  |__   |  |  |__|  |__|     | | | |     |
    -  --- -/     |_____|__|__|__|__|_____|_____|  |_____|_____|__|__|_|_|_|__|__|
     H    H
     H    H
     --   --

Meet Chaos Llama

Chaos Llama is a small tool for testing resiliency and recoverability of AWS-based architectures. Once configured and deployed, it will randomly terminate or otherwise interfere* with the operation of your EC2 instances and ECS tasks. It is inspired by Netflix's Chaos Monkey. Think of it as Chaos Monkey rebuilt with 2016 tech.

Installation

npm install -g llama-cli

Setting Up

AWS Configuration

An IAM user and a role for the Lambda function need to be set up first.

IAM User

An IAM user must be created and its credentials stored in ~/.aws/credentials.
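
A credentials file typically looks like this (the profile name and keys below are placeholders):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY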

Lambda Role

Required policies:

  • AmazonEC2FullAccess
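
If you are creating the role by hand, a rough sketch with the AWS CLI might look like the following (the role name is illustrative; the trust policy simply allows Lambda to assume the role):

aws iam create-role --role-name chaos-llama-lambda \
  --assume-role-policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Principal":{"Service":"lambda.amazonaws.com"},"Action":"sts:AssumeRole"}]}'

aws iam attach-role-policy --role-name chaos-llama-lambda \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess

The ARN of the resulting role is what you pass to llama deploy below.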

Setting up Chaos Llama

To create the AWS Lambda function run:

llama deploy -r <lambda-role-arn>

This will deploy Chaos Llama to AWS and create a state file (llama_config.json), which is needed for subsequent re-deploys. Llama will be configured to run once an hour, but by default it won't take any action when it runs.

To configure termination rules, run deploy with a Llamafile:

llama deploy -c Llamafile.json

Llamafile.json

Example Llamafile:

{
  "interval": "60",
  "enableForASGs": [
  ],
  "disableForASGs": [
  ]
}

Options:

  • interval (in minutes) - how frequently Chaos Llama should run. Minimum value is 5. Default value is 60.
  • enableForASGs - whitelist of names of ASGs to pick an instance from. Instances in other ASGs will be left alone. Empty list ([]) means Chaos Llama won't do anything.
  • disableForASGs - names of ASGs that should not be touched; instances in any other ASG are eligible for termination.

If both enableForASGs and disableForASGs are specified, then only enableForASGs rules are applied.
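
For example, a Llamafile like the one below (the ASG names are purely illustrative) would make Chaos Llama run every 30 minutes and only ever pick instances from the two staging ASGs:

{
  "interval": "30",
  "enableForASGs": [
    "staging-web-asg",
    "staging-worker-asg"
  ],
  "disableForASGs": [
  ]
}

Deploy it with llama deploy -c Llamafile.json as shown above.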

Chaos Llama vs Chaos Monkey

Chaos Llama is inspired by Netflix’s Chaos Monkey. Curious about the differences? Here’s a handy summary:

| Llama | Monkey |
|:------|:-------|
| Serverless (runs on AWS Lambda), no maintenance | Needs EC2 instances to run on |
| Extremely easy to deploy | Needs quite a bit of setup and config |
| Small codebase, easy to understand and extend (<400 SLOC) | Large codebase (thousands of SLOC) |
| Written in JS | Written in Java |
| New on the scene | Mature project |
| Small featureset | Many features |
| ECS support in the works | Does not support ECS |
| Open source under MPL 2.0 / MIT | Open source under Apache 2.0 |
| Developed by Shoreditch Ops | Developed by Netflix |

Why Use Chaos Llama?

Failures happen, and they inevitably happen when least desired. If your application can't tolerate a system failure, would you rather find out by being paged at 3am, or after you are in the office having already had your morning coffee? Even if you are confident that your architecture can tolerate a system failure, are you sure it will still be able to next week? How about next month? Software is complex and dynamic; that "simple fix" you put in place last week could have undesired consequences. Do your traffic load balancers correctly detect and route requests around system failures? Can you reliably rebuild your systems? Perhaps an engineer "quick patched" a live system last week and forgot to commit the changes to your source repository?

(source: Chaos Monkey wiki)

Further reading: Principles Of Chaos Engineering

Current Limitations

Supported AWS Regions

Chaos Llama will only work in these regions (due to a limitation with AWS Lambda Schedules):

  • US East (Northern Virginia)
  • US West (Oregon)
  • Europe (Ireland)
  • Asia Pacific (Tokyo)

Features

Right now, Chaos Llama only knows how to terminate instances and does not support more advanced interference modes, like introducing extra latency.

License

MPL 2.0 - see LICENSE.txt for details.

The lambda/index.js file is dual-licensed under MPL 2.0 and MIT and can be used under the terms of either of those licenses.


A project by Shoreditch Ops, creators of artillery.io - simple & powerful load-testing with Node.js