
@reason.co/pipelines-support

v1.5.1

Provides build scripts for use with pipelines


██████╗ ███████╗ █████╗ ███████╗ ██████╗ ███╗   ██╗    ██████╗ ██████╗ 
██╔══██╗██╔════╝██╔══██╗██╔════╝██╔═══██╗████╗  ██║   ██╔════╝██╔═══██╗
██████╔╝█████╗  ███████║███████╗██║   ██║██╔██╗ ██║   ██║     ██║   ██║
██╔══██╗██╔══╝  ██╔══██║╚════██║██║   ██║██║╚██╗██║   ██║     ██║   ██║
██║  ██║███████╗██║  ██║███████║╚██████╔╝██║ ╚████║██╗╚██████╗╚██████╔╝
╚═╝  ╚═╝╚══════╝╚═╝  ╚═╝╚══════╝ ╚═════╝ ╚═╝  ╚═══╝╚═╝ ╚═════╝ ╚═════╝ 
                                                                       
Pipeline Support Library (c) 2020 With Reason Ltd

-----------------------------------------------------------------------------

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.

----------------------------------------------------------------------------- 

Pipeline Support scripts

When a project contains multiple sub-services under a "mono repo" structure, this tooling automatically scans the subfolders within the service directory and invokes the relevant test or deploy scripts.

service/
├── s3-or-azure-service/
│   ├── [project files...]
│   └── package.json
└── serverless-service/
    ├── [project files...]
    └── serverless.yaml

Once correctly configured, this will deploy automatically based on the folder structure within your project and on recent changes relative to the given commit. It also checks YAML and JSON files to determine the correct deployment type, such as S3 or Serverless.
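
As an illustration only, the per-folder detection behaves roughly like the shell sketch below; the actual logic lives in index.js, and the exact file checks here are assumptions based on this readme:

for dir in service/*/; do
  # Assumed detection order: a serverless config wins, then package.json nodes
  if [ -f "$dir/serverless.yaml" ] || [ -f "$dir/serverless.yml" ]; then
    echo "$dir -> serverless deployment"
  elif grep -q '"s3-deployment"' "$dir/package.json" 2>/dev/null; then
    echo "$dir -> s3 static website deployment"
  elif grep -q '"azure-push-deployment"' "$dir/package.json" 2>/dev/null; then
    echo "$dir -> azure zip deployment"
  fi
done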

This readme uses Bitbucket Pipelines as the example, but the same commands can be run in any CI pipeline.

Environment variables

Credentials should be stored in "secured" environment variables in the Pipeline settings.

For Bitbucket, you can edit these in Repository Settings > Pipelines > Repository variables.

AWS

To deploy to AWS, ensure the SERVICE_ENV, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables are available.

Azure

To deploy a zip bundle to Azure scm, you will need the AZURE_DEPLOY_USER and AZURE_DEPLOY_PASS credentials populated with a service account.

Adding deployments to your project

In your pipeline, add the deployer:

Example for Bitbucket Pipelines (in RemoteDev):

pipelines:
  branches:
    master:
      - step:
          script:
            - export SERVICE_ENV=remotedev
            - export AWS_ACCESS_KEY_ID=$REMOTEDEV_AWS_ACCESS_KEY_ID
            - export AWS_SECRET_ACCESS_KEY=$REMOTEDEV_AWS_SECRET_ACCESS_KEY
            - node ./node_modules/@reason.co/pipelines-support/index.js test $BITBUCKET_COMMIT remotedev
            - node ./node_modules/@reason.co/pipelines-support/index.js deploy $BITBUCKET_COMMIT remotedev

Per the above, ensure SERVICE_ENV, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are available.

When invoking pipelines-support, please specify the following arguments, in order (the general form is shown after this list):

  • Job (test or deploy)
  • Commit used to check for file changes ($BITBUCKET_COMMIT is exposed by default by Bitbucket Pipelines; the equivalent variable will differ depending on the CI service you are using)
  • Stage to apply (such as remotedev)
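
Putting those together, the general form of an invocation is (placeholders in angle brackets):

node ./node_modules/@reason.co/pipelines-support/index.js <job> <commit> <stage>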

S3 static website deployments

Ensure your local package.json has an "s3-deployment" node:

"s3-deployment": {
    "remotedev": {
      "region": "eu-west-1",
      "bucket": "fqdn.bucketname-remotedev.com"
    },
    "staging": {
      "region": "eu-west-1",
      "bucket": "fqdn.bucketname-staging.com"
    },
    "production": {
      "region": "eu-west-1",
      "bucket": "fqdn.bucketname-production.com"
    }
  },

The stage specified when executing the script (such as remotedev) is matched against these keys, and the corresponding region and bucket are used for your deployment.
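
Conceptually, the resulting upload is similar to an s3 sync; this is an illustrative sketch only, and the ./build output folder is an assumption:

aws s3 sync ./build s3://fqdn.bucketname-remotedev.com --region eu-west-1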

This will automatically create some rules on your bucket, but others need to be added manually and double-checked.

Bucket Policy

Either in the GUI or via Terraform if applicable, ensure your bucket policy (under Permissions) allows s3:GetObject, as follows:

{
    "Version": "2012-10-17",
    "Id": "REASONDEPLOYPOLICY",
    "Statement": [
        {
            "Sid": "Stmt1508317523362",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::your-bucket-name-here/*"
        }
    ]
}
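
If you prefer the AWS CLI, the same policy can be applied from a file (the policy.json file name is an assumption):

aws s3api put-bucket-policy --bucket your-bucket-name-here --policy file://policy.json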

If using Terraform, please add the following (this assumes you have an aws_s3_bucket.www resource):

resource "aws_s3_bucket_policy" "policy" {
  bucket = aws_s3_bucket.www.id
  policy = <<POLICY
{
    "Version": "2012-10-17",
    "Id": "REASONDEPLOYPOLICY",
    "Statement": [
        {
            "Sid": "Stmt1508317523362",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::${aws_s3_bucket.www.id}/*"
        }
    ]
}
POLICY
}

Static Website Hosting Configuration

Also ensure that "Static website hosting" (under Properties) is enabled and that the index document is set to your index.html or equivalent.

If using Terraform, please add the following:

resource "aws_s3_bucket" "www" {
  bucket = var.bucket
  acl    = "public-read"
  website {
    index_document = "index.html"
    error_document = "error.html"
  }
}
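
Alternatively, a one-off AWS CLI configuration for an existing bucket looks like:

aws s3 website s3://your-bucket-name-here --index-document index.html --error-document error.html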

Azure ZIP deployments

As with the environment variables above, the AZURE_DEPLOY_USER and AZURE_DEPLOY_PASS credentials must be populated with a service account.

This does not auto-create services, so the target service must already exist before you push to it.

Ensure your local package.json has an "azure-push-deployment" node:

"azure-push-deployment": {
    "remotedev": {
      "service-name": "the-name-of-your-azure-service"
    }
  },
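
For reference, a ZIP push to the Azure SCM (Kudu) endpoint looks roughly like the following; the bundle.zip name is an assumption:

curl -u "$AZURE_DEPLOY_USER:$AZURE_DEPLOY_PASS" -X POST \
  --data-binary @bundle.zip \
  https://the-name-of-your-azure-service.scm.azurewebsites.net/api/zipdeploy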

Serverless deployments

If a Serverless YAML file is detected, this will trigger a serverless deployment after running an npm install.

If a stage is specified, a --stage STAGE_HERE flag is passed to the serverless deploy command.
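
In effect, for a stage of remotedev, the commands run per service are roughly (an illustrative sketch, not the literal implementation):

npm install -q
npx serverless deploy --stage remotedev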

Whilst this guide specifies AWS environment variables, any other PaaS supported by the Serverless Framework can be targeted, as long as the correct environment variables are supplied.

Other Job Types

Migrate DB

node ./node_modules/@reason.co/pipelines-support/index.js migrate-db $BITBUCKET_COMMIT remotedev

Will run the npm script migrate-db for any package.json files in service subfolders.

Test

node ./node_modules/@reason.co/pipelines-support/index.js test $BITBUCKET_COMMIT remotedev

Will run the npm script test for any package.json files in service subfolders.

Other deploy

node ./node_modules/@reason.co/pipelines-support/index.js deploy $BITBUCKET_COMMIT remotedev

If there is a deploy script in the package.json, this will be run.

Note that this will also trigger the S3/Azure deploys if the relevant properties are populated in package.json.

Scripts run by this task are, in order: npm install -q, npm run build (if a "build" script exists), and npm run deploy.
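
Written out as shell, with the conditional step marked:

npm install -q
npm run build   # only runs if a "build" script exists in package.json
npm run deploy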