
lighthouse-check

v0.0.3

Forked version of foo-software/lighthouse-check with multiple runs added.

@foo-software/lighthouse-check

An NPM module and CLI to run Lighthouse audits programmatically. This project adds bells and whistles to automated Lighthouse testing for DevOps workflows, and is easy to implement in a Continuous Integration or Continuous Delivery pipeline.

This project provides two ways of running audits: locally in your own environment, or remotely via the Automated Lighthouse Check API. For basic usage, running locally will suffice, but if you'd like to maintain a historical record of Lighthouse audits and use other features, you can run audits remotely by following the steps and examples below.


Install

npm install @foo-software/lighthouse-check

Usage

@foo-software/lighthouse-check provides several functionalities beyond standard Lighthouse audits. It's recommended to start with a basic implementation and expand on it as needed.

Basic Usage

Calling lighthouseCheck will run Lighthouse audits against https://www.foo.software and https://www.foo.software/contact.

import { lighthouseCheck } from '@foo-software/lighthouse-check';

(async () => {
  const response = await lighthouseCheck({
    urls: [
      'https://www.foo.software',
      'https://www.foo.software/contact'
    ]
  });

  console.log('response', response);
})();

Or via CLI.

$ lighthouse-check --urls "https://www.foo.software,https://www.foo.software/contact"

The CLI will log the results.

Automated Lighthouse Check API Usage

Automated Lighthouse Check can monitor your website's quality by running audits automatically. It provides a historical record of audits over time so you can track progression and degradation of website quality. Create a free account to get started. Audits run automatically, and you can also trigger additional ones yourself. Below are the steps to trigger audits on URLs you've created in your account.

Trigger Audits on All Pages in an Account

Basic example with the CLI

$ lighthouse-check --apiToken "abcdefg"

Trigger Audits on Only Certain Pages in an Account

  • Navigate to your account details, click into "Account Management" and make note of the "API Token".
  • Navigate to your dashboard and once you've created URLs to monitor, click on the "More" link of the URL you'd like to use. From the URL details screen, click the "Edit" link at the top of the page. You should see an "API Token" on this page. It represents the token for this specific page (not to be confused with an account API token).
  • Use the account token as the apiToken option and the page token (or comma-separated page tokens) as the urls option.

Basic example with the CLI

$ lighthouse-check --apiToken "abcdefg" \
  --urls "hijklmnop,qrstuv"

You can combine usage with other options for a more advanced setup. Example below.

Runs audits remotely and posts results as comments in a PR

$ lighthouse-check --apiToken "abcdefg" \
  --urls "hijklmnop,qrstuv" \
  --prCommentAccessToken "abcpersonaltoken" \
  --prCommentUrl "https://api.github.com/repos/foo-software/lighthouse-check/pulls/3/reviews"

Saving Reports Locally

When a report is saved, the output includes two lines: Report and Local Report. These values are populated when options are provided to save the report locally and/or to S3. These options are not required and can be used together or alone.

Example of saving a report locally:

import { lighthouseCheck } from '@foo-software/lighthouse-check';

(async () => {
  const response = await lighthouseCheck({
    // relative to the file. NOTE: when using the CLI `--outputDirectory` is relative
    // to where the command is being run from.
    outputDirectory: '../artifacts',
    urls: [
      'https://www.foo.software',
      'https://www.foo.software/contact'
    ]
  });

  console.log('response', response);
})();

Or via CLI.

$ lighthouse-check --urls "https://www.foo.software,https://www.foo.software/contact" \
  --outputDirectory "./artifacts"

Saving Reports to S3

import { lighthouseCheck } from '@foo-software/lighthouse-check';

(async () => {
  const response = await lighthouseCheck({
    awsAccessKeyId: 'abc123',
    awsBucket: 'my-bucket',
    awsRegion: 'us-east-1',
    awsSecretAccessKey: 'def456',
    urls: [
      'https://www.foo.software',
      'https://www.foo.software/contact'
    ]
  });

  console.log('response', response);
})();

Or via CLI.

$ lighthouse-check --urls "https://www.foo.software,https://www.foo.software/contact" \
  --awsAccessKeyId abc123 \
  --awsBucket my-bucket \
  --awsRegion us-east-1 \
  --awsSecretAccessKey def456

Implementing with Slack

Below is a basic Slack implementation. To see how you can include code versioning data in notifications (i.e. GitHub authors, PRs, branches, etc), see the CircleCI example.

import { lighthouseCheck } from '@foo-software/lighthouse-check';

(async () => {
  const response = await lighthouseCheck({
    slackWebhookUrl: 'https://www.my-slack-webhook-url.com',
    urls: [
      'https://www.foo.software',
      'https://www.foo.software/contact'
    ]
  });

  console.log('response', response);
})();

Or via CLI.

$ lighthouse-check --urls "https://www.foo.software,https://www.foo.software/contact" \
  --slackWebhookUrl "https://www.my-slack-webhook-url.com"

A more advanced implementation is detailed in the CircleCI example below.

Enabling PR Comments

Populate prCommentAccessToken and prCommentUrl options to enable comments on pull requests.

Enforcing Minimum Scores

You can use validateStatus to enforce minimum scores. This can be handy in a DevOps workflow, for example to fail a build when scores regress.

import { lighthouseCheck, validateStatus } from '@foo-software/lighthouse-check';

(async () => {
  try {
    const response = await lighthouseCheck({
      awsAccessKeyId: 'abc123',
      awsBucket: 'my-bucket',
      awsRegion: 'us-east-1',
      awsSecretAccessKey: 'def456',
      urls: [
        'https://www.foo.software',
        'https://www.foo.software/contact'
      ]
    });

    const status = await validateStatus({
      minAccessibilityScore: 90,
      minBestPracticesScore: 90,
      minPerformanceScore: 70,
      minProgressiveWebAppScore: 70,
      minSeoScore: 80,
      results: response
    });

    console.log('all good?', status); // 'all good? true'
  } catch (error) {
    console.log('error', error.message);

    // log would look like:
    // Minimum score requirements failed:
    // https://www.foo.software: Performance: minimum score: 70, actual score: 64
    // https://www.foo.software/contact: Performance: minimum score: 70, actual score: 44
  }
})();

Or via CLI. Important: the outputDirectory value must be defined and identical in both commands.

$ lighthouse-check --urls "https://www.foo.software,https://www.foo.software/contact" \
  --outputDirectory /tmp/artifacts
$ lighthouse-check-status --outputDirectory /tmp/artifacts \
  --minAccessibilityScore 90 \
  --minBestPracticesScore 90 \
  --minPerformanceScore 70 \
  --minProgressiveWebAppScore 70 \
  --minSeoScore 80

Implementing with CircleCI

In the below example we run Lighthouse audits on two URLs, save reports as artifacts, deploy reports to S3 and send a Slack notification with GitHub info. We defined environment variables like LIGHTHOUSE_CHECK_AWS_BUCKET in the CircleCI project settings.

This implementation utilizes a CircleCI Orb - lighthouse-check-orb.

version: 2.1

orbs:
  lighthouse-check: foo-software/lighthouse-check@x.y.z # pin the latest version

jobs:
  test: 
    executor: lighthouse-check/default
    steps:
      - lighthouse-check/audit:
          urls: https://www.foo.software,https://www.foo.software/contact
          # this serves as an example, however if the below environment variables
          # are set - the below params aren't even necessary. for example - if
          # LIGHTHOUSE_CHECK_AWS_ACCESS_KEY_ID is already set - you don't need
          # the line below.
          awsAccessKeyId: $LIGHTHOUSE_CHECK_AWS_ACCESS_KEY_ID
          awsBucket: $LIGHTHOUSE_CHECK_AWS_BUCKET
          awsRegion: $LIGHTHOUSE_CHECK_AWS_REGION
          awsSecretAccessKey: $LIGHTHOUSE_CHECK_AWS_SECRET_ACCESS_KEY
          slackWebhookUrl: $LIGHTHOUSE_CHECK_SLACK_WEBHOOK_URL

workflows:
  test:
    jobs:
      - test

Reports are saved as "artifacts".

Upon clicking the HTML file artifacts, we can see the full report!

In the example above we also uploaded reports to S3. Why would we do this? CI artifact storage is temporary, so if we want to persist historical data we can't rely on it.

Implementing with GitHub Actions

Similar to the CircleCI implementation, we can also create a workflow implementation with GitHub Actions using lighthouse-check-action. Example below.

.github/workflows/test.yml

name: Test Lighthouse Check
on: [push]

jobs:
  lighthouse-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - run: mkdir /tmp/artifacts
      - name: Run Lighthouse
        uses: foo-software/lighthouse-check-action@master
        with:
          accessToken: ${{ secrets.LIGHTHOUSE_CHECK_GITHUB_ACCESS_TOKEN }}
          author: ${{ github.actor }}
          awsAccessKeyId: ${{ secrets.LIGHTHOUSE_CHECK_AWS_ACCESS_KEY_ID }}
          awsBucket: ${{ secrets.LIGHTHOUSE_CHECK_AWS_BUCKET }}
          awsRegion: ${{ secrets.LIGHTHOUSE_CHECK_AWS_REGION }}
          awsSecretAccessKey: ${{ secrets.LIGHTHOUSE_CHECK_AWS_SECRET_ACCESS_KEY }}
          branch: ${{ github.ref }}
          outputDirectory: /tmp/artifacts
          urls: 'https://www.foo.software,https://www.foo.software/contact'
          sha: ${{ github.sha }}
          slackWebhookUrl: ${{ secrets.LIGHTHOUSE_CHECK_WEBHOOK_URL }}
      - name: Upload artifacts
        uses: actions/upload-artifact@master
        with:
          name: Lighthouse reports
          path: /tmp/artifacts

Overriding Config and Option Defaults

You can override the default config and options by specifying the overridesJsonFile option. This overrides JSON file can contain two fields: options and config. These fields are used by Lighthouse to populate its opts and config arguments respectively, as illustrated in Using programmatically. The two objects in this JSON file are merged shallowly with the default config and options.

Example content of overridesJsonFile

{
  "config": {
    "settings": {
      "onlyCategories": ["performance"]
    }
  },
  "options": {
    "chromeFlags": [
      "--disable-dev-shm-usage"
    ]
  }
}
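Because the merge is shallow, a top-level field in the overrides file replaces the corresponding default object wholesale rather than being deep-merged into it. A minimal sketch of that behavior, with illustrative (not actual) default values:

```javascript
// Illustrative sketch of what a shallow merge implies. The default values
// here are made up for the example -- they are not the library's actual defaults.
const defaultConfig = {
  extends: 'lighthouse:default',
  settings: { onlyCategories: ['performance', 'accessibility', 'seo'] }
};

// Overrides as loaded from overridesJsonFile (the "config" field above).
const overrides = {
  settings: { onlyCategories: ['performance'] }
};

// A shallow merge replaces each top-level key wholesale, so the entire
// `settings` object is swapped out rather than deep-merged.
const mergedConfig = { ...defaultConfig, ...overrides };

console.log(mergedConfig.settings.onlyCategories); // [ 'performance' ]
console.log(mergedConfig.extends); // 'lighthouse:default'
```

In practice this means that overriding a nested value like settings.onlyCategories requires restating any other settings values you still want.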

CLI

Running lighthouse-check in the example below will run Lighthouse audits against https://www.foo.software and https://www.foo.software/contact and output a report in the '/tmp/artifacts' directory.

Format is --option <argument>. Example below.

$ lighthouse-check --urls "https://www.foo.software,https://www.foo.software/contact" \
  --outputDirectory /tmp/artifacts

lighthouse-check-status example

$ lighthouse-check-status --outputDirectory /tmp/artifacts \
  --minAccessibilityScore 90 \
  --minBestPracticesScore 90 \
  --minPerformanceScore 70 \
  --minProgressiveWebAppScore 70 \
  --minSeoScore 80

CLI Options

All options mirror the NPM module. The only difference is that array options like urls are passed as a comma-separated string when using the CLI.
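The comma-separated string maps to the module's array option roughly as follows; this is a conceptual sketch, not the library's actual argument parsing:

```javascript
// Conceptual sketch: how a comma-separated CLI `--urls` value corresponds to
// the array option the NPM module accepts. Not the library's actual parser.
const urlsArgument = 'https://www.foo.software,https://www.foo.software/contact';

// Split on commas and trim stray whitespace around each URL.
const urls = urlsArgument.split(',').map((url) => url.trim());

console.log(urls);
// [ 'https://www.foo.software', 'https://www.foo.software/contact' ]
```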

Docker

$ docker pull foosoftware/lighthouse-check:latest
$ docker run foosoftware/lighthouse-check:latest \
  lighthouse-check --verbose \
  --urls "https://www.foo.software,https://www.foo.software/contact"

Options

lighthouse-check functions accept a single configuration object.

lighthouseCheck

You can choose from two ways of running audits: locally in your own environment, or remotely via the Automated Lighthouse Check API. Think of local runs as the default implementation. For directions on running remotely, see the Automated Lighthouse Check API Usage section. The Run Type value of each option denotes whether it applies to local runs, remote runs, or both.

Below are options for the exported lighthouseCheck function or lighthouse-check command with CLI.

validateStatus

Either the results parameter or the outputDirectory parameter is required. To use outputDirectory, the same value must also be specified when calling lighthouseCheck.
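Conceptually, the status check compares each audited URL's category scores against the configured minimums and fails when any falls short. Below is a simplified, self-contained sketch of that idea, using an assumed results shape (the real payload returned by lighthouseCheck differs):

```javascript
// Simplified sketch of minimum-score validation. `results` here uses an
// assumed shape ({ url, scores }) -- the real lighthouseCheck payload differs.
const checkMinimumScores = (results, minimums) => {
  const failures = [];
  for (const { url, scores } of results) {
    for (const [category, minimum] of Object.entries(minimums)) {
      if (scores[category] < minimum) {
        failures.push(
          `${url}: ${category}: minimum score: ${minimum}, actual score: ${scores[category]}`
        );
      }
    }
  }
  if (failures.length) {
    throw new Error(`Minimum score requirements failed:\n${failures.join('\n')}`);
  }
  return true;
};

// Example: one URL misses its performance minimum, so the check throws.
try {
  checkMinimumScores(
    [{ url: 'https://www.foo.software', scores: { performance: 64, seo: 85 } }],
    { performance: 70, seo: 80 }
  );
} catch (error) {
  console.log(error.message);
}
```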

Return Payload

The lighthouseCheck function returns a promise that either resolves with a results object or rejects with an error object. In both cases the payload has the same shape, documented below.

Credits

This package was brought to you by Foo, a website performance monitoring tool. Create a free account for standard performance testing: automatic website performance testing, uptime checks, and charts showing performance metrics by day, month, and year. Foo also provides real-time notifications when performance or uptime changes are detected, and users can integrate email, Slack, and PagerDuty notifications.