
@fusebit/apimeter

v1.0.0

Simple API metering for Express apps using BigQuery

This project contains an Express middleware that implements a simple HTTP API metering solution for Node.js apps based on Google Cloud's BigQuery. Read more about the background of the project in the API Metering and Analytics for Early Stage Startups blog post.

If you have an Express app you want to instrument, you can do it in under 10 minutes. Let's go!

Getting started

First, set up a BigQuery table in Google Cloud using the instructions here.

Next, create a Google Cloud service account with permissions to the BigQuery table, add API keys to that account, and export them to a JSON file using the instructions here. Set the environment variable to point to the JSON file with those credentials:

export GOOGLE_APPLICATION_CREDENTIALS={full-path-to-credentials-file}

Next, build the module and run the sample HTTP API server:

git clone [email protected]:fusebit/apimeter.git
cd apimeter
npm install
npm run build
npm run sample

NOTE If you selected a Google Cloud project name, dataset name, or table name different from the ones in the setup instructions, make sure to update the respective names in sample/server.js before running the sample server.

Lastly, issue some test requests to the sample HTTP API server:

curl http://localhost:3000/api/cat
curl http://localhost:3000/api/cat/123
curl http://localhost:3000/api/cat -X POST
curl http://localhost:3000/api/cat/456 -X PUT

Finally, head over to the Google Cloud Console for BigQuery and query the metering results.

Once your metering data is in BigQuery, it is easy to create reports and visualizations over it using Google's free Data Studio.
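You can also query the metering data programmatically from Node.js. The sketch below uses the @google-cloud/bigquery client and assumes the default apimeter project, dwh dataset, and apicalls table names from the setup instructions, that GOOGLE_APPLICATION_CREDENTIALS is set, and that the table has a method column; adjust the names to match your own schema:

```javascript
// Illustrative sketch: count metered API calls per HTTP method.
// Assumes the default apimeter/dwh/apicalls names and a `method` column.
const { BigQuery } = require("@google-cloud/bigquery");

async function main() {
  const bigquery = new BigQuery({ projectId: "apimeter" });
  const [rows] = await bigquery.query({
    query: `SELECT method, COUNT(*) AS calls
            FROM \`apimeter.dwh.apicalls\`
            GROUP BY method
            ORDER BY calls DESC`,
  });
  rows.forEach((row) => console.log(row.method, row.calls));
}

main().catch(console.error);
```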

Default configuration

To use the default configuration, add apimeter as the global middleware for all routes of your app:

const app = require("express")();
const { apimeter } = require("@fusebit/apimeter");

app.use(apimeter());

The default configuration:

  1. Logs metering data to the apimeter Google Cloud project, dwh dataset, and apicalls BigQuery table.
  2. Collects events in-memory and uploads them to BigQuery in batches when 500 records accumulate or every 5 seconds, whichever comes first.
  3. Reads service account credentials from the JSON file pointed to by the GOOGLE_APPLICATION_CREDENTIALS environment variable.
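Putting the defaults together, a minimal instrumented server might look like this (a sketch, assuming the default BigQuery setup from the instructions above and GOOGLE_APPLICATION_CREDENTIALS exported; the /api/cat route is only an example):

```javascript
// Sketch of a minimal Express app metered with the apimeter defaults.
const app = require("express")();
const { apimeter } = require("@fusebit/apimeter");

// Meter every route; records are batched and uploaded to BigQuery.
app.use(apimeter());

// Example route; every request to it generates a metering record.
app.get("/api/cat", (req, res) => res.json({ ok: true }));

app.listen(3000, () => console.log("Listening on :3000"));
```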

Custom Google Cloud projectId, BigQuery dataset, and table

You can customize the Google Cloud projectId and BigQuery dataset and table names where records will be sent:

app.use(apimeter({
  projectId: "apimeter",
  dataset: "dwh",
  table: "apicalls",
}));

Credentials

You can provide Google Cloud service account credentials programmatically instead of the GOOGLE_APPLICATION_CREDENTIALS environment variable:

const credentials = require("./credentials.json");

app.use(apimeter({
  credentials
}));

Batch size and flush frequency

You can adjust the maximum batch size or the frequency of uploads to BigQuery:

app.use(apimeter({
  maxBatchSize: 500,
  flushIntervalMilliseconds: 5000,
}));
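To illustrate what these two settings mean, here is a self-contained sketch of the batch-and-flush policy (this is an illustration of the semantics, not the library's actual internals): records accumulate in memory and are flushed as soon as maxBatchSize is reached, or when the flush interval elapses, whichever comes first.

```javascript
// Illustrative sketch of batch-and-flush semantics (not library internals).
class Batcher {
  constructor({ maxBatchSize, flushIntervalMilliseconds, onFlush }) {
    this.maxBatchSize = maxBatchSize;
    this.flushIntervalMilliseconds = flushIntervalMilliseconds;
    this.onFlush = onFlush;
    this.records = [];
    this.timer = null;
  }
  add(record) {
    this.records.push(record);
    if (this.records.length >= this.maxBatchSize) {
      // Batch is full: flush immediately.
      this.flush();
    } else if (!this.timer) {
      // Otherwise make sure a timed flush is pending.
      this.timer = setTimeout(() => this.flush(), this.flushIntervalMilliseconds);
    }
  }
  flush() {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.records.length === 0) return;
    const batch = this.records;
    this.records = [];
    this.onFlush(batch);
  }
}

// Example: with maxBatchSize of 3, the third record triggers a flush.
const flushed = [];
const b = new Batcher({
  maxBatchSize: 3,
  flushIntervalMilliseconds: 5000,
  onFlush: (batch) => flushed.push(batch),
});
b.add({ path: "/api/cat" });
b.add({ path: "/api/cat/123" });
b.add({ path: "/api/cat/456" }); // third record fills the batch
```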

Customize metering data

You may choose to capture additional information from the request or response to store in the BigQuery table. For example, you could capture the response status code or the request processing time using this mechanism.

NOTE Every field you return must have a corresponding column in the BigQuery table.

const { apimeter, defaultGetApiMeterRecord } = require("@fusebit/apimeter");

app.use(apimeter({
  getApiMeterRecord: (req, res) => ({
    ...defaultGetApiMeterRecord(req, res),
    issuer: req.user.jwt.iss,
    subject: req.user.jwt.sub,
    status: res.statusCode,
  })
}));
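For instance, request processing time can be captured by stamping each request with a start time in an earlier middleware. This is a sketch: durationMs and _meterStart are hypothetical names, and the durationMs column would need to exist in your BigQuery table:

```javascript
// Sketch: record per-request processing time in a hypothetical
// `durationMs` column alongside the default metering fields.
const { apimeter, defaultGetApiMeterRecord } = require("@fusebit/apimeter");

// Stamp each request with a high-resolution start time.
app.use((req, res, next) => {
  req._meterStart = process.hrtime.bigint();
  next();
});

app.use(apimeter({
  getApiMeterRecord: (req, res) => ({
    ...defaultGetApiMeterRecord(req, res),
    // Elapsed nanoseconds converted to milliseconds.
    durationMs: Number(process.hrtime.bigint() - req._meterStart) / 1e6,
  }),
}));
```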

Flush callback

You can register a callback that is invoked whenever an upload to BigQuery finishes, either successfully or with an error. This is useful for logging.

app.use(apimeter({
  onFlush: (records, error) => {
    console.log('BigQuery upload finished', records.length, error);
  }
}));