
@jrmdayn/googleapis-batcher
v0.8.0

A library for batching Google APIs requests in Node.js

Downloads: 7,463

Readme

Batching library for Google APIs Node.js Client

Node.js library for batching multiple requests made with the official Google APIs Node.js client

Getting started

First, install the library using yarn/npm/pnpm:

yarn add @jrmdayn/googleapis-batcher
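
The equivalent commands with npm or pnpm:

npm install @jrmdayn/googleapis-batcher
pnpm add @jrmdayn/googleapis-batcher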

Then create a batchFetchImplementation and pass it to the API client:

import { google } from 'googleapis'
import { batchFetchImplementation } from '@jrmdayn/googleapis-batcher'

const fetchImpl = batchFetchImplementation()

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl,
})

// The 3 requests will be batched together
const [list, get, patch] = await Promise.all([
    calendarClient.events.list({ calendarId: '[email protected]' }),
    calendarClient.events.get({
      calendarId: '[email protected]',
      eventId: 'xyz123'
    }),
    calendarClient.events.patch({
      calendarId: '[email protected]',
      eventId: 'xyz456',
      requestBody: { colorId: '1' }
    })
  ])

Options

maxBatchSize

Controls the maximum number of requests to batch together in one HTTP request.

// limit the number of batched requests to 50
const fetchImpl = batchFetchImplementation({ maxBatchSize: 50 })

Note: Google limits the number of batched requests on a per-API basis. For example, the Calendar API allows 50 requests per batch and the People API allows 1000.
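
For an API with a higher cap, such as the People API mentioned above, you would raise the limit accordingly (the variable name here is just illustrative):

const peopleFetchImpl = batchFetchImplementation({ maxBatchSize: 1000 })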

batchWindowMs

Controls the size of the time window (in milliseconds) used to batch requests together. By default, all requests made in the same tick are batched together. See the Dataloader documentation for more on this.

// batch all requests made in a 30ms window
const fetchImpl = batchFetchImplementation({ batchWindowMs: 30 })

signal

Defines a user-controlled signal that is used to manually trigger a batch request.

import { batchFetchImplementation, makeBatchSchedulerSignal } from '@jrmdayn/googleapis-batcher'

const signal = makeBatchSchedulerSignal()
const fetchImpl = batchFetchImplementation({ signal })

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl,
})

// These requests are collected but not dispatched yet
const pList = calendarClient.events.list({ calendarId: '[email protected]' })
const pGet = calendarClient.events.get({
  calendarId: '[email protected]',
  eventId: 'xyz123'
})
const pPatch = calendarClient.events.patch({
  calendarId: '[email protected]',
  eventId: 'xyz456',
  requestBody: { colorId: '1' }
})

// ...

// Dispatch everything collected so far as one batch request
signal.schedule()
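
Once the signal is scheduled, the batch is sent and the pending promises resolve, so you can await them as usual, e.g.:

const [list, get, patch] = await Promise.all([pList, pGet, pPatch])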

Known limitations

The max batch size varies per Google API. For example, it is set to 50 for the Calendar API and to 1000 for the People API. Check the docs of the API you are using and configure maxBatchSize accordingly.

Batching is homogeneous, so you cannot batch Calendar API and People API requests together. Instead, you must make 2 separate batch calls, as there are 2 separate batching endpoints. Concretely, this means you should always provide a fetchImplementation at the client API level, not at the global Google options level:

const fetchImpl = batchFetchImplementation()

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl,
})

const peopleClient = google.people({
  version: 'v1',
  fetchImplementation: fetchImpl,
})

// This will raise an error
await Promise.all([
    calendarClient.events.list(),
    peopleClient.people.get()
  ])

Do this instead:

const fetchImpl1 = batchFetchImplementation()
const fetchImpl2 = batchFetchImplementation()

const calendarClient = google.calendar({
  version: 'v3',
  fetchImplementation: fetchImpl1,
})

const peopleClient = google.people({
  version: 'v1',
  fetchImplementation: fetchImpl2,
})

await Promise.all([
    calendarClient.events.list(),
    peopleClient.people.get()
  ])

Motivation

On August 12, 2020, Google deprecated its global batching endpoints (blog article here). Going forward, it is recommended to use API-specific batch endpoints for batching homogeneous requests together. Unfortunately, the official Google APIs Node.js client does not support batching requests out of the box. The task of composing a batched request and parsing the batch response is left to the developer.

Here is a link to the official guide for doing so with the Calendar API. As you can see, the task consists of handcrafting a multipart/mixed HTTP request composed of multiple raw HTTP requests (one part per request), and then parsing a multipart/mixed response body composed of multiple raw HTTP responses (one part per response).

At this point, I see at least two reasons why developers would not batch Google APIs requests:

  1. There is no easy way to generate the individual raw HTTP requests (URL + headers + JSON body) from the official Node.js client. The only alternative is to read the developer docs and craft each request by hand.
  2. Handcrafting and parsing multipart/mixed HTTP requests and responses is daunting and error-prone.

Solution

I decided to write this library when I first encountered the need for batching Google APIs requests in Node.js, so that other developers would not have to face the task of writing and parsing multipart/mixed HTTP requests. The key to the solution is providing your own fetch implementation to the API client you are using. Google exposes a fetchImplementation parameter in the client options (probably for testing purposes) that we can override to intercept requests and group them together. For grouping requests, we use Dataloader, which can be configured to batch all requests made in one tick, within a certain time window, or until an external signal is fired.
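
To illustrate the idea, here is a minimal, simplified sketch of how a batching fetch implementation can be built on top of Dataloader. This is not the library's actual code: sendBatch is a hypothetical helper standing in for the multipart/mixed encoding and decoding logic, and the types are reduced to the essentials.

import DataLoader from 'dataloader'

type BatchedRequest = { url: string; init?: RequestInit }

// Hypothetical helper: encodes the collected requests into a single
// multipart/mixed call to the API's batch endpoint and returns one
// Response per original request, in order.
declare function sendBatch(requests: readonly BatchedRequest[]): Promise<Response[]>

// Dataloader collects every load() made in the same tick (or time window,
// or until a signal fires) and hands them to the batch function at once.
const loader = new DataLoader<BatchedRequest, Response>(
  (requests) => sendBatch(requests),
  { cache: false } // every request must actually hit the network
)

// Drop-in fetch replacement: callers never know their request was batched.
const batchingFetch = (url: string, init?: RequestInit): Promise<Response> =>
  loader.load({ url, init })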

From a developer's point of view, you do not need to worry about handcrafting the individual raw HTTP requests. You simply use the official Google APIs Node.js client as normal, and the fetch implementation automatically batches the requests for you.