
@elbwalker/destination-node-bigquery v2.1.1

BigQuery node destination for walkerOS

Readme

BigQuery destination for walkerOS

Why you need this: Streamline your data collection and analysis by sending walkerOS events directly to Google BigQuery. This destination makes it easy to integrate your data pipeline with one of the most powerful data warehouses available.

Usage

The BigQuery destination allows you to send server-side walkerOS events to Google BigQuery. It handles the data transformation and ensures that your events are correctly formatted.

Basic example

Follow the setup steps first.

Install the package

npm i @elbwalker/destination-node-bigquery

Add and configure the BigQuery destination:

import { destinationBigQuery } from '@elbwalker/destination-node-bigquery';

elb('walker destination', destinationBigQuery, {
  custom: {
    projectId: 'PR0J3CT1D', // Required
    // client?: BigQuery; // A BigQuery instance from @google-cloud/bigquery
    // datasetId?: string; // 'walkerOS' by default
    // tableId?: string; // 'events' by default
    // location?: string; // 'EU' by default
    // bigquery?: BigQueryOptions; // BigQueryOptions from @google-cloud/bigquery
  },
});

Learn more about how to authenticate with a service account key file using the custom bigquery option.
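A minimal sketch of such a configuration, assuming a service account key file on disk. The `keyFilename` field is part of `BigQueryOptions` in @google-cloud/bigquery; the file path below is illustrative:

```javascript
// Sketch: passing BigQueryOptions through the custom `bigquery` option.
// `keyFilename` points to a service account key file; the path is illustrative.
const destinationConfig = {
  custom: {
    projectId: 'PR0J3CT1D',
    bigquery: {
      keyFilename: '/secrets/service-account-key.json',
    },
  },
};
```

You would then pass this object as the configuration: `elb('walker destination', destinationBigQuery, destinationConfig);`.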

Setup

The destination requires an existing dataset and table to ingest data into. Replace PR0J3CT1D.walkerOS.events with your actual project ID, dataset and table names. Adjust the options if necessary, and run the query to create it.

CREATE TABLE `PR0J3CT1D.walkerOS.events` (
  timestamp TIMESTAMP NOT NULL,
  event STRING NOT NULL,
  data JSON,
  context JSON,
  globals JSON,
  custom JSON,
  user JSON,
  nested JSON,
  consent JSON,
  id STRING,
  trigger STRING,
  entity STRING,
  action STRING,
  timing NUMERIC,
  `group` STRING,
  count NUMERIC,
  version JSON,
  source JSON,
  createdAt TIMESTAMP NOT NULL
)
PARTITION BY DATE(timestamp)
OPTIONS (
  description="walkerOS raw events",
  partition_expiration_days=365, -- Automatically delete data older than 1 year
  require_partition_filter=true -- Enforce the use of partition filter in queries
);
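For reference, a server-side walkerOS event that the destination writes into the columns above might look roughly like this. The field names mirror the table schema; all values are made up for illustration:

```javascript
// Illustrative event shape matching the table columns above; values are made up.
const event = {
  timestamp: '2024-01-01T00:00:00.000Z', // TIMESTAMP NOT NULL
  event: 'page view',                    // STRING NOT NULL: "entity action"
  data: { title: 'Home' },               // nested objects land in the JSON columns
  context: {},
  globals: {},
  custom: {},
  user: { device: 'c0okie1d' },
  nested: [],
  consent: { functional: true },
  id: '1704067200000-abc12-1',
  trigger: 'load',
  entity: 'page',
  action: 'view',
  timing: 0,
  group: 'abc12',
  count: 1,
  version: { client: '2.1.1' },
  source: { type: 'node' },
  createdAt: '2024-01-01T00:00:00.000Z', // TIMESTAMP NOT NULL
};
```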

Note: If you also need to create a new dataset, consider enabling the physical storage billing model to reduce your BigQuery storage costs. Depending on your events, a compression factor of about 6 is possible, but this may result in higher querying costs.

Permissions

When using service accounts (SAs) for Google Cloud BigQuery, it's recommended to follow the principle of least privilege: never grant more permissions than the account needs to perform its intended functions.

Assign explicit permissions directly to datasets within BigQuery (using the share option). This ensures that the service account only has access to what is necessary for operation.

For more detailed information, refer to the official Google Cloud IAM documentation.

Who this package is for

This destination is ideal for data engineers and analysts who are already using Google BigQuery or plan to integrate it into their data stack. It's also useful for companies looking to centralize their data collection and analysis efforts.

Dependencies

Before using the BigQuery destination, ensure you have:

  • walkerOS node client
  • Google Cloud Platform account
  • BigQuery dataset and table
  • GCP service account with permissions to write to the events table