
connectif-2-gc-storage · v0.4.0 · Published

CLI to automate Connectif data export uploading to Google Cloud Storage

Downloads: 10

Readme

connectif-2-gc-storage

A CLI that makes it extremely easy to automate exporting data from the Connectif Marketing Automation Platform and uploading it to Google Cloud Storage.

Installation

Install the Node.js runtime.

Now, from your favourite shell, install the CLI by typing the following command:

$ npm install -g connectif-2-gc-storage
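
Once installed, you can check that the CLI is available on your PATH by asking for its version (the -V, --version flag is listed in the help output below); it should print the installed version number:

$ connectif-2-gc-storage --version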

Prerequisites

Before running the CLI, make sure you have all the credentials in place to access the Connectif API and Google Cloud Platform (a shell sketch for wiring them into the examples below follows this list):

  • Connectif Api Key: get a Connectif Api Key with permission to read and write exports by following the instructions here: https://api-docs.connectif.cloud/connectif-api/guides/authentication.
  • Google Cloud Credentials: get a credentials file from the Google Cloud Console with permission to write to your Google Cloud Storage account (see instructions here: https://cloud.google.com/docs/authentication/getting-started).
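
A minimal sketch, assuming you saved the Google Cloud key file as ./key.json and using placeholder values for the shell variables referenced in the examples below:

$ export BUCKET_NAME=my-gcs-bucket          # target Google Cloud Storage bucket (placeholder)
$ export CONNECTIF_API_KEY=xxxxxxxxxxxxxxxx # Connectif Api Key with export:read and export:write scopes (placeholder)
$ ls ./key.json                             # Google Cloud service account key file downloaded from the Cloud Console
key.json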

Usage

The usage documentation can be found by running the CLI with the help flag:

$ connectif-2-gc-storage --help

Output:

Usage: connectif-2-gc-storage [options] [command]

CLI to automate Connectif data export uploading to Google Cloud Storage

Options:
  -V, --version                   output the version number
  -h, --help                      display help for command

Commands:
  export-activities [options]     export contacts activities.
  export-contacts [options]       export contacts.
  export-data-explorer [options]  export data explorer reports.
  help [command]                  display help for command

To get documentation for each command, use help, e.g.:

$ connectif-2-gc-storage help export-activities

Output:

Usage: connectif-2-gc-storage export-activities [options]

export contacts activities.

Options:
  -k, --gcKeyFileName <path>            Path to a .json, .pem, or .p12 Google Cloud key file (required).
  -b, --gcBucketName <name>             Google Cloud Storage bucket name (required).
  -a, --connectifApiKey <apiKey>        Connectif Api Key. export:read and export:write scopes are required (required).
  -f, --fromDate <fromDate>             filter activities export created after a given date (required).
  -t, --toDate <toDate>                 filter activities export created before a given date (required).
  -s, --segmentId <segmentId>           filter the activities export of contacts in a given segment.
  -y, --activityTypes <activityTypes>   filter the activities export by activity types separated by comma (i.e.: purchase,login,register).
  -h, --help                            display help for command

Docker

If you want to run the CLI using Docker, you can do so with the following commands:

The command below will print the version of the CLI:

docker run --rm francescorivola/connectif-2-gc-storage:latest

The command below will run the CLI with the given options:

docker run --rm -v $(pwd)/key.json:/home/node/key.json francescorivola/connectif-2-gc-storage:latest \
  export-activities \
  --gcKeyFileName=./key.json \
  --gcBucketName=$BUCKET_NAME \
  --connectifApiKey=$CONNECTIF_API_KEY \
  --fromDate=$FROM_DATE \
  --toDate=$TO_DATE
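
If you want exports to run on a schedule, one possible approach (not part of the project itself, just a sketch assuming GNU date, a cron daemon and a hypothetical /opt/connectif/export-last-month.sh wrapper script) is to drive the Docker command from cron:

# crontab entry: run the wrapper at 03:00 on the first day of every month
0 3 1 * * /opt/connectif/export-last-month.sh

The wrapper itself could look like this (bucket name, Api Key and key file path are placeholders):

#!/bin/sh
# /opt/connectif/export-last-month.sh: export the previous month's contact activities
BUCKET_NAME=my-gcs-bucket
CONNECTIF_API_KEY=xxxxxxxxxxxxxxxx
# boundaries of the previous month (local midnight), printed as UTC timestamps (GNU date)
FROM_DATE=$(date -u -d "$(date +%Y-%m-01) -1 month" +%Y-%m-%dT%H:%M:%S.000Z)
TO_DATE=$(date -u -d "$(date +%Y-%m-01)" +%Y-%m-%dT%H:%M:%S.000Z)
docker run --rm -v /opt/connectif/key.json:/home/node/key.json francescorivola/connectif-2-gc-storage:latest \
  export-activities \
  --gcKeyFileName=./key.json \
  --gcBucketName=$BUCKET_NAME \
  --connectifApiKey=$CONNECTIF_API_KEY \
  --fromDate=$FROM_DATE \
  --toDate=$TO_DATE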

Use Case

A use case for this tool is to automate data analysis with Google Data Studio. Once the CSV files are exported to Google Cloud Storage, we can use them to feed Google Data Studio through its Google Cloud Storage connector.

First Export/Import

We run the CLI a first time to export the Connectif contact activities for March.

$ connectif-2-gc-storage export-activities \
  --gcKeyFileName ./key.json \
  --gcBucketName $BUCKET_NAME \
  --connectifApiKey $CONNECTIF_API_KEY \
  --fromDate 2020-02-28T23:00:00.000Z \
  --toDate 2020-03-31T22:00:00.000Z

[Screenshot: CLI run]

Once done, we can check the result and see that we have a CSV file under the export-activities folder in our bucket.
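
Besides the Cloud Console, a quick way to verify this from the shell (assuming the Google Cloud SDK and its gsutil tool are installed and authenticated) is to list the folder:

$ gsutil ls gs://$BUCKET_NAME/export-activities/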

[Screenshot: export result in Google Cloud Storage]

Connect GCS to Google Data Studio

First of all we add a new Google Cloud Storage Data Source.

[Screenshot: adding the Google Cloud Storage connector in Data Studio]

Then we configure the connector by checking the Use all files in path checkbox and entering the path BUCKET_NAME/export-activities. Finally, we click the Connect button.

[Screenshot: Google Cloud Storage connector configuration in Data Studio]

We configure our fields (for this example we just take the fields as they come from the CSV).

[Screenshot: Data Studio field configuration]

We start creating reports :).

[Screenshot: Data Studio report]

Second Export/Import

Now that we have connected Google Cloud Storage to Google Data Studio and created our reports, we can run the CLI to add more data. This time let's export the contact activities for April.

$ connectif-2-gc-storage export-activities \
  --gcKeyFileName ./key.json \
  --gcBucketName $BUCKET_NAME \
  --connectifApiKey $CONNECTIF_API_KEY \
  --fromDate 2020-03-31T22:00:00.000Z \
  --toDate 2020-04-30T22:00:00.000Z
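
The fromDate and toDate above are simply April 2020 expressed in UTC for a Central European local time. As a sketch, assuming GNU date is available (Europe/Madrid used here as an example timezone), such boundaries can be computed like this:

$ date -u -d 'TZ="Europe/Madrid" 2020-04-01 00:00' +%Y-%m-%dT%H:%M:%S.000Z
2020-03-31T22:00:00.000Z
$ date -u -d 'TZ="Europe/Madrid" 2020-05-01 00:00' +%Y-%m-%dT%H:%M:%S.000Z
2020-04-30T22:00:00.000Z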

[Screenshot: second CLI run]

If we check the Google Cloud Storage Browser, we will see two CSV files under our export-activities folder: the previous one and the newly imported one.

[Screenshot: export results in Google Cloud Storage after the second run]

Now, let's go back to our report in Google Data Studio and click the Refresh button; the report will automatically update with the April data :).

[Screenshot: updated Data Studio report]

Happy Data Analysis!!

License

MIT