
to-mixpanel v1.1.34

ETL for data into Mixpanel from many sources

Readme

toMixpanel

wat.

toMixpanel is an ETL script for Node.js that performs one-time data migrations from common product analytics tools... to Mixpanel.

It implements Mixpanel's /import, $merge, and /engage endpoints.

It authenticates with service accounts, and can batch-import millions of events and user profiles quickly.

This script is meant to be run locally and requires a JSON file for configuration.
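The batch import described above can be sketched as follows (a hypothetical helper, not the script's actual code): split a large array of records into fixed-size chunks, matching the recordsPerBatch option, before POSTing each chunk to Mixpanel's /import endpoint with service-account Basic auth.

```javascript
// Split records into batches of recordsPerBatch (default 2000, as in
// the config example below). Each batch would then be sent to
// https://api.mixpanel.com/import?project_id=... with
// Authorization: Basic base64(service_account_user + ":" + service_account_pass)
function toBatches(records, recordsPerBatch = 2000) {
  const batches = [];
  for (let i = 0; i < records.length; i += recordsPerBatch) {
    batches.push(records.slice(i, i + recordsPerBatch));
  }
  return batches;
}

// 5000 events split at the default batch size: 2000 + 2000 + 1000
const batches = toBatches(new Array(5000).fill({ event: "page_view" }));
```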

tl;dr

git clone https://github.com/ak--47/toMixpanel.git

cd toMixpanel/

npm install

node index.js ./path-To-JSON-config

alternatively:

npx to-mixpanel ./path-To-JSON-config

Detailed Instructions

Install Dependencies

This script uses npm to manage dependencies, similar to a web application.

After cloning the repo, cd into the toMixpanel/ directory and run:

npm install

This only needs to be done once.

Config File

toMixpanel requires credentials for your source and your destination.

Here's an example of a configuration file for an amplitude => mixpanel migration:

{
  "source": {
    "name": "amplitude",
    "params": {
      "api_key": "{{amplitude api key}}",
      "api_secret": "{{ amplitude api secret }}",
      "start_date": "2021-09-17",
      "end_date": "2021-09-17"
    },
    "options": {
      "save_local_copy": true,
      "is EU?": false
    }
  },
  "destination": {
    "name": "mixpanel",
    "project_id": "{{ project id }}",
    "token": "{{ project token }}",
    "service_account_user": "{{ mp service account }}",
    "service_account_pass": "{{ mp service secret }}",
    "options": {
      "is EU?": false,
      "recordsPerBatch": 2000
    }
  }
}

You can find more configuration examples in the repo.

Supported Sources

Amplitude

required params: api_key, api_secret, start_date, end_date, is EU?

Mixpanel

That's right! You can use toMixpanel to migrate one Mixpanel project to another.

required params: token, secret, start_date, end_date, is EU?, do_events, do_people

options: where (see docs), event (see docs), recordsPerBatch (in destination)
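Given the required params above, a Mixpanel-to-Mixpanel config might look like the following sketch. The key placement mirrors the Amplitude example earlier; the exact location of "is EU?" and the option names should be checked against the examples in the repo:

```json
{
  "source": {
    "name": "mixpanel",
    "params": {
      "token": "{{ source project token }}",
      "secret": "{{ source project secret }}",
      "start_date": "2021-09-17",
      "end_date": "2021-09-17",
      "do_events": true,
      "do_people": true
    },
    "options": {
      "is EU?": false
    }
  },
  "destination": {
    "name": "mixpanel",
    "project_id": "{{ destination project id }}",
    "token": "{{ destination project token }}",
    "service_account_user": "{{ mp service account }}",
    "service_account_pass": "{{ mp service secret }}",
    "options": {
      "is EU?": false,
      "recordsPerBatch": 2000
    }
  }
}
```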

CSV

required params: filePath, event_name_col, distinct_id_col, time_col, insert_id_col (note: filePath can be EITHER a path to a single CSV file or a folder containing multiple CSV files)
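A CSV source block might look like this sketch (the source name string and all column values here are hypothetical; pair it with the same Mixpanel destination block shown in the Amplitude example, and check the repo's examples for the exact shape):

```json
{
  "source": {
    "name": "csv",
    "params": {
      "filePath": "./data/events.csv",
      "event_name_col": "event",
      "distinct_id_col": "user_id",
      "time_col": "timestamp",
      "insert_id_col": "row_id"
    }
  }
}
```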

Google Analytics

required params: project_id, bucket_name, private_key_id, private_key, client_email, client_id, auth_uri, token_uri, auth_provider_x509_cert_url, client_x509_cert_url

options: path_to_data (for large datasets, does line-by-line iteration)

*note: Google Analytics does not have a public /export API, so you'll need to export your data to BigQuery first, and then export your BigQuery tables to Google Cloud Storage as JSON. You can then create a service account in Google Cloud that can access the bucket; the values listed above are given to you when you create that service account.