
shane-sfdx-plugins

v4.43.0 | sfdx plugins by Shane McLaughlin | Downloads: 547

install

sfdx plugins:install shane-sfdx-plugins

You'll be prompted that this, like any plugin, is not officially code-signed by Salesforce. If that's annoying, you can whitelist it (one way to do that is sketched below).
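A hedged sketch of whitelisting: the Salesforce CLI reads an allowlist file from its config directory, but the exact file name and location have varied across CLI versions (older releases used unsignedPluginWhiteList.json), so treat the path below as an assumption and check your CLI's docs.

  # create (or extend) the unsigned-plugin allowlist in the sfdx config directory
  # path assumed for macOS/Linux; on Windows it lives under %LOCALAPPDATA%\sfdx
  echo '[ "shane-sfdx-plugins" ]' > ~/.config/sfdx/unsignedPluginAllowList.json

  # installs and updates of this plugin will now skip the code-signing prompt
  sfdx plugins:install shane-sfdx-plugins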

docs

what all is in here and how does it work?

install, then run sfdx shane -h

but you like README, you say? Good thing oclif auto-generates all this for me. :)
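Help works at the topic level as well as the command level, so you can drill down from there; a usage sketch (pick any topic or command from the Commands list below):

  sfdx shane -h                      # list every topic and command in this plugin
  sfdx shane:ai -h                   # list just the commands under the ai topic
  sfdx shane:ai:dataset:upload -h    # full help for a single command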

Contribute

Way, way down at the bottom.

More plugins

check out this repo for other people's plugins (and share via PR if you've created one)

Usage

$ npm install -g shane-sfdx-plugins
$ sfdx COMMAND
running command...
$ sfdx (-v|--version|version)
shane-sfdx-plugins/4.43.0 darwin-x64 node-v12.14.0
$ sfdx --help [COMMAND]
USAGE
  $ sfdx COMMAND
...

Commands

sfdx data:sosl:query -q <string> [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Runs a SOSL query. SOSL reference: https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_sosl_syntax.htm

USAGE
  $ sfdx data:sosl:query -q <string> [-u <string>] [--apiversion <string>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -q, --query=query                                                                 (required) SOSL query

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

ALIASES
  $ sfdx shane:data:sosl:query
  $ sfdx force:data:sosl:query
  $ sfdx shane:data:search
  $ sfdx force:data:search
  $ sfdx force:data:sosl
  $ sfdx shane:data:sosl

EXAMPLES
  sfdx force:data:sosl:query -q "find {something}"

  sfdx force:data:sosl:query -q "find {Jack} returning User(Name), Account(Name),Contact(FirstName,LastName,Department)"
  -u platformers
  // search across several objects with different results fields on a specified org

See code: @mshanemc/sfdx-sosl

sfdx shane:ai:auth [-e <email>] [-f <filepath>] [-t <integer>] [-l <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

get an access token from an email and a .pem file, either passed in or from environment variables

USAGE
  $ sfdx shane:ai:auth [-e <email>] [-f <filepath>] [-t <integer>] [-l <string>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -e, --email=email                                                                 email address you used when you
                                                                                    signed up for your einstein.ai
                                                                                    account

  -f, --certfile=certfile                                                           path to your private key from when
                                                                                    you signed up

  -l, --level=local|global                                                          [default: local] where to store this
                                                                                    config

  -t, --tokentime=tokentime                                                         [default: 1440] time in minutes that
                                                                                    you want your token to be valid for

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

EXAMPLE
  sfdx shane:ai:auth -e [email protected] -f ~/code/certs/einstein_platform.pem
       // reauths, and takes what it can get

See code: src/commands/shane/ai/auth.ts
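The description mentions that the email and .pem file can come from environment variables instead of flags. The email variable name appears later in this README (EINSTEIN_EMAIL, under shane:ai:playground:setup); the variable for the cert path isn't named here, so this sketch still passes -f explicitly and should be treated as an assumption:

  # hypothetical: supply the email via the environment instead of -e
  export EINSTEIN_EMAIL=me@example.com
  sfdx shane:ai:auth -f ~/code/certs/einstein_platform.pem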

sfdx shane:ai:dataset:delete -n <string> [-e <email>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

delete a dataset

USAGE
  $ sfdx shane:ai:dataset:delete -n <string> [-e <email>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -e, --email=email                                                                 email address you used when you
                                                                                    signed up for your einstein.ai
                                                                                    account

  -n, --dataset=dataset                                                             (required) dataset id

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

EXAMPLE
  sfdx shane:ai:dataset:delete -n 57

See code: src/commands/shane/ai/dataset/delete.ts

sfdx shane:ai:dataset:get -n <string> [-l] [-e <email>] [-p] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

get a dataset's details by id, optionally polling until processing is complete

USAGE
  $ sfdx shane:ai:dataset:get -n <string> [-l] [-e <email>] [-p] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -e, --email=email                                                                 email address you used when you
                                                                                    signed up for your einstein.ai
                                                                                    account

  -l, --language                                                                    use the language endpoint instead of
                                                                                    vision

  -n, --dataset=dataset                                                             (required) dataset id

  -p, --poll                                                                        poll for the status to be completed

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

EXAMPLE
  sfdx shane:ai:dataset:get -n 57

See code: src/commands/shane/ai/dataset/get.ts

sfdx shane:ai:dataset:upload [-n <string>] [-f <filepath>] [-p <string>] [-t <string>] [--train] [-e <email>] [-w <integer>] [--verbose] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

upload a dataset

USAGE
  $ sfdx shane:ai:dataset:upload [-n <string>] [-f <filepath>] [-p <string>] [-t <string>] [--train] [-e <email>] [-w
  <integer>] [--verbose] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -e, --email=email
      email address you used when you signed up for your einstein.ai account

  -f, --file=file
      Path to the .zip (image) or .csv/.tsv/.json (language) file on the local drive (FilePart). The maximum file size you
      can upload from a local drive is 50 MB for images, 25 MB for text

  -n, --name=name
      Name of the dataset. Optional. If this parameter is omitted, the dataset name is derived from the .zip file name.

  -p, --path=path
      URL of the .zip (image) or .csv/.tsv/.json (language) file. The maximum file size you can upload from a web
      location is 2 GB for images, 25 MB for text

  -t, --type=image|image-detection|image-multi-label|text-intent|text-sentiment
      [default: image] Type of dataset data. Valid values are image, image-detection, image-multi-label, text-intent,
      and text-sentiment.

  -w, --wait=wait
      [default: 10] how long to wait for this to process (minutes)

  --json
      format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)
      [default: warn] logging level for this command invocation

  --train
      train a model on the dataset

  --verbose
      emit additional command output to stdout

EXAMPLE
  sfdx shane:ai:dataset:upload -e [email protected] -f ~/myPics.zip -n AwesomeDataset

See code: src/commands/shane/ai/dataset/upload.ts
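The example above covers an image dataset; for a language dataset the same command takes a local .csv/.tsv/.json plus a type flag. A sketch using only the flags documented above (file and dataset names are made up):

  # hypothetical text-intent upload: also train a model and wait up to 20 minutes
  sfdx shane:ai:dataset:upload -e me@example.com -f ~/caseIntents.csv -n CaseIntents -t text-intent --train -w 20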

sfdx shane:ai:playground:setup -f <filepath> [-e <email>] [-k <string>] [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

upload a .pem file from an encrypted local copy, and set up the username and secret key in a custom setting

USAGE
  $ sfdx shane:ai:playground:setup -f <filepath> [-e <email>] [-k <string>] [-u <string>] [--apiversion <string>]
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -e, --email=email                                                                 email address you used when you
                                                                                    signed up for your einstein.ai
                                                                                    account.  Defaults to EINSTEIN_EMAIL
                                                                                    from the environment

  -f, --file=file                                                                   (required) encrypted file from local
                                                                                    filesystem

  -k, --key=key                                                                     encoding key used to encrypt/decrypt
                                                                                    the file.  Defaults to
                                                                                    AI_PLAYGROUND_SETUP_KEY from the
                                                                                    environment

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

EXAMPLE
  sfdx shane:ai:playground:setup -f my.pem -e [email protected] -k yay9HVn68GzXrqhT0HWkoQ==

See code: src/commands/shane/ai/playground/setup.ts

sfdx shane:ai:playground:setupHeroku [-a <string>] [-c] [-k] [-u <string>] [--apiversion <string>] [--verbose] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

provisions a new einstein.ai account and sets up the org

USAGE
  $ sfdx shane:ai:playground:setupHeroku [-a <string>] [-c] [-k] [-u <string>] [--apiversion <string>] [--verbose]
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -a, --app=app                                                                     name of the heroku app that we
                                                                                    attach add-ons to

  -c, --create                                                                      create the app

  -k, --keepauth                                                                    save the refresh token for
                                                                                    einstein.ai to the local sfdx store
                                                                                    for future cli use

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

  --verbose                                                                         emit additional command output to
                                                                                    stdout

EXAMPLES
  sfdx shane:ai:playground:setupHeroku -a my-existing-app
       // creates add-ons on an existing app

  sfdx shane:ai:playground:setupHeroku -c
       // creates an app with whatever name heroku feels like

  sfdx shane:ai:playground:setupHeroku -a non-existing-app -c
       // creates a new app with the name of your choice (usually built dynamically!)

See code: src/commands/shane/ai/playground/setupHeroku.ts

sfdx shane:analytics:app:share -n <string> [--allprm -c] [--allcsp] [--org] [-t <string>] [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

share an analytics app by name

USAGE
  $ sfdx shane:analytics:app:share -n <string> [--allprm -c] [--allcsp] [--org] [-t <string>] [-u <string>]
  [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -c, --community                                                                   enable community sharing
  -n, --name=name                                                                   (required) name of the analytics app
  -t, --type=View|Edit|Manage                                                       [default: View] access level

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --allcsp                                                                          share with all customer portal users

  --allprm                                                                          share with all partner users

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

  --org                                                                             share with all internal users

EXAMPLE
  sfdx shane:analytics:app:share -n SharedApp --allprm -c
  // share the standard SharedApp with all partner users at View access (the default) and check the "enable sharing
  with communities" box for this app

See code: src/commands/shane/analytics/app/share.ts

sfdx shane:analytics:community:enable [-b] [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Activate a community using a headless browser

USAGE
  $ sfdx shane:analytics:community:enable [-b] [-u <string>] [--apiversion <string>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -b, --showbrowser                                                                 show the browser...useful for local
                                                                                    debugging

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

ALIASES
  $ sfdx shane:communities:analytics:enable

See code: src/commands/shane/analytics/community/enable.ts
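A minimal invocation, sketched from the options above (the org alias is made up):

  # enable analytics for communities in the org behind the myOrg alias,
  # showing the headless browser for easier local debugging
  sfdx shane:analytics:community:enable -u myOrg -b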

sfdx shane:analytics:dataflow:start [-n <string>] [-i <id>] [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

start an analytics dataflow by name/label/id

USAGE
  $ sfdx shane:analytics:dataflow:start [-n <string>] [-i <id>] [-u <string>] [--apiversion <string>] [--json]
  [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -i, --id=id                                                                       the id of the dataflow

  -n, --name=name                                                                   name or label of the analytics app
                                                                                    (will match either)

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

EXAMPLE
  sfdx shane:analytics:dataflow:start -n MyDataFlowName
  // enqueue a job for the analytics dataflow with name/label MyDataFlowName (will not wait for completion of the
  dataflow)

See code: src/commands/shane/analytics/dataflow/start.ts

sfdx shane:analytics:dataset:download [-i <id>] [-n <string>] [--versionid <string>] [-t <filepath>] [-r <number>] [-o <number>] [-b <number>] [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

download a dataset as csv

USAGE
  $ sfdx shane:analytics:dataset:download [-i <id>] [-n <string>] [--versionid <string>] [-t <filepath>] [-r <number>]
  [-o <number>] [-b <number>] [-u <string>] [--apiversion <string>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -b, --batchsize=batchsize                                                         [default: 1000000000] maximum
                                                                                    batchsize. Splits query in parts of
                                                                                    this size.

  -i, --id=id                                                                       dataset id

  -n, --name=name                                                                   dataset name

  -o, --offset=offset                                                               offset for rows

  -r, --rows=rows                                                                   [default: 1000000000] how many rows?

  -t, --target=target                                                               [default: .] where you want to save
                                                                                    the file

  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

  --versionid=versionid                                                             specify a version

EXAMPLES
  sfdx shane:analytics:dataset:download -n YourDataSetName -t myLocalFolder
  sfdx shane:analytics:dataset:download -i 0Fb6A000000gDFxSAM --versionid 0Fc6A000002d8GwSAI -t myLocalFolder -r 10000
  -b 5000

See code: src/commands/shane/analytics/dataset/download.ts

sfdx shane:analytics:dataset:list [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

what analytics datasets are in my org?

USAGE
  $ sfdx shane:analytics:dataset:list [-u <string>] [--apiversion <string>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -u, --targetusername=targetusername                                               username or alias for the target
                                                                                    org; overrides default target org

  --apiversion=apiversion                                                           override the api version used for
                                                                                    api requests made by this command

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

EXAMPLE
  sfdx shane:analytics:dataset:list

See code: src/commands/shane/analytics/dataset/list.ts

sfdx shane:analytics:dataset:upload -n <string> -f <filepath> [-a <string>] [-m <filepath>] [-o <string>] [--async] [-d <integer>] [--serial] [-u <string>] [--apiversion <string>] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

upload a dataset from csv

USAGE
  $ sfdx shane:analytics:dataset:upload -n <string> -f <filepath> [-a <string>] [-m <filepath>] [-o <string>] [--async]
  [-d <integer>] [--serial] [-u <string>] [--apiversion <string>] [--json] [--loglevel
  trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -a, --app=app
      app name

  -d, --uploadinterval=uploadinterval
      [default: 500] milliseconds between uploaded chunks...increase this if you get strange errors during file uploads
      like "write EPIPE"

  -f, --csvfile=csvfile
      (required) local csv file containing the data

  -m, --metajson=metajson
      path to json file for describing your upload (highly recommended)

  -n, --name=name
      (required) dataset name--no spaces, should be like an api name

  -o, --operation=Append|Overwrite|Upsert|Delete
      [default: Overwrite] what to do with the dataset if it already exists.  See
      https://developer.salesforce.com/docs/atlas.en-us.bi_dev_guide_ext_data.meta/bi_dev_guide_ext_data/bi_ext_data_object_externaldata.htm

  -u, --targetusername=targetusername
      username or alias for the target org; overrides default target org

  --apiversion=apiversion
      override the api version used for api requests made by this command

  --async
      do not wait for successful completion of the dataset upload...just return and hope for the best.  If omitted, will
      poll the analytics rest API for job processing status until complete

  --json
      format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)
      [default: warn] logging level for this command invocation
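A typical invocation, sketched from the flags documented above (file paths, dataset name, and app name are hypothetical):

  # upload rows.csv into (or over) a dataset named myDataset in the SharedApp app,
  # describing the columns with a metadata json file
  sfdx shane:analytics:dataset:upload -n myDataset -f ~/data/rows.csv -m ~/data/rows_meta.json -a SharedApp -o Overwrite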