
@jfischer/etcopydata

v0.12.1


SFDX Plugin to populate an org with data for multiple related sObjects.


ETCopyData

NOTE: This is a fork of the original ETCopyData project.

SFDX Plugin to populate your orgs with data extracted from multiple sObjects.


Install

Install as plugin

sfdx plugins:install @jfischer/etcopydata

You'll be prompted that this plugin, like any other, is not officially code-signed by Salesforce. If that's annoying, you can whitelist it.

Install from source

  1. Install the SFDX CLI.
  2. Clone the repository: git clone https://github.com/j-fischer/ETCopyData.git
  3. Change directory: cd ETCopyData
  4. Install npm modules: npm install --production
  5. Link the plugin: sfdx plugins:link .

Documentation

This plugin is highly configurable via a JSON file named ETCopyData.json, located in the current folder when you run the plugin. If the file does not exist, the plugin creates it before erroring out; this gives you a bare-bones file that you can then modify.

ETCopyData.json

Sample

{
    "now": "2018-11-29T20:11:33.417Z",
    "orgSource": "dhOrg",
    "orgDestination": "soTest",
    "sObjectsData": [
        {
            "name": "Account",
            "ignoreFields": "OwnerId",
            "maxRecords": 20,
            "orderBy": "Name",
            "where": "Industry = 'Technology'"
        }
    ],
    "sObjectsMetadata": [
        {
            "name": "User",
            "fieldsToExport": "FirstName,LastName,Email,Id",
            "matchBy": "Email"
        }
    ],
    "rootFolder": "./ETCopyData",
    "includeAllCustom": true,
    "stopOnErrors": true,
    "copyToProduction": false,
    "useBulkAPI": false,
    "ignoreFields": "OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode",
    "maxRecordsEach": null,
    "deleteDestination": true,
    "pollingTimeout": 100000
}

Fields

| Field | Data Type | Description |
| --- | --- | --- |
| now | DateTime | Timestamp that automatically updates every time the plugin is executed. |
| orgSource | String | SFDX alias given to the org that has the data you want to export. |
| orgDestination [1] | String | SFDX alias given to the org that receives the data you import. |
| sObjectsData [2] | sObjectsData[] | List of custom or standard sObjects the data is exported from, and where it will be imported to. |
| sObjectsMetadata [3] | sObjectsMetadata[] | Metadata sObjects that will be used for importing your data. |
| rootFolder | String | Folder used to store the exported data and from which the data will be imported. |
| includeAllCustom | Boolean | True if you want all the custom sObjects, false if you only want the ones listed in the sObjectsData section. |
| stopOnErrors | Boolean | True to stop when deleting or importing data fails; false to report errors back but continue execution. |
| copyToProduction | Boolean | True to allow importing data into a production org; false to throw an error when importing to production. |
| useBulkAPI | Boolean | True to use the Bulk API for insert/upsert/update import requests; false to use the SOAP API for those requests. |
| ignoreFields [4] | String | Comma-separated list of fields to ignore for every sObject. Example: "Field1__c, Field2__c, Field3__c" |
| maxRecordsEach [5] | Integer | Maximum number of records to export for each sObject. |
| deleteDestination [6] | Boolean | True to delete the existing records in the destination org before loading the new records. |
| pollingTimeout [7] | Integer | Timeout in milliseconds after which Bulk API operations time out. |

Bracketed numbers refer to the Notes section below.

sObjectsData

Sample: Minimum fields required

You must provide the name of the sObject

{
	"name": "Account"
}

Sample: All fields complete

{
  "name": "Location__c",
  "deleteDestination": false,
  "externalIdField": "External_Id_Field__c",
  "ignoreFields": "OwnerId, IgnoreField__c",
  "maxRecords": 50,
  "orderBy": "City__c",
  "twoPassReferenceFields": "Foo__c,Bar__c",
  "twoPassUpdateFields": "Not_a_Lookup_Field__c,A_Text_Field__c",
  "where": "State__c = 'Texas'"
}

Fields

This is the structure for each sObject

| Field | Default | Data Type | Description |
| --- | --- | --- | --- |
| name | N/A | String | Required field. SObject API name rather than the label; custom sObjects end with __c. |
| deleteDestination | false | Boolean | If true, deletes all records of this sObject prior to importing the data. |
| ignoreFields | null | String[] | List of fields to ignore for this sObject; this list is combined with the global ignoreFields field. |
| maxRecords | -1 | Integer | Overrides the global maxRecordsEach field. |
| orderBy | null | String | For exports, determines the order of the exported records. |
| twoPassReferenceFields | null | String[] | For imports, fields that must be set in a separate update because they reference an sObject that is not loaded yet. |
| twoPassUpdateFields | null | String[] | For imports, fields that must be set in a separate update request along with the twoPassReferenceFields. |
| where | null | String | Restricts which records are exported. |
| externalIdField | null | String | API name of the external ID field to use for an upsert operation. |

sObjectsMetadata

Sample: Minimum fields required

{
	"name": "User",
	"fieldsToExport": "FirstName,LastName,Email,Id",
	"matchBy": "Email"
}

Sample: All fields complete

{
	"name": "User",
	"fieldsToExport": "FirstName,LastName,Email,Id",
	"matchBy": "Email",
	"orderBy": "LastName",
	"where": null
}

Fields

This is the structure for each metadata sObject

| Field | Default | Data Type | Description |
| --- | --- | --- | --- |
| name | N/A | String | Required field. SObject API name rather than the label. |
| fieldsToExport | N/A | String[] | Required field. List of fields that will be exported for each metadata sObject. |
| matchBy [9] | N/A | String | Required field. The field(s) that identify the same metadata record in both orgs. |
| orderBy | null | String | For exports, determines the order of the exported metadata records. |
| where | null | String | Restricts which records are exported. |

Migrating Custom Settings

The tool supports the migration of Custom Settings. To handle the proper mapping of possible owners (User, Profile) of Hierarchical Custom Settings, it is important to have the following configuration for the metadata objects added to the ETCopyData.json file:

"sObjectsMetadata": [
    {
        "name": "User",
        "fieldsToExport": "FirstName,LastName,Email,Id",
        "matchBy": "Email"
    },
    {
      "name": "Profile",
      "fieldsToExport": "Name,Id",
      "matchBy": "Name"
    }
]

References

ETCopyData fully supports importing references between SObjects, both Lookup and Parent/Child relationships.

ETCopyData determines an import order, based on the Lookup and Parent/Child relationships that are exported and not flagged as twoPassReferenceFields. It sorts the list of SObjects using the following algorithm:

  1. the SObjects that have no relationships to any other SObjects
  2. the SObjects that only have relationships to group 1
  3. the SObjects that have relationships to group 1 and/or 2
  4. etc.

ETCopyData imports the data for the SObjects in that order, keeping track of the mapping between Ids in the source set and their equivalent Ids in the target system. When importing a reference field, it can immediately set the correct Id in the target system.
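The grouping described above amounts to repeatedly peeling off the sObjects whose references have all been placed in earlier groups. A minimal sketch in Python (illustration only, not the plugin's TypeScript source; the sObject names are hypothetical):

```python
def import_order(deps):
    """Group sObjects into import "waves": every sObject in a wave only
    references sObjects placed in earlier waves.

    deps maps each sObject name to the set of sObject names it references
    (Lookup/Parent-Child fields not flagged as twoPassReferenceFields).
    """
    remaining = {name: set(refs) for name, refs in deps.items()}
    placed = set()
    waves = []
    while remaining:
        # Wave N: sObjects whose references are all satisfied by waves 1..N-1
        wave = sorted(name for name, refs in remaining.items() if refs <= placed)
        if not wave:  # a cyclic or self reference is left over
            raise RuntimeError(
                "Deadlock determining import order, most likely caused by "
                "circular or self reference, configure those fields as "
                "twoPassReferenceFields")
        placed.update(wave)
        for name in wave:
            del remaining[name]
        waves.append(wave)
    return waves

# Contact looks up to Account; Opportunity looks up to both.
print(import_order({
    "Account": set(),
    "Contact": {"Account"},
    "Opportunity": {"Account", "Contact"},
}))  # [['Account'], ['Contact'], ['Opportunity']]
```

Note how an unresolvable cycle surfaces as the same "Deadlock determining import order" error the plugin reports.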

If your data model is tree-like, no additional configuration is needed to automatically import all references. If your data model contains cyclic references or self references, additional configuration using the twoPassReferenceFields setting is required. An example cyclic reference is SObject A having a lookup field for SObject B and SObject B having a lookup field for SObject A. An example self reference is SObject A having a lookup field for SObject A.

If your data model contains one of these types of references, you will get the following error during import:

Deadlock determining import order, most likely caused by circular or self reference, configure those fields as twoPassReferenceFields

Configuring twoPassReferenceFields could be automated, but it is currently a manual process. In general, if you have two SObjects that reference each other through a single Lookup relationship in each SObject, you only need to flag one of those fields as a twoPassReferenceField.
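As a sketch (the sObject and field names here are invented for illustration), flagging one side of a mutual lookup could look like this in the sObjectsData section of ETCopyData.json:

```json
"sObjectsData": [
    {
        "name": "Invoice__c"
    },
    {
        "name": "Payment__c",
        "twoPassReferenceFields": "Invoice_Ref__c"
    }
]
```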

As an example, assume you have the following SObject and fields:

  • SObject A__c: field RefB__c of type Lookup(B__c)
  • SObject B__c: field RefA__c of type Lookup(A__c)

If your dataset contains 1000 A__c records and 10 B__c records, the optimal configuration is to flag B__c.RefA__c as a twoPassReferenceField. On import, ETCopyData will execute the following steps:

  1. import all records for SObject B__c (keeping the RefA__c field null), keeping track of the mapping between each Id in the source set and its Id in the target system
  2. import all records for SObject A__c, setting the RefB__c field correctly using the mapping, and likewise keeping track of the mapping of these record Ids
  3. revisit all SObject B__c records that have a value for RefA__c, and set the RefA__c field to the mapped Id
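The three steps above can be mimicked with a toy in-memory sketch (Python is used purely for illustration; the record shapes and the fake_insert helper are invented, not the plugin's API):

```python
def two_pass_import(a_records, b_records, insert):
    """Sketch: load B__c first with RefA__c deferred, then A__c,
    then build the second-pass updates for B__c.RefA__c."""
    id_map = {}  # source Id -> target Id

    # Step 1: insert B__c records with the two-pass reference field left null
    for rec in b_records:
        id_map[rec["Id"]] = insert("B__c", {"RefA__c": None})

    # Step 2: insert A__c records, remapping RefB__c through Ids already known
    for rec in a_records:
        id_map[rec["Id"]] = insert("A__c", {"RefB__c": id_map[rec["RefB__c"]]})

    # Step 3: revisit B__c records whose RefA__c was deferred
    return [
        {"Id": id_map[rec["Id"]], "RefA__c": id_map[rec["RefA__c"]]}
        for rec in b_records
        if rec["RefA__c"] is not None
    ]

# Hypothetical in-memory "org": insert just returns a fresh target Id.
_next = iter(range(1000, 2000))
def fake_insert(sobject, fields):
    return f"{sobject}-{next(_next)}"

updates = two_pass_import(
    a_records=[{"Id": "a1", "RefB__c": "b1"}],
    b_records=[{"Id": "b1", "RefA__c": "a1"}],
    insert=fake_insert,
)
print(updates)  # the deferred RefA__c updates, now carrying target Ids
```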

Notes:

  1. Because the data in the org gets modified, you are not allowed to use a production org. You can only use a scratch org or a sandbox!
  2. You must explicitly specify which standard sObjects you want to process because there are far too many standard sObjects and no good way to determine which ones are useful. For custom sObjects, however, you can request all of them.
  3. These records will not be imported but will need to exist in the destination org, so their record ids can be used when loading the data.
  4. These are some fields that are a good idea to ignore: OwnerId, CreatedBy, CreatedDate, CurrencyIsoCode.
  5. Not exporting all the records could have negative implications, especially if those records are required later. For example, not exporting master records (on a master/detail relationship) for detail records that you do actually export.
  6. Not deleting the existing records could leave you with tons of records if the operation is run multiple times while testing, or with duplicate records in the destination sObject.
  7. If you are getting timeout errors while records are being deleted, or imported, you could increase the polling timeout.
  8. If you are getting out-of-memory errors, you can increase the amount of memory used by NodeJS (the engine used to run SFDX plugins) by setting the environment variable NODE_OPTIONS to --max-old-space-size=8192 to reserve 8GB memory.
  9. The metadata records in the source org and the destination org will have different Ids, but they should have similar characteristics that can be used for mapping. For example, for users you can use the email, for profiles their names, for record types their developer name, etc. When dealing with RecordTypes that have the same DeveloperName for different sObjects, the matchBy entry can be set as "SobjectType, DeveloperName".
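For the RecordType case in note 9, the metadata entry might look like the following sketch (the fieldsToExport list is an assumption, not verified against the plugin):

```json
"sObjectsMetadata": [
    {
        "name": "RecordType",
        "fieldsToExport": "SobjectType,DeveloperName,Id",
        "matchBy": "SobjectType, DeveloperName"
    }
]
```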

Commands

sfdx ETCopyData:Compare [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Checks the source and destination orgs for any differences in the sObjects' metadata; this helps determine which data can be properly exported/imported.

USAGE
  $ sfdx ETCopyData:Compare [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] 
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -c, --configfolder=PATH                                                           Root folder to find the
                                                                                    configuration file

  -d, --orgdestination=(alias|username)                                             SFDX alias or username for the
                                                                                    DESTINATION org

  -o, --overrideconfig                                                              Will override the existing config
                                                                                    file, adding new properties, etc.

  -r, --deletedestination                                                           Delete records in destination org
                                                                                    prior to data loads

  -s, --orgsource=(alias|username)                                                  SFDX alias or username for the
                                                                                    SOURCE org

  --forceprodcopy                                                                   Force the copy to production

  --forceproddeletion                                                               Force the deletion of production data

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

See code: src\commands\ETCopyData\Compare.ts

sfdx ETCopyData:delete [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Deletes data from the destination org, preparing for the new data that will be uploaded. Note: deletion optionally happens before loading, but if there are errors, this operation can be retried by itself.

USAGE
  $ sfdx ETCopyData:delete [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] 
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -c, --configfolder=PATH                                                           Root folder to find the
                                                                                    configuration file

  -d, --orgdestination=(alias|username)                                             SFDX alias or username for the
                                                                                    DESTINATION org

  -o, --overrideconfig                                                              Will override the existing config
                                                                                    file, adding new properties, etc.

  -r, --deletedestination                                                           Delete records in destination org
                                                                                    prior to data loads

  -s, --orgsource=(alias|username)                                                  SFDX alias or username for the
                                                                                    SOURCE org

  --forceprodcopy                                                                   Force the copy to production

  --forceproddeletion                                                               Force the deletion of production data

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

See code: src\commands\ETCopyData\delete.ts

sfdx ETCopyData:export [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Exports the data from the source org, and saves it in the destination folder so that it can be imported at a later time.

USAGE
  $ sfdx ETCopyData:export [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] 
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -c, --configfolder=PATH                                                           Root folder to find the
                                                                                    configuration file

  -d, --orgdestination=(alias|username)                                             SFDX alias or username for the
                                                                                    DESTINATION org

  -o, --overrideconfig                                                              Will override the existing config
                                                                                    file, adding new properties, etc.

  -r, --deletedestination                                                           Delete records in destination org
                                                                                    prior to data loads

  -s, --orgsource=(alias|username)                                                  SFDX alias or username for the
                                                                                    SOURCE org

  --forceprodcopy                                                                   Force the copy to production

  --forceproddeletion                                                               Force the deletion of production data

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

See code: src\commands\ETCopyData\export.ts

sfdx ETCopyData:full [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Performs all the steps, including comparing schemas, exporting data from the source, optionally deleting data from the destination, and importing the data to the destination org. This may help you when setting up a new process.

USAGE
  $ sfdx ETCopyData:full [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] 
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -c, --configfolder=PATH                                                           Root folder to find the
                                                                                    configuration file

  -d, --orgdestination=(alias|username)                                             SFDX alias or username for the
                                                                                    DESTINATION org

  -o, --overrideconfig                                                              Will override the existing config
                                                                                    file, adding new properties, etc.

  -r, --deletedestination                                                           Delete records in destination org
                                                                                    prior to data loads

  -s, --orgsource=(alias|username)                                                  SFDX alias or username for the
                                                                                    SOURCE org

  --forceprodcopy                                                                   Force the copy to production

  --forceproddeletion                                                               Force the deletion of production data

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

See code: src\commands\ETCopyData\full.ts

sfdx ETCopyData:import [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

Imports data into the destination org. You can control whether the data in the destination sObjects should be removed before loading a new data set. The data load happens in a specific order (children first, parents last), which is determined by checking the schema in the destination org.

USAGE
  $ sfdx ETCopyData:import [-c <string>] [-d <string>] [-s <string>] [-r] [--forceprodcopy] [--forceproddeletion] [-o] 
  [--json] [--loglevel trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL]

OPTIONS
  -c, --configfolder=PATH                                                           Root folder to find the
                                                                                    configuration file

  -d, --orgdestination=(alias|username)                                             SFDX alias or username for the
                                                                                    DESTINATION org

  -o, --overrideconfig                                                              Will override the existing config
                                                                                    file, adding new properties, etc.

  -r, --deletedestination                                                           Delete records in destination org
                                                                                    prior to data loads

  -s, --orgsource=(alias|username)                                                  SFDX alias or username for the
                                                                                    SOURCE org

  --forceprodcopy                                                                   Force the copy to production

  --forceproddeletion                                                               Force the deletion of production data

  --json                                                                            format output as json

  --loglevel=(trace|debug|info|warn|error|fatal|TRACE|DEBUG|INFO|WARN|ERROR|FATAL)  [default: warn] logging level for
                                                                                    this command invocation

See code: src\commands\ETCopyData\import.ts