
deg-data-import v0.4.7

SFCC data archive & import tool

DEG SFCC site import utility

DW JSON File

The tool assumes a dw.json file exists within the repository's root directory:

{
    "hostname": "dev01.demandware.net",
    "username": "[email protected]",
    "password": "password",
    "version": "version1"
}

Install the package:

npm install deg-data-import --save-dev

Run the data import using Node:

node ./node_modules/deg-data-import/deg data

Or within an npm script:

"scripts": {
    "site:import": "node node_modules/deg-data-import/deg data"
}

The above script imports everything except catalogs, pricebooks, inventory, and libraries; the idea is to keep imports light so a sandbox can be updated in under a minute. To include everything, such as after a dbinit, append the full option to the end of the previous command:

"scripts": {
    "import:full": "node node_modules/deg-data-import/deg data full"
}

Another option is to import only the libraries by appending lib. This includes any static assets as well:

"scripts": {
  "import:lib": "node node_modules/deg-data-import/deg data lib"
}

Most applicable during code deployments, the code option is used to zip, upload, and activate code versions. It can also be used within sandboxes to quickly "flip" code versions and invalidate any server cache.

"scripts": {
  "code:version": "node ./node_modules/pop-site-scrpts/pop code"
}

Migrations

Migrations are a sequence of site imports stored in the migrations folder in the root of the repo. These migrations are imported into the instance in alphabetical order.
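
For illustration, a migrations folder containing two migrations might look like the following (folder and file names here are hypothetical); the first would be imported before the second:

  migrations/
    20210510123456_add_payment_methods/
      meta/
      sites/
    20210612093000_update_sorting_rules/
      sites/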

Creating a new migration

The following command will create a timestamped migration folder that the developer can add site import data to:

  node node_modules/deg-data-import/deg migrate create "migration description here"

This will result in a new folder (e.g. migrations/202105101234567_migration_description_here).

Setting a migration

The migration currently deployed on an instance is persisted as a global preference named migrationVersion. If this preference does not exist (brand-new sandbox) or is empty (no migrations run yet), all migrations will be run. You can update this preference using the command:

  node node_modules/deg-data-import/deg migrate set migration_id_here

The next time migrations are applied, only those that come after the current migration will run. Once the last migration in the queue is applied, this preference is updated on the instance.

Sometimes when switching between branches you may have to use this command to reset the current migration.
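
For example, to point the instance back at the last migration both branches share and re-run everything after it (the migration id here is illustrative):

  node node_modules/deg-data-import/deg migrate set 20210510123456_add_payment_methods
  node node_modules/deg-data-import/deg migrate apply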

Applying migrations

This command will apply all migrations that come after the current migration on the instance. It will error out if the migration recorded on the target instance is not present in the local migrations folder.

  node node_modules/deg-data-import/deg migrate apply
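
As with the data imports above, this can be wrapped in an npm script (the script name is just a suggestion):

"scripts": {
  "migrate:apply": "node node_modules/deg-data-import/deg migrate apply"
}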

Bitbucket Pipelines

The package can also run data imports during Bitbucket Pipelines deployments. The included create-dw-json.sh creates a dw.json file within a Bitbucket Pipelines step:

- chmod +x ./node_modules/deg-data-import/create-dw-json.sh
- ./node_modules/deg-data-import/create-dw-json.sh > dw.json
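
Putting it together, a bitbucket-pipelines.yml deployment step might look like the sketch below; the step name, branch, and deployment environment are illustrative:

pipelines:
  branches:
    develop:
      - step:
          name: Deploy code and data
          deployment: development
          script:
            - npm ci
            - chmod +x ./node_modules/deg-data-import/create-dw-json.sh
            - ./node_modules/deg-data-import/create-dw-json.sh > dw.json
            - node ./node_modules/deg-data-import/deg code
            - node ./node_modules/deg-data-import/deg data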

The script builds dw.json from Bitbucket environment variables, which can be set per deployment type:

{
    "hostname": "$HOSTNAME",
    "client_id": "$CLIENT_ID",
    "client_secret": "$CLIENT_SECRET",
    "cert_pass": "$BITBUCKET_COMMIT",
    "ocapi_endpoint": "$OCAPI_ENDPOINT",
    "build_number": "$BITBUCKET_BUILD_NUMBER",
    "code_version": "$BITBUCKET_COMMIT"
}
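
For reference, a script along these lines can be as simple as a heredoc that lets the shell substitute the variables; the actual create-dw-json.sh shipped with the package may differ:

#!/bin/sh
# Hedged sketch: print a dw.json built from the pipeline's environment variables
cat <<EOF
{
    "hostname": "$HOSTNAME",
    "client_id": "$CLIENT_ID",
    "client_secret": "$CLIENT_SECRET",
    "cert_pass": "$BITBUCKET_COMMIT",
    "ocapi_endpoint": "$OCAPI_ENDPOINT",
    "build_number": "$BITBUCKET_BUILD_NUMBER",
    "code_version": "$BITBUCKET_COMMIT"
}
EOF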

Site Import (SFCC archive)

An SFCC archive folder named site-import should exist within the repository's root directory.

A minimal site-import archive directory is included within this package so it can be quickly copied to test functionality.

Package JSON

The name of the site archive to upload, along with its source directory, can be set by adding the config object below to the repository's package.json. Code versions can also be zipped, uploaded, imported, and activated; the version name and the directories to include are set here as well.

 "config": {
    "data": {
      "file_name": "site-import",
      "directories": [
        "site-import/"
      ]
    },
    "code": {
      "file_name": "version2",
      "directories": [
        "cartridges/",
        "sfra/cartridges/"
      ]
    }
  }
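
As a rough sketch (this is not the package's actual internals), a Node script could read these settings straight from package.json:

// Hedged sketch: where the config values live when read from Node
var pkg = require('./package.json');

var dataArchive = pkg.config.data.file_name;   // "site-import"
var codeVersion = pkg.config.code.file_name;   // "version2"
var codeDirs = pkg.config.code.directories;    // ["cartridges/", "sfra/cartridges/"]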

Business Manager Config

WebDAV permissions

Uploading data and code to an instance requires adding permissions for the client_id within:

Admin > Org > WebDAV Permission Settings

{
    "clients": [
        {
            "client_id": "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
            "permissions": [
                {
                    "path": "/impex",
                    "operations": ["read_write"]
                },
                {
                    "path": "/cartridges",
                    "operations": ["read_write"]
                }
            ]
        }
    ]
}

OCAPI Permissions

Add the following OCAPI permission scopes to the client_id used by this utility under Admin > Site Dev > OCAPI API Settings:

{
    "_v": "99.9",
    "clients": [
        {
            "client_id": "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
            "resources": [
                {
                    "resource_id": "/code_versions",
                    "methods": ["get"],
                    "read_attributes": "(**)",
                    "write_attributes": "(**)"
                },
                {
                    "resource_id": "/code_versions/*",
                    "methods": ["patch", "delete"],
                    "read_attributes": "(**)",
                    "write_attributes": "(**)"
                },
                {
                    "resource_id": "/jobs/*/executions",
                    "methods": ["post"],
                    "read_attributes": "(**)",
                    "write_attributes": "(**)"
                },
                {
                    "resource_id": "/jobs/*/executions/*",
                    "methods": ["get"],
                    "read_attributes": "(**)",
                    "write_attributes": "(**)"
                },
                {
                    "resource_id": "/sites/*/cartridges",
                    "methods": ["post"],
                    "read_attributes": "(**)",
                    "write_attributes": "(**)"
                }
            ]
        }
    ]
}

2-Factor Authentication

The build process will generate a certificate at runtime using preconfigured environment variables, with the current commit hash as the passcode. The same commit hash is passed in the options object below to allow connecting and uploading data and code to Staging instances:

var fs = require('fs');

// Read the generated certificate and unlock it with the commit-hash
// passcode stored as cert_pass in dw.json
var options = {
    pfx: fs.readFileSync('build_cert.p12'),
    passphrase: dw.cert_pass
};
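
For context, here is a minimal sketch of supplying that certificate to a Node https request; this is not the package's actual upload code, and the WebDAV path and omitted authentication details are assumptions:

var fs = require('fs');
var https = require('https');
var dw = require('./dw.json');

// Hedged sketch: PUT a site archive over WebDAV using the build certificate
var req = https.request({
    hostname: dw.hostname,
    path: '/on/demandware.servlet/webdav/Sites/Impex/src/instance/site-import.zip',
    method: 'PUT',
    pfx: fs.readFileSync('build_cert.p12'),
    passphrase: dw.cert_pass
}, function (res) {
    console.log('Upload status: ' + res.statusCode);
});

fs.createReadStream('site-import.zip').pipe(req);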

@TODO:

Replace the sfcc-ci repo URL using an environment variable:

git config --global url."https://${GITHUB_TOKEN}@github.com/".insteadOf git@github.com: