
jy-transform

v2.0.1

Published

This project reads, writes and transforms YAML, JS or JSON objects into each other, using the CLI or the API. On the CLI the source and destination resources are files; on the API level they can additionally be objects or streams.

Downloads

2,028

Readme


jy-transform

This project reads, writes and transforms YAML, JS or JSON objects into each other, using the CLI or the API. On the CLI the source and destination resources are files; on the API level they can additionally be objects or streams.

Installation

Download node at nodejs.org and install it, if you haven't already.

npm install jy-transform --global

Tests

npm install
npm test

Dependencies

  • bluebird: Full featured Promises/A+ implementation with exceptionally good performance
  • cli: A tool for rapidly building command line apps
  • is-stream: Check if something is a Node.js stream
  • js-yaml: YAML 1.2 parser and serializer
  • json-stringify-safe: Like JSON.stringify, but doesn't blow up on circular refs.
  • mkdirp-then: mkdirp as promised
  • serialize-js: User-readable object serialization for JavaScript.

Dev Dependencies

  • codeclimate-test-reporter: Code Climate test reporter client for javascript projects
  • codecov: Uploading report to Codecov: https://codecov.io
  • coveralls: takes json-cov output into stdin and POSTs to coveralls.io
  • doctoc: Generates TOC for markdown files of local git repo.
  • fs-extra: fs-extra contains methods that aren't included in the vanilla Node.js fs package. Such as mkdir -p, cp -r, and rm -rf.
  • istanbul: Yet another JS code coverage tool that computes statement, line, function and branch coverage with module loader hooks to transparently add coverage when running tests. Supports all JS coverage use cases including unit tests, server side functional tests
  • jsdoc-parse: Transforms jsdoc data into something more suitable for use as template input
  • jsdoc-to-markdown: Generates markdown API documentation from jsdoc annotated source code
  • mocha: simple, flexible, fun test framework
  • mocha-lcov-reporter: LCOV reporter for Mocha
  • object-path: Access deep object properties using a path
  • package-json-to-readme: Generate a README.md from package.json contents
  • winston: A multi-transport async logging library for Node.js

License

SEE LICENSE IN LICENSE.md

Motivation

Why this module? After struggling with a huge YAML file and accidentally introducing wrong indentation, which resulted in an annoying failure investigation, I decided to get rid of the YAML file and create a module intended to be the Swiss Army knife for transforming YAML, JS and JSON into each other.

Usage

The module can be used on the CLI or as an API (the latter is fully Promise based).

Usage Types

Since the module can be used in two different ways, install it as follows:

  • CLI: install globally via -g option
  • API: install locally
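For example, using the standard npm commands:

npm install jy-transform --global
npm install jy-transform --save

The first installs the jyt CLI command globally; the second adds the module as a local project dependency for API usage.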

Both usage types are described in more detail in the following sections.

Use Cases

So, what are the typical use cases for this module? In terms of transformation, these consist of several phases:

  • Reading files (Reader)
  • Transforming JSON objects (Transformer)
  • Applying dedicated actions to the intermediate JSON objects (Transformer + Middleware)
  • Writing files (Writer)

Reading

Reading from:

  • *.yaml file
  • *.js file
  • *.json file

Additionally, on the API level, from a:

  • stream.Readable (see the sketch below)
    • Contains serialized JSON or YAML
    • Requires the options.origin property to be set
    • Read as UTF-8
  • JS object (actually, this means the reading phase is skipped, because the object is in memory already)
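A minimal sketch of the stream case, assuming the options keys documented in the API Properties section below (src, origin, dest):

var fs = require('fs');

// Read serialized YAML from a Readable stream; origin must be set
// explicitly because there is no file extension to infer the type from.
var options = {
    src: fs.createReadStream('foo.yaml'),
    origin: 'yaml',
    dest: 'foo.json'
};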

Transformation

The transformation can take place in several directions:

  • YAML ⇒ JS
  • YAML ⇒ JSON
  • JS ⇒ YAML
  • JSON ⇒ YAML
  • JS ⇒ JSON
  • JSON ⇒ JS
  • YAML ⇒ YAML
  • JSON ⇒ JSON
  • JS ⇒ JS

while:

  • YAML = *.yaml, *.yml
  • JS = *.js (JS object)
  • JSON = *.json (JS object serialized as JSON)

Middleware

Apply actions on the intermediate JS object via injected Promise functions. This is an optional part of the transformation phase.

Writing

Writing to:

  • *.yaml file
  • *.js file
  • *.json file

Additionally, on the API level, to a:

  • stream.Writable (see the sketch below)
    • Receives serialized JS, JSON or YAML
    • Requires the options.target property to be set
    • Written as UTF-8
  • JS object
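Analogously, a sketch of writing to a stream (again assuming the documented options keys):

var fs = require('fs');

// Write the transformed result as serialized JSON to a Writable stream;
// target must be set explicitly because there is no file extension to infer from.
var options = {
    src: 'foo.yaml',
    target: 'json',
    dest: fs.createWriteStream('foo.json')
};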

Limitations

  • Since this module is built to transform between different type formats, any functions residing in JS objects are not supported, e.g. transforming

    module.exports = {
        fooKey: 'foo',
        fooFunction: function foo() {
            //...
        }
    }

    to JSON would simply result in

    {
        "fooKey": "foo"
    }

    while transforming to YAML type would even result in an Error, e.g. printed on CLI usage like this:

    ERROR: YAMLException: unacceptable kind of an object to dump [object Function]
  • Multidocument handling would be a cool feature (in general it applies to YAML and JS only), but at the moment each source must contain a single document to transform (or, in the case of JS, one that can be identified). This feature is planned and tracked in #14.

  • Schema validation for input and output is another topic, which is planned in #1 and #2.

CLI Usage

The CLI provides the jyt command (which usually requires some options). After the global installation you can access the Transformer's command options with the usual --help option, which prints an overview of all available CLI options:

$ jyt --help
Usage:
  jyt INPUT-FILE [OUTPUT-FILE] [OPTIONS]

Options:
  -o, --origin [STRING]  The origin type of INPUT-FILE: [ js | json | yaml ]. (Default is if not given, the type is tried to be inferred from the extension of source path, else it is 'yaml')
  -t, --target [STRING]  The target type of OUTPUT-FILE: [ js | json | yaml ]. (Default is if not given, the type is tried to be inferred from the extension of destination path, else it is 'js')
  -i, --indent [NUMBER]  The indention for pretty-print: 1 - 8. (Default is 4)
  -f, --force            Force overwriting of existing output files on write phase. When files are not overwritten (which is default),
                         then the next transformation with same output file name gets a consecutive number on the base file name, e.g. in
                         case of foo.yaml it would be foo(1).yaml.
  -m, --imports STRING   Define a 'module.exports[.identifier] = ' identifier (to read from JS _source_ file only, must be a valid JS
                         identifier!).
  -x, --exports STRING   Define a 'module.exports[.identifier] = ' identifier (for usage in JS destination file only, must be a valid JS
                         identifier!).
  -k, --no-color         Omit color from output
      --debug            Show debug information
  -v, --version          Display the current version
  -h, --help             Display help and usage details

CLI Args

The ARGS are more formally defined in the following table:

| Arg | Type | Description | Default | Required |
| --- | --- | --- | --- | --- |
| INPUT-FILE | URI | The source file path for transformation. | - | yes |
| OUTPUT-FILE | URI | The destination file path to transform to. | When this option is omitted, the output file is stored relative to the input file (same base name but with another extension if the type differs). If input and output type are the same, file overwriting is handled depending on the --force value! | no |

NOTE: the input file has to be specified and should be the first argument (in fact, it can appear anywhere, but it must come before the output file argument)!

CLI Options

The OPTIONS are more formally defined in the following table:

| Option (short) | Option (long) | Type | Description | Default | Required |
| --- | --- | --- | --- | --- | --- |
| -o | --origin | string of: [ js \| json \| yaml ] | The transformation origin type. | If not given, the type is inferred from the extension of the source path, else it is yaml. | no |
| -t | --target | string of: [ js \| json \| yaml ] | The transformation target type. | If not given, the type is inferred from the extension of the destination path, else it is js. | no |
| -i | --indent | integer [ 1 - 8 ] | The code indention used in destination files. | 4 | no |
| -f | --force | n/a | Force overwriting of existing output files on the write phase. When files are not overwritten (the default), the next transformation with the same output file name gets a consecutive number on the base file name, e.g. foo.yaml becomes foo(1).yaml. | false | no |
| -m | --imports | string | Define a 'module.exports[.identifier] = ' identifier (to read from a JS source file only, must be a valid JS identifier!). | undefined | no |
| -x | --exports | string | Define a 'module.exports[.identifier] = ' identifier (for usage in a JS destination file only, must be a valid JS identifier!). | undefined | no |
| -k | --no-color | n/a | Omit color from output. | color | no |
| n/a | --debug | n/a | Show debug information. | false | no |
| -v | --version | n/a | Display the current version. | n/a | no |
| -h | --help | n/a | Display help and usage details. | n/a | no |

NOTE: an invalid indention setting (-i, --indent below 1 or above 8) does not raise an error; a default of 4 SPACEs is applied instead.

Examples

Now that we know which options can be applied on the CLI, let's assume we have YAML content located in foo.yaml holding this data:

foo: bar

Example: YAML ⇒ JSON

Then we can transform it to JSON content in a foo.json file:

{
  "foo": "bar"
}

simply by using this command:

$ jyt foo.yaml -t json -i 2

In this example we have overridden the default target type (which is js) and applied an indent of 2 SPACEs instead of the default of 4. Since the destination option is omitted here, the output file foo.json is written relative to the input file.

NOTE: here you have to provide the target via the -t json option, or else the default js would have been applied! If the source had been a js type, as in

$ jyt foo.js -t json -i 2

then the js value for origin would have been inferred automatically from the file extension. The same applies to the target option.

Example: JSON ⇒ JS

The command

$ jyt foo.json -i 2

results in foo.js:

module.exports = {
  foo: "bar"
}

Example: JS ⇒ YAML

The command

$ jyt foo.js -t yaml

results in foo.yaml:

foo: bar

Example: Transformation with Different Destination

Simply specify the output file with a different file name:

$ jyt foo.json results/foobar.yaml

Example: Transformation with Unsupported Source File Extension

As mentioned, the type is normally inferred from the file extension, but assume the source file has a name that does not imply the type (here, JSON content in a TEXT file). Then you can simply provide the -o option with the correct origin type (of course, the -t option works analogously):

$ jyt foo.txt foobar.yaml -o json

Example: Read from File with Exports Identifier

A JS source may export several objects, and you may want to read exactly one of them. In that case, provide the -m (--imports) option.

In this example we have a foo.js file exporting two objects:

module.exports.foo = {
    foo: 'bar'
};

module.exports.bar = {
    bar: 'foo'
};

but you want to convert only the bar object; then call:

$ jyt foo.js bar.yaml -m bar

to get the YAML result:

bar: foo

NOTE: the same applies on the API level when using a JS object as dest:

var fooBar = {
    foo: 'bar',
    bar: 'foo'
};

var options = {
    src: fooBar,
    dest: {},
    exports: 'bar'
};

//...transform

The transformation will result in this in-memory object:

bar: {
    foo: 'bar',
    bar: 'foo'
}

Of course, as a sub-node of options.dest (i.e. options.dest.bar).

Example: Write Exports Identifier for JS File

Assume you want to generate a JS file whose exports statement gets an identifier. We reuse the YAML file from above:

foo: bar

using this command:

$ jyt foo.yaml foobar.js -x foobar

This generates the following output in the JS file, using foobar as the identifier:

module.exports.foobar = {
    foo: "bar"
}

NOTE: the identifier must be a valid JS identifier according to ECMAScript 6 (see also Valid JavaScript variable names in ECMAScript 6 and Generating a regular expression to match valid JavaScript identifiers).

Example: Force Overwriting

IMPORTANT NOTE: when using this feature, any subsequent execution that uses the same target/file name will overwrite the original source or the previously created target!

By default this feature is not enabled to prevent you from accidentally overwriting your input source or already generated targets.

But let's say you want to overwrite the original source, e.g. to change the indention from 2 to 4 SPACEs; then you can do this as follows:

$ jyt foo.js -f

Of course, leaving out the -f switch creates a new file relative to the origin, named foo(1).js (note the consecutive number). Another run of the command would result in a file called foo(2).js, and so forth.

Origin and Target Type Inference

The examples above have shown that the type is automatically inferred from the file extension. The supported mappings are shown in the following table:

| File Extension | Type |
| --- | --- |
| *.yaml | yaml |
| *.yml | yaml |
| *.js | js |
| *.json | json |

NOTE: if you have files without an extension, or with an extension such as *.txt, you have to specify the origin or target type!

API Usage

While usage on the CLI is a 2-step process:

  1. Read from source file to JS object ⇒ 2. Write out (maybe to other type)

the direct API calls additionally allow the use of a middleware function, in which you can alter the input JS object before it is written; this turns the transformation into a 3-step process:

  1. Read from source ⇒ 2. Alter the JS object ⇒ 3. Write out (maybe to other type)

For more details about this and all the functions provided by this module please refer to the API Reference.

Origin and target type inference also applies at the API level.
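For example, with the options object described in the next section, both types can simply be omitted when the file extensions are unambiguous (a sketch):

// origin is inferred as yaml from the .yaml extension,
// target is inferred as json from the .json extension.
var options = {
    src: 'foo.yaml',
    dest: 'foo.json'
};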

API Properties

The Transformer exposes the following function, which takes the necessary transformation options and an (optional) middleware function:

function transform(options, middleware) {
    //...
}

The options object has to follow this key-value table:

| Option | Type | Description | Default | Required |
| --- | --- | --- | --- | --- |
| origin | string | The origin type. | If not given, the type is inferred from the extension of the source path, else it is yaml. | no |
| target | string | The target type. | If not given, the type is inferred from the extension of the destination path, else it is js. | no |
| src | string \| Readable \| object | The source information object: a string is used as a file path, a Readable stream provides a stringified source, and an object is used as the direct JS source. | - | yes |
| dest | string \| Writable \| object | The destination information object: a string is used as a file path, a Writable stream receives the stringified result, and an object is used as the direct JS object for assignment. | The output file is stored relative to the input file (same base name but with another extension if the type differs). If input and output type are the same, file overwriting is handled depending on the force value! | no |
| indent | number | The indention in files. | 4 | no |
| force | boolean | Force overwriting of existing output files on the write phase. When files are not overwritten, the next transformation with the same output file name gets a consecutive number on the base file name, e.g. foo.yaml becomes foo(1).yaml. | false | no |
| imports | string | Define a module.exports[.identifier] = ... identifier (to read from a JS source only, must be a valid JS identifier!). | undefined | no |
| exports | string | Define a module.exports[.identifier] = ... identifier (for usage in a JS destination only, must be a valid JS identifier!). | undefined | no |

NOTE: an invalid indention setting (indent below 1 or above 8) does not raise an error; a default of 4 SPACEs is applied instead.

Example

var options = {
    origin: 'json',
    target: 'yaml',
    src: 'foo.json',
    dest: './foo/bar.yaml',
    indent: 2
}
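Passing such an options object to the transformer could then look roughly like this (a sketch; the middleware argument is optional and omitted here, and the transformer and logger are set up as shown in the following sections):

var transformer = new Transformer(logger);

transformer.transform(options)
    .then(function (msg) {
        logger.info(msg);
    })
    .catch(function (err) {
        logger.error(err.stack);
    });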

Using Middleware

The middleware is optional, but if provided it must be a function returning a Promise. One of the simplest ones is the identity function

f(data) → data

which could be expressed as Promise function as follows:

var identity = function (data) {
    return Promise.resolve(data);
}

Of course, this would have no effect on the provided JS data. In fact, this function is used internally when no middleware is provided, to ensure a proper promise-based control flow.

OK, let's go back to a more practical example: say we want to alter the value of a JS property before it is written to a file. Assume we have this piece of YAML as input:

foo: old bar

Applying this Promise as middleware

var middleware = function (data) {
    data.foo = 'new bar';
    return Promise.resolve(data);
}

transformer.transform(options, middleware)
    .then(function (msg){
        logger.info(msg);
    })
    .catch(function (err) {
        logger.error(err.stack);
    });

will result in this JSON file:

{
    "foo": "new bar"
}

Of course, real-world scenarios usually have a higher complexity, where one function might be insufficient or at least inconvenient. But this is not a problem at all, because you can create several functions and apply them in the transformation process by gathering them in one middleware function.

Let's assume we have some Promise functions to apply. For simplicity, we simulate these with functions that each add a key-value pair to the given (initially empty) JS object.

NOTE: each of them has to resolve with the data object!

var objectPath = require('object-path');   // dependency used to set deep properties

function key1(data) {
    objectPath.set(data, 'key1', 'value1');
    return Promise.resolve(data);
}

function key2(data) {
    objectPath.set(data, 'key2', 'value2');
    return Promise.resolve(data);
}

function key3(data) {
    objectPath.set(data, 'key3', 'value3');
    return Promise.resolve(data);
}

These can be combined with the aggregation or composition functions of the underlying Promise framework, e.g. the Promise.all([...]) function, which can collect the results of all three functions above and ensure that all of them have completed:

var middleware = function (data) {
    return Promise.all([key1(data), key2(data), key3(data)])
        .then(function(result) {
            return Promise.resolve(result[result.length - 1]);
        });
};

var logger = new Logger();
var transformer = new Transformer(logger);
var options = {
   src: {}
};

return transformer.transform(options, middleware)
    .then(function (msg){
        logger.info(msg);
    })
    .catch(function (err) {
        logger.error(err.stack);
    });

The result can then be retrieved from the returned array; i.e. in the case of Promise.all([...]) you have to pick the last element, which contains the "final product".

Our example above would result in this object

{
    key1: 'value1',
    key2: 'value2',
    key3: 'value3'
}

which is then passed back to the transformation chain. Following this pattern, you can do almost anything with the JS object, like

  • deleting properties
  • changing properties to other types
  • validating the object and throwing an error if it is not valid (see the sketch below)
  • ...

Whatever you do during transformation, just keep it valid ;-)
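For instance, a validating middleware could reject the promise when a required property is missing (a sketch; the check for a foo property is just a made-up example):

var validate = function (data) {
    if (!data.foo) {
        return Promise.reject(new Error('validation failed: missing property foo'));
    }
    return Promise.resolve(data);
};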

Using Custom Logger

You will usually use your own logger in your application. This module supports this by letting you inject your logger as a constructor argument: the Reader, Transformer and Writer constructors accept an (optional) logger object.

If you do not provide one, then the default logger is console.

var logger = ...;

var reader = new Reader(logger);
var transformer = new Transformer(logger);
var writer = new Writer(logger);

At a minimum, the passed logger object has to support the following functions:

function info(msg)
function debug(msg)
function trace(msg)
function error(err|msg)

Anyway, there are some fallbacks if a level is not supported:

  • DEBUG ⇒ INFO
  • TRACE ⇒ DEBUG
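A minimal custom logger satisfying this interface could look as follows (a sketch that simply delegates to console):

var logger = {
    info: function (msg) { console.log(msg); },
    debug: function (msg) { console.log(msg); },
    trace: function (msg) { console.log(msg); },
    error: function (err) { console.error(err); }
};

var transformer = new Transformer(logger);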

API Reference

For more details on how to use the API, please refer to the API Reference wiki which describes the full API and provides more examples.

Contributing

Pull requests and stars are always welcome. Anybody is invited to take part in this project. For bugs and feature requests, please create an issue. See the wiki Contributing section for more details about conventions.

Changelog

v2.0.1

  • [#39] Maintenance release
  • Update dependencies to latest
  • Add travis build for Node.js v7.x and v6.x
  • Docs improved/corrected
  • Add target pretest in scripts section to rm ./test/tmp folder

v2.0.0

  • [#33] Enhance LogWrapper with TRACE level (API)
  • [#32] Introduce input and output on CLI as ARGS instead of OPTIONS (non-backwards compatible change for CLI usage, no impact on API level!)
  • e.g. on CLI type in $ jyt foo.js bar.yaml instead of $ jyt -s foo.js -d bar.yaml
  • [#31] Bugfix: given Object source results in 'yaml' for origin (API)
  • [Cleanup] Update dependencies

v1.0.2

  • [#30] Fix README and externalize API reference to wiki
  • [#29] Fix Promise warning on write process

v1.0.1

Initial public release. This covers the basic implementation and tests. The following features and fixes are part of this release:

  • [#27] Export variable for JS input
  • [#22] Integrate Coveralls
  • [#21] Check and fix CodeClimate issues
  • [#20] Cleanup test dir
  • [#19] File overwrite switch (-f, -force)
  • [#18] Read and Write from other sources than file path
  • [#16] ERROR: Error: Invalid target option found while creating destination file extension
  • [#15] Measure test code coverage and add a badge
  • [#12] Create middleware collection file to use by clients and internally
  • [#11] Check all Promises for optimization possibilities
  • [#10] Integrate project with Travis
  • [#9] Resolve origin and target from file extension whenever possible
  • [#8] Enable JS reading with require(...)
  • [#7] YAML indent is not set to Constants.MIN_YAML_INDENT when indent is set to 0
  • [#6] Finish full JSDoc for all methods
  • [#5] Write unit tests
  • [#4] Export variable for JS output
  • [#3] Promise array as middleware solved with Promise.all([...])