
gltf-pipeline-fork v4.0.3

Content pipeline tools for optimizing glTF assets.

glTF Pipeline

Content pipeline tools for optimizing glTF assets by Richard Lee and the Cesium team.

Supports common operations including:

  • Converting glTF to glb (and reverse)
  • Saving buffers/textures as embedded or separate files
  • Converting glTF 1.0 models to glTF 2.0
  • Applying Draco mesh compression

gltf-pipeline can be used as a command-line tool or Node.js module.

Getting Started

Install Node.js if you don't already have it, and then:

npm install -g gltf-pipeline

Using gltf-pipeline as a command-line tool:

Converting a glTF to glb

gltf-pipeline -i model.gltf -o model.glb

gltf-pipeline -i model.gltf -b

Converting a glb to glTF

gltf-pipeline -i model.glb -o model.gltf

gltf-pipeline -i model.glb -j

Converting a glTF to Draco glTF

gltf-pipeline -i model.gltf -o modelDraco.gltf -d

Saving separate textures

gltf-pipeline -i model.gltf -t
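
Saving all separate resources

The --separate (-s) flag, documented in the command-line flags table below, writes out separate buffers and shaders as well as textures; for example:

gltf-pipeline -i model.gltf -s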

Using gltf-pipeline as a library:

Converting a glTF to glb:

const gltfPipeline = require("gltf-pipeline");
const fsExtra = require("fs-extra");
const gltfToGlb = gltfPipeline.gltfToGlb;
const gltf = fsExtra.readJsonSync("./input/model.gltf");
const options = { resourceDirectory: "./input/" };
gltfToGlb(gltf, options).then(function (results) {
  fsExtra.writeFileSync("model.glb", results.glb);
});

Converting a glb to embedded glTF

const gltfPipeline = require("gltf-pipeline");
const fsExtra = require("fs-extra");
const glbToGltf = gltfPipeline.glbToGltf;
const glb = fsExtra.readFileSync("model.glb");
glbToGltf(glb).then(function (results) {
  fsExtra.writeJsonSync("model.gltf", results.gltf);
});

Converting a glTF to Draco glTF

const gltfPipeline = require("gltf-pipeline");
const fsExtra = require("fs-extra");
const processGltf = gltfPipeline.processGltf;
const gltf = fsExtra.readJsonSync("model.gltf");
const options = {
  dracoOptions: {
    compressionLevel: 10,
  },
};
processGltf(gltf, options).then(function (results) {
  fsExtra.writeJsonSync("model-draco.gltf", results.gltf);
});

Saving separate textures

const gltfPipeline = require("gltf-pipeline");
const fsExtra = require("fs-extra");
const processGltf = gltfPipeline.processGltf;
const gltf = fsExtra.readJsonSync("model.gltf");
const options = {
  separateTextures: true,
};
processGltf(gltf, options).then(function (results) {
  fsExtra.writeJsonSync("model-separate.gltf", results.gltf);
  // Save separate resources
  const separateResources = results.separateResources;
  for (const relativePath in separateResources) {
    if (separateResources.hasOwnProperty(relativePath)) {
      const resource = separateResources[relativePath];
      fsExtra.writeFileSync(relativePath, resource);
    }
  }
});

Command-Line Flags

| Flag | Description | Required |
| ---- | ----------- | -------- |
| --help, -h | Display help. | No |
| --input, -i | Path to the glTF or glb file. | Yes |
| --output, -o | Output path of the glTF or glb file. Separate resources will be saved to the same directory. | No |
| --binary, -b | Convert the input glTF to glb. | No, default false |
| --json, -j | Convert the input glb to glTF. | No, default false |
| --separate, -s | Write separate buffers, shaders, and textures instead of embedding them in the glTF. | No, default false |
| --separateTextures, -t | Write out separate textures only. | No, default false |
| --stats | Print statistics to console for the output glTF file. | No, default false |
| --keepUnusedElements | Keep unused materials, nodes, and meshes. | No, default false |
| --keepLegacyExtensions | When false, materials with KHR_techniques_webgl, KHR_blend, or KHR_materials_common will be converted to PBR. | No, default false |
| --draco.compressMeshes, -d | Compress the meshes using Draco. Adds the KHR_draco_mesh_compression extension. | No, default false |
| --draco.compressionLevel | Draco compression level [0-10]; 10 is the most compressed, 0 the least. A value of 0 will apply sequential encoding and preserve face order. | No, default 7 |
| --draco.quantizePositionBits | Quantization bits for the position attribute when using Draco compression. | No, default 11 |
| --draco.quantizeNormalBits | Quantization bits for the normal attribute when using Draco compression. | No, default 8 |
| --draco.quantizeTexcoordBits | Quantization bits for the texture coordinate attribute when using Draco compression. | No, default 10 |
| --draco.quantizeColorBits | Quantization bits for the color attribute when using Draco compression. | No, default 8 |
| --draco.quantizeGenericBits | Quantization bits for skinning attributes (joint indices and joint weights) and custom attributes when using Draco compression. | No, default 8 |
| --draco.unifiedQuantization | Quantize positions of all primitives using the same quantization grid. If not set, quantization is applied separately to each primitive. | No, default false |
| --draco.uncompressedFallback | Adds uncompressed fallback versions of the compressed meshes. | No, default false |
| --baseColorTextureNames | Names of uniforms that should be considered to refer to base color textures when updating from the KHR_techniques_webgl extension to PBR materials. | No (the defaults are not specified here) |
| --baseColorFactorNames | Names of uniforms that should be considered to refer to base color factors when updating from the KHR_techniques_webgl extension to PBR materials. | No (the defaults are not specified here) |
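
Several of these flags can be combined in a single invocation. As an illustrative sketch (flag values chosen for the example), the following writes a Draco-compressed glb and prints statistics:

gltf-pipeline -i model.gltf -o model.glb -d --draco.compressionLevel 10 --stats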

Build Instructions

Run the tests:

npm run test

To run ESLint on the entire codebase, run:

npm run eslint

To run ESLint automatically when a file is saved, run the following and leave it open in a console window:

npm run eslint-watch

Building for CesiumJS integration

Some functionality of gltf-pipeline is used by CesiumJS as a third-party library. The necessary files can be generated using:

npm run build-cesium

This will output a portion of the gltf-pipeline code into the dist/cesium folder for use with CesiumJS in the browser. Copy the files into Source/Scene/GltfPipeline/ in the cesium repository and submit a pull request.
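
As a sketch, assuming the cesium repository is checked out alongside gltf-pipeline (adjust the paths for your setup), the copy step might look like:

cp -r dist/cesium/* ../cesium/Source/Scene/GltfPipeline/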

Running Test Coverage

Coverage uses nyc. Run:

npm run coverage

For complete coverage details, open coverage/lcov-report/index.html.

The tests and coverage cover the Node.js module; they do not cover the command-line interface, which is tiny.

Generating Documentation

To generate the documentation:

npm run jsdoc

The documentation will be placed in the doc folder.

Contributions

Pull requests are appreciated! Please use the same Contributor License Agreement (CLA) and Coding Guide used for Cesium.