
tractjs

v0.4.0

Published

A library for running ONNX and TensorFlow inference in the browser.

Downloads

2,117

Readme

tractjs


Run ONNX and TensorFlow inference in the browser. A thin wrapper on top of tract.

The Open Neural Network Exchange (ONNX) is a format that many popular libraries like PyTorch, TensorFlow and MXNet can export to, which allows tractjs to run neural networks from (almost) any library.

Website | API Docs

Why tractjs instead of ONNX.js?

There is currently one other usable ONNX runner for the browser, ONNX.js. There are a couple of things tractjs does better:

  • tractjs supports more operators:
    • LSTMs (even bidirectional) are supported while ONNX.js does not support any recurrent networks.
    • Some ONNX-ML models like decision tree classifiers are also supported.
  • tractjs is more convenient to use. It can be built into a single file, tractjs.min.js, which contains the inlined WASM and WebWorker. The WASM backend of ONNX.js cannot be used as easily without a build system.

There are however also some downsides to tractjs. See the FAQ.

Getting started

Without a bundler

<html>
  <head>
    <meta charset="utf-8" />
    <script src="https://unpkg.com/tractjs/dist/tractjs.min.js"></script>
    <script>
      tractjs.load("path/to/your/model").then((model) => {
        model
          .predict([new tractjs.Tensor(new Float32Array([1, 2, 3, 4]), [2, 2])])
          .then((preds) => {
            console.log(preds);
          });
      });
    </script>
  </head>
</html>

With a bundler

npm install tractjs

import * as tractjs from "tractjs";

tractjs.load("path/to/your/model").then((model) => {
  model
    .predict([new tractjs.Tensor(new Float32Array([1, 2, 3, 4]), [2, 2])])
    .then((preds) => {
      console.log(preds);
    });
});
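A Tensor pairs a flat typed array with a shape, so nested JavaScript arrays need to be flattened in row-major order first. A small standalone helper can do both at once (this helper is illustrative only, not part of the tractjs API):

```javascript
// Flatten a nested array into a Float32Array in row-major order and
// derive its shape, producing arguments suitable for a constructor
// of the form `new Tensor(data, shape)`.
function toFlatTensor(nested) {
  const shape = [];
  for (let a = nested; Array.isArray(a); a = a[0]) {
    shape.push(a.length);
  }
  const data = new Float32Array(nested.flat(shape.length - 1));
  return { data, shape };
}

const { data, shape } = toFlatTensor([[1, 2], [3, 4]]);
// data is Float32Array [1, 2, 3, 4], shape is [2, 2]
```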

With Node.js

tractjs now runs in Node.js! Models are fetched from the file system.

const tractjs = require("tractjs");

tractjs.load("./path/to/your/model").then((model) => {
  model
    .predict([new tractjs.Tensor(new Float32Array([1, 2, 3, 4]), [2, 2])])
    .then((preds) => {
      console.log(preds);
    });
});

FAQ

Why does my model with dynamic input dimensions not work?

Currently, tract has some restrictions on dynamic dimensions. If your model has a dynamic dimension, there are multiple solutions:

  1. Declare a dynamic dimension via an input fact. Input facts are a way to provide additional information about input type and shape that cannot be inferred from the model data:

const model = await tractjs.load("path/to/your/model", {
  inputFacts: {
    0: ["float32", [1, "s", 224, 224]],
  },
});

  2. Set fixed input dimensions via input facts. This is of course not ideal, because the model can subsequently only be passed inputs with this exact shape:

const model = await tractjs.load("path/to/your/model", {
  inputFacts: {
    // be careful with image model input facts! here I use ONNX's NCHW format
    // if you are using TF you will probably need to use NHWC (`[1, 224, 224, 3]`).
    0: ["float32", [1, 3, 224, 224]],
  },
});

  3. Turn optimize off. This is the nuclear option: it turns off all optimizations that rely on information about the input shape. It will make sure your model works (even with multiple dynamic dimensions), but it will significantly impact performance:

const model = await tractjs.load("path/to/your/model", {
  optimize: false,
});
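The NCHW/NHWC caveat above also applies to the pixel data itself: browser canvas APIs return interleaved RGBA (effectively HWC), while ONNX image models usually expect planar CHW. A sketch of that conversion (the helper name and the divide-by-255 normalization are my own choices, not something tractjs prescribes):

```javascript
// Convert interleaved RGBA pixel data (as returned by e.g.
// CanvasRenderingContext2D.getImageData) into a planar CHW
// Float32Array, dropping the alpha channel and scaling to [0, 1].
function rgbaToCHW(pixels, width, height) {
  const out = new Float32Array(3 * height * width);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4; // RGBA stride is 4 bytes per pixel
      for (let c = 0; c < 3; c++) {
        out[c * height * width + y * width + x] = pixels[i + c] / 255;
      }
    }
  }
  return out;
}

// A 1x2 image: one red pixel, one green pixel.
const chw = rgbaToCHW(new Uint8Array([255, 0, 0, 255, 0, 255, 0, 255]), 2, 1);
// chw holds the R plane [1, 0], then G [0, 1], then B [0, 0]
```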

What about size?

At the time of writing, tractjs is very large by web standards (6.2MB raw, 2.1MB gzipped). This is due to tract being quite large, and to some overhead from inlining the WASM. But it's not as bad as it sounds: you can load tractjs lazily alongside your demo, where you will likely have to load significantly larger weights anyway.

If you are working on a very size-sensitive application, get in touch and we can work on decreasing the size. There are some more optimizations to be done (e.g. an option not to inline the WASM, and removing panics from the build). There is also ongoing work in tract to decrease size.
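One way to keep that download off the critical path is to defer loading tractjs until it is actually needed, caching the in-flight promise so the library is fetched only once. A minimal sketch (the loader indirection is my own; in an app the loader would typically be something like `() => import("tractjs")`, assuming a bundler with dynamic-import support):

```javascript
// Cache the library promise so repeated calls trigger only one load.
let tractjsPromise = null;

function loadTractjsLazily(loader) {
  if (!tractjsPromise) {
    tractjsPromise = loader();
  }
  return tractjsPromise;
}

// In an app this would be e.g.:
//   const tractjs = await loadTractjsLazily(() => import("tractjs"));
// and could be called from a click handler so the 2.1MB bundle is
// only fetched when the user actually starts the demo.
```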

What about WebGL / WebNN support?

tractjs is a set of bindings to the tract Rust library, which was not originally intended to run on the web. WebGL / WebNN support would be great, but would require many web-specific changes in tract, so it is currently not under consideration.

License

Apache 2.0/MIT

All original work is licensed under either of:

  • Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
  • MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)

at your option.

Contribution

Contributions are very welcome! See CONTRIBUTING.md.