microgradts v2.5.0

A tiny autograd engine based on Andrej Karpathy's micrograd in Python


MicrogradTS


Video Demo

Description

MicrogradTS is a tiny autograd engine based on Andrej Karpathy's micrograd in Python. Autograd is a technique for automatically computing the gradients of a function with respect to its inputs, which is useful for optimization and machine learning. MicrogradTS implements backpropagation over a dynamically built directed acyclic graph (DAG) that operates over scalar values. This means you can define complex functions out of basic arithmetic operations and then get the derivatives of those functions with respect to any variable.
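
To make the DAG idea concrete, here is a minimal sketch of a scalar autograd node in TypeScript. This illustrates the technique only; it is not the package's actual Value class, and the field and method names here are assumptions.

// A minimal, illustrative scalar autograd node (not microgradts internals).
class ScalarValue {
  grad = 0;
  private _backward: () => void = () => {};

  constructor(public data: number, private prev: ScalarValue[] = []) {}

  static add(a: ScalarValue, b: ScalarValue): ScalarValue {
    const out = new ScalarValue(a.data + b.data, [a, b]);
    // Local rule for +: the output gradient flows through unchanged to both inputs.
    out._backward = () => { a.grad += out.grad; b.grad += out.grad; };
    return out;
  }

  static mul(a: ScalarValue, b: ScalarValue): ScalarValue {
    const out = new ScalarValue(a.data * b.data, [a, b]);
    // Local rule for *: each input's gradient is the output gradient scaled by the other input.
    out._backward = () => { a.grad += b.data * out.grad; b.grad += a.data * out.grad; };
    return out;
  }

  // Backpropagation: topologically sort the DAG, then run each node's local rule in reverse.
  backward(): void {
    const topo: ScalarValue[] = [];
    const visited = new Set<ScalarValue>();
    const build = (v: ScalarValue): void => {
      if (visited.has(v)) return;
      visited.add(v);
      v.prev.forEach(build);
      topo.push(v);
    };
    build(this);
    this.grad = 1;
    for (const v of topo.reverse()) v._backward();
  }
}

Each operation records its inputs and a local backward rule; backward() then visits the graph in reverse topological order, accumulating each node's contribution to the gradient via the chain rule.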

Features

Some of the features of MicrogradTS are:

  • Supports basic arithmetic operations such as addition, subtraction, multiplication, division, and power
  • Supports unary functions such as exp
  • Supports loss functions such as mean squared error (MSE); see the sketch after this list
  • Supports neural network models such as multilayer perceptron (MLP)
  • Supports automatic differentiation of any function built from the supported operations
  • Supports visualization of the computation graph with Graphviz (TODO)
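
As referenced above, here is a rough sketch of how an MSE loss can be built from the package's scalar operations. This is illustrative only, not necessarily how getLoss is implemented, and the sub function is an assumption, by analogy with add and mul.

import { Value, add, sub, mul, pow } from 'microgradts'; // 'sub' is assumed to be exported

// Illustrative MSE over scalar Values: the mean of (y - yhat)^2.
function mseSketch(ys: Value[], preds: Value[]): Value {
  let total = new Value(0);
  for (let i = 0; i < ys.length; i++) {
    total = add(total, pow(sub(ys[i], preds[i]), new Value(2)));
  }
  return mul(total, new Value(1 / ys.length));
}

Because the loss is itself a Value in the graph, calling backward() on it propagates gradients all the way back to the model parameters.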

Installation

You can install MicrogradTS using npm or yarn:

npm install microgradts
yarn add microgradts

You will also need to install Graphviz if you want to visualize the computation graph (TODO).

Example usage

Below is a slightly contrived example showing several of the supported operations:

import { Value, add, mul, pow, div } from 'microgradts';

const a = new Value(4.0);
const b = new Value(-2.0);
const c = add(a, b); // c = a + b (not used below; just another node in the graph)
const d = add(mul(a, b), pow(b, new Value(3))); // d = a * b + b^3
const e = new Value(3.0);
const f = div(d, e); // f = (a * b + b^3) / e
f.backward(); // compute the gradient of f with respect to every Value in the graph
console.log(f.data); // value of f: -5.333333333333333
console.log(a.grad); // df/da = b / e = -0.6666666666666666
console.log(b.grad); // df/db = (a + 3 * b^2) / e = 5.333333333333333
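
Note that operations are free functions (add(a, b)) rather than overloaded operators; TypeScript has no operator overloading, so a functional API is the natural way to build up the expression graph.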

And here is an example of the neural net API:

import { MLP, Value, toValues, getLoss } from 'microgradts'; // MLP, toValues, and getLoss are assumed to be exported like Value above

const n = new MLP(3, [4, 4, 1]); // multilayer perceptron with 3 inputs, two hidden layers of 4 units each, and 1 output

const xs = [
  [2.0, 3.0, -1.0],
  [3.0, -1.0, 0.5],
  [0.5, 1.0, 1.0],
  [1.0, 1.0, -1.0],
].map((x) => toValues(x)); // wrap each row of numbers in Value objects
const ys = toValues([1.0, -1.0, -1.0, 1.0]); // target outputs, also as Values

for (let i = 0; i < 200; i++) { // train for 200 iterations
  const ypred = xs.map((x) => n.run(x)); // forward pass: one prediction per input
  const loss = getLoss(ys, ypred as Value[]); // mean squared error between predictions and targets

  for (const p of n.parameters()) {
    p.grad = 0; // zero the gradients before backpropagating
  }
  loss.backward(); // backward pass: gradient of the loss with respect to every parameter

  for (const p of n.parameters()) {
    p.data -= 0.01 * p.grad; // gradient descent step with learning rate 0.01
  }

  console.log(i, loss.data); // the loss should decrease as training progresses
}
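
After the loop finishes, you can compare the trained model's predictions to the targets. A small hedged check, assuming run returns a single Value per input (as the cast above suggests):

// Inspect the trained model's outputs against the targets.
const final = xs.map((x) => n.run(x) as Value);
final.forEach((pred, i) => console.log(`pred: ${pred.data}, target: ${ys[i].data}`));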

Todo

  • Implement visualization with Graphviz: This feature will allow you to generate a graphical representation of the computation graph using the Graphviz library. This will help you debug and understand your functions and models better.

License

MIT: This project is licensed under the MIT License, which means that you can use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, as long as you include the original license notice and disclaimer in any copy. See the LICENSE file for more details.