
@piman51277/backprop v1.0.0

backprop

An optimized implementation of the backpropagation algorithm in TypeScript

Example Usage

import { TrainingData, Net } from "@piman51277/backprop";

//example dataset
const XOR: TrainingData = [
  [[0, 0], [0]],
  [[0, 1], [1]],
  [[1, 0], [1]],
  [[1, 1], [0]],
];

//create a net with random weights/biases (2 inputs, 1 output, one hidden layer of 3 nodes)
const net = Net.create(2, 1, 3, 1);

//train the net
const trainConfig = {
  gamma: 0.1, //learning rate of weights
  gamma_b: 0.1, //learning rate of biases
  momentum: 0.1, //momentum of weights/biases
  batchSize: 1, //batch size
};

console.log(net.errorDataset(XOR)); //print the average error of the net
net.train(XOR, 100000, trainConfig); //train the net for 100000 epochs
console.log(net.errorDataset(XOR)); //print the new average error of the net

Documentation

Table of Contents

  • Net
  • Net.prototype
  • Misc

Net

Net(options)

Constructor for the Net class.

Arguments

  • options {NetOptions}: The weights and biases of the net.
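
For illustration, a minimal sketch of constructing a Net directly from a NetOptions object (the import path and all values here are assumptions, and the weight layout follows the Weight Configuration note below):

//a sketch only: the values are made up and the import path is assumed
import { Net } from "@piman51277/backprop";

const net = new Net({
  dimensions: [2, 2, 1], //2 inputs, one hidden layer of 2 nodes, 1 output
  weights: [
    [0.4, 0.8, 0.52, 0.3], //layer 0, flattened and grouped by origin node
    [0.1, 0.63], //layer 1
  ],
  biases: [
    [0.1, 0.2], //hidden layer biases
    [0.0], //output layer bias
  ],
});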

Net.create(inputs, outputs, hidden, hiddenLayers)

Creates a new net with the given dimensions.

Arguments

  • inputs {number}: The size of the input layer.
  • outputs {number}: The size of the output layer.
  • hidden {number}: The size of each hidden layer.
  • hiddenLayers {number}: The number of hidden layers.

Returns

{Net}: An instance of the Net class, with random weights and biases.
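
Note that the argument order puts outputs before the hidden-layer sizes, which is easy to misread; the net from the usage example above is:

const net = Net.create(2, 1, 3, 1); //2 inputs, 1 output, hidden layers of size 3, 1 hidden layer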

Net.mergeGradients(gradients)

Average several gradients into one.

Arguments

  • gradients {Gradient[]}: The gradients to average.

Returns

{Gradient}: The averaged gradient.
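
A short sketch of how this pairs with gradient() and apply() (documented below) to emulate a mini-batch update:

//average the gradients from two training cases and apply them in one step
const g1 = net.gradient([0, 0], [0]);
const g2 = net.gradient([0, 1], [1]);
net.apply(Net.mergeGradients([g1, g2]), 0.1, 0.1);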

Net.prototype

Net.prototype.eval(input)

Run a forward pass of the net using the given input.

Arguments

  • input {number[]}: The input to the net.

Returns

{number[][]}: The state of the net after the forward pass.
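
For example, assuming the last entry of the returned state corresponds to the output layer:

const state = net.eval([0, 1]);
const output = state[state.length - 1]; //output layer activations (assumption)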

Net.prototype.gradient(input, expected)

Calculate the required gradient for a given test case.

Arguments

  • input {number[]}: The input to the net.
  • expected {number[]}: The expected output of the net.

Returns

{Gradient}: The gradient of the net.

Net.prototype.apply(gradient [, gamma=0.1, gamma_b=0.1])

Apply a gradient to the net.

Arguments

  • gradient {Gradient}: The gradient to apply.
  • gamma {number}: [Optional] The learning rate of the weights (default 0.1).
  • gamma_b {number}: [Optional] The learning rate of the biases (default 0.1).
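
Together with gradient(), this allows a manual training step; a rough sketch of one pass over the XOR dataset from the usage example (momentum and batching omitted):

for (const [input, expected] of XOR) {
  const g = net.gradient(input, expected);
  net.apply(g, 0.1, 0.1); //gamma, gamma_b
}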

Net.prototype.train(dataset, epochs, options)

Train the net using the given dataset.

Arguments

  • dataset {TrainingData}: The dataset to train the net on.
  • epochs {number}: The number of epochs to train the net for.
  • options {TrainingOptions}: The options for training.

Net.prototype.error(input, expected)

Calculate the error of the net for a given test case using mean squared error (MSE).

Arguments

  • input {number[]}: The input to the net.
  • expected {number[]}: The expected output of the net.

Returns

{number}: The error of the net.

Net.prototype.errorDataset(dataset)

Calculate the average error of the net for a given dataset.

Arguments

  • dataset {TrainingData}: The dataset to average the error over.

Returns

{number}: The average error of the net.
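
For instance, comparing a single case's error with the dataset average:

const caseError = net.error([1, 0], [1]); //MSE for one test case
const avgError = net.errorDataset(XOR); //mean error over the whole dataset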

Net.prototype.debug()

Prints the state of the net to the console.

Format

Weights:
<Layer0 Weight0>, <Layer0 Weight1>, ..., <Layer0 WeightN>
<Layer1 Weight0>, <Layer1 Weight1>, ..., <Layer1 WeightN>
...
<LayerM Weight0>, <LayerM Weight1>, ..., <LayerM WeightN>
Biases:
<Layer0 Bias0>, <Layer0 Bias1>, ..., <Layer0 BiasN>
<Layer1 Bias0>, <Layer1 Bias1>, ..., <Layer1 BiasN>
...
<LayerM Bias0>, <LayerM Bias1>, ..., <LayerM BiasN>
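
Usage is simply:

net.debug(); //prints every layer's weights and biases in the format above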

Misc

NetOptions

{
  weights: number[][];
  biases: number[][];
  dimensions: number[];
};

An object describing the weights and biases of a net.

Gradient

[
    number[][], //bias gradient
    number[][] //weight gradient
]

A two-element array describing the gradient of a net: the bias gradient followed by the weight gradient.

TrainingData

[
    [number[], number[]], //input, expected output
    [number[], number[]],
    [number[], number[]],
    ...
]

An array of training cases, each pairing an input array with its expected output array.

TrainingOptions

{
  gamma?: number; //learning rate of the weights
  gamma_b?: number; //learning rate of the biases
  momentum?: number; //momentum of the weights/biases
  batchSize?: number; //the size of the batch
}
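
Since every field is optional, passing a partial object should work; this sketch assumes omitted fields fall back to the defaults noted for apply():

net.train(XOR, 10000, { gamma: 0.05, batchSize: 4 }); //gamma_b and momentum left to defaults (assumption)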

Notes

Weight Configuration

Within each layer, weights are stored as a flat (1D) array, grouped by origin node. This lets the weights be read sequentially, which improves the performance of the network.

Example

[Image: a diagram of a simple net]

For this net, the weights would be stored as follows:

[
  [0.4, 0.8, 0.52, 0.3], //Layer 0
  [0.1, 0.63, 0.48, 0.2], //Layer 1
];
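
Under that layout, the weight from origin node i to target node j can presumably be looked up as follows (a hypothetical helper, not part of the package):

//hypothetical helper, assuming weights are grouped by origin node as described
function weightAt(layerWeights: number[], origin: number, target: number, targetsInLayer: number): number {
  return layerWeights[origin * targetsInLayer + target];
}

const w = weightAt([0.4, 0.8, 0.52, 0.3], 1, 0, 2); //0.52: layer 0, node 1 -> node 0 (assumed layout)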