scratchy-grad v2.1.3
A simple, hand-rolled autograd implementation package with an MLP wrapper
ScratchyGrad
(Because it's from scratch)
Description
This project, inspired by Andrej Karpathy's micrograd and his accompanying YouTube series, is an autograd implementation in plain TypeScript with no dependencies.
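The core idea behind a micrograd-style autograd is a scalar value node that records the operations that produced it, then walks the graph in reverse to accumulate gradients via the chain rule. A minimal illustrative sketch in TypeScript (this is not scratchy-grad's actual API, just the general technique):

```typescript
// Minimal reverse-mode autograd over scalars (illustrative sketch,
// not scratchy-grad's real exports).
class Value {
  grad = 0;
  private backwardFn: () => void = () => {};
  constructor(public data: number, private prev: Value[] = []) {}

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += out.grad;  // d(a+b)/da = 1
      other.grad += out.grad; // d(a+b)/db = 1
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out.backwardFn = () => {
      this.grad += other.data * out.grad; // d(a*b)/da = b
      other.grad += this.data * out.grad; // d(a*b)/db = a
    };
    return out;
  }

  backward(): void {
    // Topologically sort the graph, then propagate gradients in reverse.
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value): void => {
      if (visited.has(v)) return;
      visited.add(v);
      for (const p of v.prev) build(p);
      topo.push(v);
    };
    build(this);
    this.grad = 1; // seed d(out)/d(out) = 1
    for (const v of topo.reverse()) v.backwardFn();
  }
}

// c = a*b + a, so dc/da = b + 1 and dc/db = a
const a = new Value(2);
const b = new Value(3);
const c = a.mul(b).add(a);
c.backward(); // c.data = 8, a.grad = 4, b.grad = 2
```

Each operation closes over its inputs and stashes a local backward rule; `backward()` just replays those rules in reverse topological order.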
TODO
- [x] Refactor the model API towards PyTorch's (explicitly passing in layers, an optimizer object, etc.)
- [x] Export autograd + MLP wrapper as a package
- [ ] Determine a fix for stacking more than one layer: currently multiple layers result in exploding gradients
- [x] Implement Batch Normalization
- [ ] Ensure gradients look healthy (no vanishing or exploding values)
- [ ] Add Embedding layer
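The PyTorch-style refactor in the first item means the caller constructs the layers and the optimizer explicitly rather than having the model manage them internally. A rough sketch of that shape; every name here (`SGD`, `zeroGrad`, `step`, the parameter shape) is a hypothetical illustration, not scratchy-grad's actual API:

```typescript
// Hypothetical PyTorch-style optimizer sketch; names and shapes are
// assumptions for illustration, not scratchy-grad's real exports.
interface Param {
  data: number;
  grad: number;
}

class SGD {
  constructor(private params: Param[], private lr = 0.01) {}
  // Reset gradients before each backward pass.
  zeroGrad(): void {
    for (const p of this.params) p.grad = 0;
  }
  // Apply p <- p - lr * grad to every parameter.
  step(): void {
    for (const p of this.params) p.data -= this.lr * p.grad;
  }
}

// One training step then mirrors PyTorch's loop:
const w: Param = { data: 2, grad: 0 };
const opt = new SGD([w], 0.1);
opt.zeroGrad();
w.grad = 4; // stand-in for a gradient produced by backward()
opt.step(); // w.data is now 2 - 0.1*4 = 1.6 (up to float rounding)
```

Keeping the optimizer as a separate object that merely holds references to the parameters is what lets the same model work with different update rules.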