An educational NumPy-based autodiff engine and neural network library. See my blog post for a breakdown of the key logic.
```
pip install microgradpp
```
The `mgp.np` module contains the primary NumPy engine with the code from the blog post, while `mgp.vanilla` contains a baseline scalar engine similar to micrograd.
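To make the scalar-engine idea concrete, here is a minimal sketch of how a micrograd-style engine works. This is illustrative only, not the actual `mgp.vanilla` code: each value records its parents and a local derivative per parent, and `backward()` applies the chain rule in reverse topological order.

```python
class Value:
    """Minimal micrograd-style scalar autodiff node (illustrative sketch)."""

    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._local_grads = local_grads  # d(out)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically sort the graph, then propagate gradients backward.
        order, seen = [], set()

        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)

        build(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += v.grad * g


# d(x*y + x)/dx = y + 1 = 4,  d(x*y + x)/dy = x = 2
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

The NumPy engine follows the same pattern, except nodes hold arrays and local gradient rules must account for broadcasting.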
- See `scripts/train_mnist.py`, where we use `mgp.np` to train a convolutional neural network for MNIST image classification. With the stated hyperparameters, it achieves an accuracy of 0.97+.
- `scripts/train_xor.py` runs MLPs built on both the vanilla and NumPy-based engines on the XOR task. The NumPy-based engine trains several times faster.
In general, I tried to make the API as PyTorch-like as possible while keeping it simple.
