micrograd

Implementation of Andrej Karpathy's micrograd

micrograd is a minimalistic autograd engine that implements backpropagation (reverse-mode autodiff) over a dynamically built directed acyclic graph (DAG), plus a small neural network library on top of it with a PyTorch-like API. Despite its simplicity (about 100 lines of code for the engine and about 50 for the neural network library), it is capable of building entire deep neural nets for binary classification, as demonstrated in the demo notebook. The project is particularly useful for educational purposes, providing insight into the inner workings of neural networks and backpropagation.
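To make the idea concrete, here is a minimal sketch in the spirit of micrograd's `Value` object (a simplified reimplementation, not the repository's exact code): each scalar records the operation that produced it, and `backward()` walks the DAG in reverse topological order applying the chain rule.

```python
import math

class Value:
    """A scalar with autograd support, micrograd-style (simplified sketch)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None  # how to propagate grad to children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order over the dynamically built DAG,
        # then apply the chain rule one node at a time
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a = Value(2.0)
b = Value(-3.0)
c = a * b + a      # c = a*b + a = -4.0
c.backward()
print(a.grad)      # dc/da = b + 1 = -2.0
print(b.grad)      # dc/db = a = 2.0
```

Gradients accumulate with `+=` so that a value used in several places (like `a` above, which feeds both the multiply and the add) receives contributions from every path through the graph.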

Modeling one neuron:

[image: neuron_model]

Neural network architecture:

[image: neural_net2]
