Update README
parsiad committed Aug 30, 2024
1 parent fde6612 commit 4e20a98
Showing 1 changed file with 1 addition and 1 deletion.
README.md: 2 changes (1 addition, 1 deletion)
@@ -6,7 +6,7 @@
<a href="https://github.com/parsiad/micrograd-pp"><img alt="GitHub" src="https://img.shields.io/badge/github-%23121011.svg?logo=github"></a>

Micrograd++ is a minimalistic wrapper around NumPy that adds support for automatic differentiation.
Designed as a learning tool, Micrograd++ offers an accessible entry point for anyone who wants to understand automatic differentiation and backpropagation, or who is looking for a clean, educational implementation.
It also provides various composable classes ("layers") and other tools to simplify building neural networks.

Micrograd++ draws inspiration from Andrej Karpathy's awesome [micrograd](https://github.com/karpathy/micrograd) library, prioritizing simplicity and readability over speed.
Unlike micrograd, which tackles scalar inputs, Micrograd++ supports tensor inputs (specifically, NumPy arrays).
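The README does not include a usage example at this point in the history, but the idea it describes, wrapping NumPy arrays in nodes that record the operations applied to them so gradients can be backpropagated, can be sketched in a few lines. The snippet below is an illustrative sketch only, not Micrograd++'s actual API; the `Tensor` class and its methods are assumptions made for this example.

```python
# Minimal reverse-mode autodiff over NumPy arrays (illustrative only;
# not Micrograd++'s real API).
import numpy as np


class Tensor:
    def __init__(self, data, _parents=(), _backward=lambda: None):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self._parents = _parents      # nodes this node was computed from
        self._backward = _backward    # propagates out.grad to the parents

    def __add__(self, other):
        out = Tensor(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()

        def visit(node):
            if node not in seen:
                seen.add(node)
                for parent in node._parents:
                    visit(parent)
                order.append(node)

        visit(self)
        self.grad = np.ones_like(self.data)
        for node in reversed(order):
            node._backward()


a = Tensor([1.0, 2.0])
b = Tensor([3.0, 4.0])
(a * b + a).backward()
print(a.grad)  # gradient of a*b + a w.r.t. a is b + 1 -> [4. 5.]
```

The library itself presumably extends this same record-and-replay pattern to the richer set of tensor operations and the composable layer classes mentioned above.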
