diff --git a/README.md b/README.md
index 29b5b37..7e7cef5 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@ GitHub
 Micrograd++ is a minimalistic wrapper around NumPy which adds support for automatic differentiation.
-Designed as a learning tool, Micrograd++ provides an accessible entry point for those interested in understanding automatic differentiation and backpropagation or seeking a clean, educational resource.
+It also provides various composable classes ("layers") and other tools to simplify building neural networks.
 Micrograd++ draws inspiration from Andrej Karpathy's awesome [micrograd](https://github.com/karpathy/micrograd) library, prioritizing simplicity and readability over speed. Unlike micrograd, which tackles scalar inputs, Micrograd++ supports tensor inputs (specifically, NumPy arrays).