
MyTorch – Minimalist autograd in 450 lines of Python

100 points | iguana2000 | 1 month ago | github.com

19 comments


brandonpelfrey|1 month ago

Having written a slightly more involved version of this myself recently, I think you did a great job of keeping this compact while still keeping it readable. This style of library definitely requires some design.

Supporting higher-order derivatives was also something I considered, but from what I’ve seen it’s basically never needed in production models.
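For anyone curious what "this style of library" boils down to, here's a rough sketch of a scalar reverse-mode autograd in the micrograd mold. This is a generic illustration, not MyTorch's actual API; the `Value` class and its methods are my own naming.

```python
# Generic micrograd-style sketch: a scalar Value supporting
# first-order reverse-mode autodiff. Illustrative only.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # filled in by the op that made us

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule
        # from the output back to the leaves.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x = Value(3.0)
y = x * x + x   # y = x^2 + x
y.backward()
print(x.grad)   # dy/dx = 2x + 1 = 7.0
```

Higher-order derivatives are exactly where this design gets awkward: the backward pass mutates float `.grad` fields instead of building `Value` nodes, so the gradient computation itself isn't differentiable.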

iguana2000|1 month ago

Thanks! I agree about the style.

jerkstate|1 month ago

Karpathy’s micrograd did it first (and better); start here: https://karpathy.ai/zero-to-hero.html

alkh|1 month ago

Imho, we should let people experiment as much as they want. Having more examples is better than having fewer. Still, thanks for the link to the course; it's a top-notch one.

iguana2000|1 month ago

Karpathy's material is excellent! This was a project I made for fun, and hopefully it provides a different perspective on how this can look.

khushiyant|1 month ago

A better README would be the way to go.

CamperBob2|1 month ago

In iguana2000's defense, the code is highly self-documenting.

It arguably reads cleaner than Karpathy's in some respects, as he occasionally gets a little ahead of his students with his '1337 Python skillz.