Are the gradient visualizations not doing it for you?
The PyTorch3D section was genuinely useful for me. I've been doing 2D ML work for a while but hadn't explored 3D deep learning — didn't even know PyTorch3D existed until this tutorial.
This does an honestly good job of walking through the beginnings. I would still suggest understanding/decomposing a decision tree and going through the details and trade-offs one makes in preparing the tree, like binary splits or binning for continuous data, what reducing entropy means, etc. Maybe even start with the pros/cons of parametric versus nonparametric modeling. You really get to see how probability and statistics are applied in the formulas that eventually get thrown into a dot function in Python.
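To make "reducing entropy" concrete, here is a minimal pure-Python sketch (my own toy example, not code from the tutorial) of the information gain a decision tree would compute for one binary split on a continuous feature:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, left, right):
    """Entropy reduction from splitting `labels` into `left` and `right`."""
    n = len(labels)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - weighted

# Toy data: (feature_value, class_label) pairs; try a threshold split at x < 5.
data = [(1, "a"), (2, "a"), (3, "a"), (6, "b"), (7, "b"), (8, "a")]
labels = [y for _, y in data]
left = [y for x, y in data if x < 5]
right = [y for x, y in data if x >= 5]

print(round(entropy(labels), 3))                       # 0.918
print(round(information_gain(labels, left, right), 3)) # 0.459
```

A tree builder would evaluate many candidate thresholds like this and keep the split with the highest gain, which is exactly where the probability and statistics show up before anything gets vectorized.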
This was quite accessible. If I had to pick one point, I wish there was more "handholding" from gradient to gradient descent, i.e. in the style of the math-focused introduction of the function with one parameter, two parameters, etc. It felt like a bit of a sudden jump from the math to the code. I think the gentle introduction to the math is very valuable here.
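As a bridge from gradient to gradient descent in that same one-parameter style, here is a minimal sketch (my own example, not from the tutorial): the gradient points uphill, so repeatedly stepping against it walks toward the minimum.

```python
# One-parameter example: minimize f(w) = (w - 3)^2.
# Its derivative df/dw = 2 * (w - 3) points uphill, so we step against it.

def f(w):
    return (w - 3) ** 2

def grad_f(w):
    return 2 * (w - 3)

w = 0.0    # starting guess
lr = 0.1   # learning rate: how far to step per iteration
for step in range(100):
    w -= lr * grad_f(w)   # the gradient-descent update

print(round(w, 4))  # converges toward the minimum at w = 3
```

With two parameters the update is identical, just applied to each partial derivative, which is the step where the math version and the code version meet.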
Are there other similar tutorials going into the fundamentals of model architectures? Something like https://poloclub.github.io/cnn-explainer/ for example
Interesting article. It would be really useful if you added a full article title to the page metadata, so it would get bookmarked with a title. I assume one does not require a GPU to try out the simple examples provided?
Very nice, thanks! It’s great to be able to play with viz!
Thank you so much. Really appreciate the thoughtful feedback!
simonw|13 days ago
https://0byte.io/articles/neuron.html
https://0byte.io/articles/helloml.html
He also publishes to YouTube where he has clear explanations and high production values that deserve more views.
https://www.youtube.com/watch?v=dES5Cen0q-Y (part 2 https://www.youtube.com/watch?v=-HhE-8JChHA) is the video to accompany https://0byte.io/articles/helloml.html
pjmlp|12 days ago
Yet, 2D and 3D graphics feel relatively natural, maybe because at least I can visualize that kind of math.
KeplerBoy|12 days ago
Of course it kind of breaks down in higher dimensions, where the gradient can no longer be visualized as an arrow in 2D or 3D space. Not all concepts transfer as easily as one would hope, but some do.
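Even though the arrow picture stops working past 3D, the definition carries over unchanged: the gradient is just the vector of partial derivatives, one per dimension. A rough pure-Python sketch (a toy finite-difference estimate, my own example rather than anything from the article) shows the same code handling 2 dimensions or 10:

```python
def numerical_gradient(f, x, h=1e-6):
    """Central-difference estimate of the gradient of f at point x, any dimension."""
    grad = []
    for i in range(len(x)):
        x_plus = list(x);  x_plus[i] += h   # nudge coordinate i up
        x_minus = list(x); x_minus[i] -= h  # nudge coordinate i down
        grad.append((f(x_plus) - f(x_minus)) / (2 * h))
    return grad

# f(x) = sum of squares; its true gradient is 2*x in every dimension.
f = lambda x: sum(v * v for v in x)
print(numerical_gradient(f, [1.0, 2.0]))      # ≈ [2.0, 4.0]
print(numerical_gradient(f, [1.0] * 10)[:3])  # ≈ [2.0, 2.0, 2.0]
```

The arrow is gone, but "the direction of steepest increase, one component per parameter" survives intact.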
tl2do|13 days ago
What worked well was the progressive complexity. Starting with basic mesh rendering before jumping into differentiable rendering made the concepts click. The voxel-to-mesh conversion examples were particularly clear.
If anything, I'd love to see a follow-up covering point cloud handling, since that seems to be a major use case based on the docs I'm now digging through.
Thanks for writing this — triggered a weekend deep-dive I probably wouldn't have started otherwise.
slashtom|12 days ago
There is a lot of content on PyTorch, which is great and makes a ton of sense since it's used so heavily, but where the industry really needs a ton of help/support is in the fundamentals. Nonetheless, great contribution!
trcf23|13 days ago
For a deeper tutorial, I highly recommend the PyTorch for Deep Learning Professional Certificate on deeplearning.ai — probably one of the best MOOCs I’ve seen so far
https://www.deeplearning.ai/courses/pytorch-for-deep-learnin...
butanyways|12 days ago
Free book: https://zekcrates.quarto.pub/deep-learning-library/
ML by hand: https://github.com/workofart/ml-by-hand
Micrograd: https://github.com/karpathy/micrograd
0bytematt|13 days ago
I've watched many intros. Somehow they always end with 90%+ accuracy and that was just not my experience while learning on datasets I picked myself. I remember spending hours tuning different parameters and not quite understanding why I was getting way worse accuracy. I showed this intentionally, and I'm glad you commented on this!
The XGBoost comparison is a great idea.