Mind blown by NotebookLM generating a podcast on LLM Sparsity
1 point | nrjpoddar | 8 months ago | open.spotify.com
Inputs:
- Our GitHub repo (link in comments)
- Research papers: Deja Vu & LLM in a Flash
- A Reddit thread rich in community commentary
The output was pure magic: a clean, cogent podcast that distills all of it (sparsity, memory access, retrieval patterns) into something even non-ML researchers can grasp.