top | item 41974820


kbr | 1 year ago

NNs have complex non-convex loss functions that don't admit a closed-form solution. Even for small models, training can be shown to be an NP-complete problem. In fact, even for linear regression (least squares), which does have a closed-form solution, it can be computationally cheaper to run gradient descent, since finding the closed-form solution requires you to compute and invert a large matrix (X^T X).
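A minimal NumPy sketch of the comparison above, on hypothetical toy data: the normal-equations route forms and solves the d×d system X^T X w = X^T y (O(n d^2 + d^3)), while plain gradient descent only needs matrix-vector products (O(n d) per step). Problem sizes, learning rate, and step count here are illustrative assumptions, not from the comment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical least-squares problem: find w minimizing ||Xw - y||^2 / n.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

# Closed form (normal equations): requires building and solving the
# d x d matrix X^T X -- expensive when d is large.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent: each step is just matrix-vector products,
# no matrix factorization or inversion needed.
w = np.zeros(d)
lr = 0.01  # assumed step size for this toy problem
for _ in range(2000):
    grad = 2.0 * X.T @ (X @ w - y) / n
    w -= lr * grad

# On this small, well-conditioned problem both routes agree closely.
print(np.linalg.norm(w - w_closed))
```

For large d the trade-off flips the other way than intuition might suggest: the "exact" formula becomes the expensive option, which is the commenter's point.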


rachofsunshine | 1 year ago

Which in some sense is intuitive: any closed form that can model general computation to any significant degree should be hard to find. If it weren't, you could encode your NP-complete problem into it, solve it efficiently in closed form, and collect your Fields Medal for proving P = NP.

quantadev | 1 year ago

Intuition is often wrong, even for high IQ people, like your average HN user. lol.

For a long time it was intuitive that you cannot find the area under arbitrary functions, but then calculus was invented, showing us a new "trick" that was previously unfathomable and indistinguishable from magic.

I'm just not sure mankind's understanding of mathematics is out of new "tricks" to be learned. I think there are types of algorithms today that look like they require N iterations to get X precision, when in reality, for some algorithms, we might be able to divide N by some factor and still end up with X precision.

quantadev | 1 year ago

Thanks for that great clarification. I had seen all those words before, but just not in that particular order. haha.

Maybe our only hope of doing LLM training runs in a tiny amount of time will be from Quantum Computing or even Photonic (wave-based) Computing.