
willx86 | 29 days ago

(All of this math is approximate.) https://stackoverflow.com/questions/62491720/in-latency-valu...

Bear in mind this is:
- 5 years old
- CPU only

If you did this on a gaming laptop, the data would all live on an SSD, which is orders of magnitude slower to access than a GPU's on-board memory.

Also, AI training is dominated by floating-point math, measured in FLOPS (floating-point operations per second).

My laptop's CPU (a Ryzen 7840U) peaks at about 4.1 TFLOPS; an H200 GPU is rated at 3,958 TFLOPS (at low precision).

OpenAI's GPT-5 was reportedly trained on ~100-200k NVIDIA GPUs.

So, on a laptop:
- accessing data is ~1000x slower
- the math is ~1000x slower
- and the cluster has up to 200,000x more GPUs than a laptop
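The per-GPU and whole-cluster gap can be sketched with a quick back-of-envelope calculation. The TFLOPS figures and GPU count below are the approximate ones quoted above, not measured values:

```python
# Back-of-envelope: how much faster is the training cluster than one laptop CPU?
# All figures are the approximate ones quoted in the comment above.
laptop_tflops = 4.1      # Ryzen 7840U, rough peak
h200_tflops = 3958       # NVIDIA H200, low-precision peak
num_gpus = 200_000       # upper end of the reported cluster size

per_gpu_speedup = h200_tflops / laptop_tflops   # ~1000x per GPU
cluster_speedup = per_gpu_speedup * num_gpus    # ~2e8x overall, ignoring scaling losses

print(f"per GPU: {per_gpu_speedup:,.0f}x, cluster: {cluster_speedup:,.0f}x")
```

This ignores interconnect overhead and imperfect scaling across 200k GPUs, so treat it as an upper bound on the raw-compute ratio.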

Now remember that each piece of the data is reused many times during training, so you start looking at the GPUs being 1,000x (data access) x 1,000x (math) x 200,000x (count) faster, compounded further by that repeated data access.

So I don't think there's anything fundamentally impossible about training Claude Opus on your laptop; it's more that the time required would be so astronomically long that it's practically out of the question.
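To put "astronomically long" in numbers: assuming, hypothetically, that the cluster ran for ~90 days (an illustrative figure, not a reported one), the same work on a single laptop works out to tens of millions of years:

```python
# Hypothetical wall-clock estimate: cluster training time scaled up to one laptop.
# The 90-day duration is an assumption for illustration; the other figures are
# the approximate ones quoted in the comment above.
laptop_tflops = 4.1
h200_tflops = 3958
num_gpus = 200_000
cluster_days = 90  # assumed, not reported

laptop_days = cluster_days * (h200_tflops / laptop_tflops) * num_gpus
laptop_years = laptop_days / 365
print(f"~{laptop_years:,.0f} years on one laptop")
```

Even if the real cluster ran for far less time, the result stays in the millions of years, which is the whole point.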
