
fheinsen | 26 days ago

As the approximation error of linear attention approaches the same magnitude as the numerical error of exact quadratic attention, don't the two start becoming comparable in practice?

I ask because in practice, for inference, attention is typically computed with low-precision (4-bit, 8-bit, 16-bit) floats.

Numerical error, in fact, may be a key factor in why quadratic attention, in practice, exhibits context rot as context gets longer, analogous to an RNN:

https://www.anthropic.com/engineering/effective-context-engi...
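To make the concern concrete, here is a toy sketch (the function names and random setup are my own, not from any real system) that compares single-query scaled dot-product attention computed in float16 against a float64 reference as the context length grows. It only illustrates the kind of rounding error being discussed, and says nothing about any particular model:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention, carried out in whatever dtype q/k/v hold.
    scale = 1.0 / float(q.shape[-1]) ** 0.5
    return softmax((q @ k.T) * scale) @ v

def max_abs_error(n, d=64, dtype=np.float16, seed=0):
    # Max |difference| between low-precision and float64 attention outputs
    # for one query attending over a context of length n.
    rng = np.random.default_rng(seed)
    q = rng.standard_normal((1, d))
    k = rng.standard_normal((n, d))
    v = rng.standard_normal((n, d))
    ref = attention(q, k, v)  # float64 reference
    lo = attention(q.astype(dtype), k.astype(dtype), v.astype(dtype))
    return float(np.abs(ref - lo.astype(np.float64)).max())

if __name__ == "__main__":
    for n in (128, 1024, 8192):
        print(n, max_abs_error(n))
```

The query-key dot products and the softmax normalizing sum both accumulate over the full context, so this is where one would look for length-dependent rounding effects.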


cubefox | 25 days ago

That website says nothing about numerical error potentially causing context rot.

fheinsen | 25 days ago

As far as I know, there is no widely accepted explanation for context rot.

Numerical error in long sequences of query-key dot-products may be a key factor.