top | item 44149255



I did a chat with Gemini about the paper, and tldr is... * They introduce a loop at the beginning between Q, K, and V vectors (theoretically representing "question", "clues" and "hypothesis" of thinking) * This loop contains a non linearity (ReLU) * The loop is used to "pre select" relevant info * They then feed that into a light weight attention mechanism.

They claim order-of-magnitude faster learning and robustness across domains. There's enough detail to probably do your own PyTorch implementation, though they haven't released code. The paper has been accepted into AMLDS 2025, so it's peer reviewed.
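Since no code has been released, here's a rough sketch of how I read that description: Q, K, and V interact through a small ReLU loop, the surviving activations act as a pre-selection gate, and ordinary scaled-dot-product attention runs on what's left. Every name, the loop update rule, and the gating heuristic below are my assumptions, not the paper's.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def preselect_attention(x, Wq, Wk, Wv, Wloop, n_iters=2):
    """Hypothetical sketch of the mechanism described above.

    `Wloop` and the update equations are invented for illustration;
    the actual paper's loop is almost certainly different.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    for _ in range(n_iters):
        # ReLU non-linearity inside the loop: the "question" (Q) is
        # refined against the current "clues" (K) and "hypothesis" (V).
        q = relu((q + k + v) @ Wloop)
        k = relu(k + q @ Wloop)  # assumed symmetric refinement of the clues
    # Pre-selection: keep only tokens whose refined keys survived the ReLU.
    gate = (k.sum(-1, keepdims=True) > 0).astype(x.dtype)   # (n, 1)
    scores = (q @ k.T) / np.sqrt(k.shape[-1])               # (n, n)
    scores = np.where(gate.T > 0, scores, -1e9)             # mask dropped keys
    # Lightweight attention over the pre-selected tokens only.
    return softmax(scores) @ (v * gate)
```

The point of the sketch is just the dataflow: the loop's non-linearity zeroes out some tokens before the quadratic attention step, which is where a speedup could plausibly come from.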

At first blush this sounds really exciting, and if the results hold up and are replicated, it could be huge.
