The title of this paper references an earlier paper, "Attention Is All You Need" [0][1]. That seminal work introduced the transformer architecture that underlies almost all modern LLMs and, despite being published only in 2017, is almost certainly the most cited paper in AI.
[0] https://arxiv.org/abs/1706.03762 [1] https://en.wikipedia.org/wiki/Attention_Is_All_You_Need
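For readers who haven't seen the paper: its core building block is scaled dot-product attention. A minimal NumPy sketch of that one equation (not the paper's full multi-head implementation; shapes and names here are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  -- Eq. 1 of the paper
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the keys for each query
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output row is a weighted average of the value vectors
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))   # 3 queries, dimension 4
K = rng.standard_normal((5, 4))   # 5 keys, same dimension as queries
V = rng.standard_normal((5, 2))   # 5 values, dimension 2
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one 2-dimensional output vector per query
```

Real transformers run many of these in parallel (multi-head) over learned projections of the same input, but the mixing step is exactly this weighted average.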
kristopolous|9 months ago
People seem to love going to the reference graveyard, digging up tired, dead ones, and dragging them around town hoping everyone thinks they're clever.
Also, this was from three months ago.