top | item 47137498

152334H | 6 days ago

holy crap, this is so good. How did it get buried?

nee1r | 5 days ago

real

sheepscreek | 4 days ago

Are you guys affiliated with Meta’s ex-CTO in any way? I remember he famously implied that LLMs are overhyped. The demos are very impressive. Does this use an attention-based mechanism too? Just trying to understand (as a layman) how these models handle context, and whether long contexts lead to weaker results. That could be catastrophic in the real world!
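For readers wondering what an "attention-based mechanism" refers to here: a transformer scores each token in the context against the current query and takes a score-weighted average of the value vectors. This is a minimal single-query sketch for intuition only, not the code of any particular model mentioned in the thread:

```python
import math

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector."""
    d = len(query)
    # Score the query against every key in the context.
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    # Softmax the scores into weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Output is the weight-averaged combination of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]
```

Longer contexts mean the softmax spreads probability mass over more keys, which is one informal intuition people give for why quality can degrade at long context lengths, though in practice the causes are more subtle (positional encodings, training data distribution, etc.).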