top | item 39738588

leifross | 1 year ago

GPT-3 was released four years ago; in terms of iterations since then, we are at roughly v5, and progress has been only incremental relative to that milestone. Transformer models can be scaled only so far before not even VC money can sustain training. I believe we will get there eventually, but transformer-based LLMs have been hitting a ceiling for a long time, and we need to think differently.

No comments yet.