
Xcelerate | 4 months ago

Haha, I like to joke that we were on track for the singularity in 2024, but it stalled because the research time gap between "profitable" and "recursive self-improvement" was just a bit too long, so now we're stranded on the transformer model for the next two decades until every last cent has been extracted from it.


ai-christianson | 4 months ago

There's a massive hardware and energy infrastructure build-out going on. None of it is specialized to run only transformers at this point, so wouldn't that create a huge incentive to find newer and better architectures to get the most out of all this hardware and energy infra?

Mehvix | 4 months ago

>None of that is specialized to run only transformers at this point

Isn't this what [etched](https://www.etched.com/) is doing?

Davidzheng | 4 months ago

How do you know we're not already at recursive self-improvement, but the rate is just slower than human-mediated improvement?