top | item 41712213

leblancfg | 1 year ago

Here's a thought experiment. Think back on how that statement would have sounded to past-you, 3 years ago. You would probably have dismissed it as bullshit, right? We've come a long way since then, both in terms of better, faster, and cheaper models, and in how they're being intertwined with developer tooling.

Now imagine 3 years from now.

cj|1 year ago

You could have said the same for crypto/blockchain 3-4 years ago (or whenever it was at peak hype).

Eventually we realized what is and isn't possible or practical to use blockchain for. It didn't really live up to all the original hype years ago, but it's still a good technology to have around.

It's possible LLMs could follow a similar pattern, but who knows.

__loam|1 year ago

What good thing has blockchain ever done that isn't facilitating crime or tax evasion?

falcolas|1 year ago

As you inadvertently pointed out, AI improvements are not linear. They depend on new discoveries more than they do on iteration. Three years from now, we could either be out of jobs or lamenting the stagnation of AI (again).

leblancfg|1 year ago

After an innovation phase there is an implementation phase. Depending on the usefulness of the innovation, the integration with existing systems takes time. It is measured in years, even decades. Think back to the '80s and '90s, when it took years to integrate PCs into offices and workspaces.

From your comment, it sounds like you think the implementation phase of LLMs is already over? And if so, how do you come to this conclusion?

skybrian|1 year ago

You can imagine all sorts of things, and then something else might happen. You can’t rely on “proof by imagination” or “proof by lack of imagination.”

We shouldn’t be highly confident in any claims about where AI will be in three years, because it depends on how successful the research is. Figuring out how to apply the technology to create successful products takes time, too.

tester756|1 year ago

Same thing that can be said about autonomous cars in 2014?

Not everything will grow exponentially forever

ryanackley|1 year ago

GPT-4 has been out for 1.5 years and I haven't seen much improvement in code quality across LLMs. It's still one of the best.

microtonal|1 year ago

Or you are extrapolating from the exponential growth phase in a sigmoid curve. Hard to say.
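A minimal sketch of that point (my addition, with made-up parameters): in its early phase, a logistic (sigmoid) curve is nearly indistinguishable from pure exponential growth, so extrapolating from early data can't tell the two apart.

```python
import math

def logistic(t, cap=100.0, rate=1.0, midpoint=10.0):
    # Logistic growth: looks exponential early, then saturates at `cap`.
    return cap / (1.0 + math.exp(-rate * (t - midpoint)))

def exponential(t, rate=1.0, cap=100.0, midpoint=10.0):
    # The pure exponential that matches the logistic's early behavior:
    # for t well below `midpoint`, logistic(t) ~ cap * exp(rate*(t - midpoint)).
    return cap * math.exp(rate * (t - midpoint))

# Early on the two curves agree closely; past the midpoint they diverge wildly.
for t in [2, 4, 6, 12, 16]:
    print(t, round(logistic(t), 2), round(exponential(t), 2))
```

At t=2 both curves give roughly 0.03, but by t=16 the logistic has flattened near 100 while the exponential has blown past 40,000. Nothing in the early data distinguishes them.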

denismi|1 year ago

Ten years ago, when Siri/Google/Alexa were launching, I really wouldn't have expected that 2024 voice assistants would be mere egg timers, and frustrating ones at that, requiring considered phrasing and regular repeating/cancelling/yelling to trick them into doing what you want.

A 10x near future isn't inconceivable, but neither is one where we look back and laugh at how hyped we got at that early-20s version of language models.

hyperG|1 year ago

It is a great point.

It also might be that the language model everyone uses 20 years from now, the one that gives a 50x improvement over today, is just being worked on right now, or won't come along for another 5 years.

In the same way, people who thought humans could never fly were not completely wrong before the airplane. After the airplane, though, we are really talking about two different versions of a "human that can fly".