item 46129732

manishsharan | 2 months ago

I am curious: could GenAI have written the paper "Attention Is All You Need"? We were trapped in CNN/RNN architectures for a while: could GenAI have arrived at a better architecture?


Antibabelic | 2 months ago

I have yet to see a convincing example of an LLM producing anything substantially insightful.

theshrike79 | 2 months ago

Depends on how you define "insight", really.

Is doing meta-analysis and discovering a commonality "insightful" for example?

Or is insight only something new you discover without basing your discovery on anything?

survirtual | 2 months ago

No, it couldn't have, unless these ideas were sandwiched between other ideas that it could interpolate between.

You have to approach GenAI as a high-dimensional interpolation machine. It can extrapolate when you, the user, provide enough information for it to operate on, and it can interpolate between what you provide and what it already knows.
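A toy way to see the interpolation-vs-extrapolation distinction (not a claim about any particular model; the polynomial here is just a stand-in for a flexible function fitter): fit a model on data from a limited range, then query it inside and far outside that range. Inside the training support the fit is accurate; outside it, the error explodes.

```python
# Sketch: a flexible model interpolates well inside its training
# support but fails badly when asked to extrapolate beyond it.
import numpy as np

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 10.0, 200)   # training support: [0, 10]
y_train = np.sin(x_train)               # true underlying function

# Degree-9 polynomial: an arbitrary stand-in for any interpolator.
model = np.poly1d(np.polyfit(x_train, y_train, deg=9))

def err(x):
    """Absolute error of the fitted model against the true function."""
    return abs(model(x) - np.sin(x))

print(f"error inside  support (x=5):  {err(5.0):.4f}")
print(f"error outside support (x=15): {err(15.0):.4f}")
```

The same qualitative behavior, on a vastly higher-dimensional manifold, is what the comment above is pointing at: queries that land between things the model has seen work well; queries that require leaving that region do not.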

With these constraints it is still pretty powerful, and I am generalizing, of course. But in my experience it is terrible at truly novel implementations of anything. It makes countless mistakes, because it continually tries to fit patterns found in existing code.

So you can really see the weaknesses at the frontier. I would encourage experimenting there to confirm what I am saying.