rabbits77 | 1 year ago
Well, that's actually the problem. This current wave of AI isn't really "learning" anything. An AI with any sort of generalizable reasoning ability would only need basic sources on programming syntax and semantics and could figure out the rest on its own. Here, instead, we see the need to effectively memorize variations of the same thing (say, answers to related programming questions) so that they can be part of an intelligent-sounding response.
I was dubious about the value of GenAI as a search tool at first, but now see that it's actually well suited to the role. These massive models largely store information in compressed form and are great at retrieving it and doing basic rewrites. The next evolution in Expert Systems, I suppose, although lacking strong reasoning.
flatline | 1 year ago
That is a completely unsupportable assertion.
rabbits77 | 1 year ago
Maxatar | 1 year ago
Humans don't learn anything of substance just from being told the strict rules; we also learn from a wealth of examples expressed through a variety of means, some formal, some poetic, some even comedic.
Heck, we wouldn't even need Stack Overflow to begin with if we could learn things just from basic sources.
NateEag | 1 year ago
Humans do this.
Not perfectly, no, but we do.
Since we're the only general intelligences we know of so far, I'd say that's support for the assertion that an AI with general reasoning abilities wouldn't need SO or other examples to figure out how to do specific tasks.