towledev|7 months ago
They don't always listen.
Writing SQL, I'll give ChatGPT the schema for 5 different tables. It habitually generates solutions with columns that don't exist. So, naturally, I append, "By the way, TableA has no column FieldB." Then it just imagines a different one. Or, I'll say, "Do not generate a solution with any table-col pair not provided above." It doesn't listen to that at all.
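Since telling the model about missing columns doesn't stick, one workaround is to check its output mechanically instead. A rough sketch, assuming a hypothetical SCHEMA dict mapping table names to their real columns (the regex check is crude, not a SQL parser):

```python
import re

# Hypothetical schema: table name -> set of columns that actually exist.
SCHEMA = {
    "TableA": {"id", "name", "created_at"},
    "TableB": {"id", "table_a_id", "amount"},
}

def hallucinated_columns(sql: str) -> list[str]:
    """Return table.column pairs referenced in `sql` that are not
    in SCHEMA. Only catches qualified references like TableA.FieldB."""
    bad = []
    for table, column in re.findall(r"\b(\w+)\.(\w+)\b", sql):
        if table in SCHEMA and column not in SCHEMA[table]:
            bad.append(f"{table}.{column}")
    return bad

print(hallucinated_columns("SELECT TableA.FieldB FROM TableA"))
# flags TableA.FieldB, since FieldB isn't in TableA's columns
```

If the check flags anything, you can reject the answer and re-prompt rather than arguing with the model.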
ofjcihen|7 months ago
They can’t generate knowledge that isn’t in their corpus, and the act of prompting (yes, even with agents, ffs) is more akin to playing pachinko than to playing pool?
ofjcihen|7 months ago
If you know what you’re doing and you’re trying to achieve something other than the same tutorials that have been pasted all over the internet, the non-deterministic pattern machine is going to generate plausible BS.
They’ll tell you any number of things that you’re supposedly doing wrong without understanding what the machine is actually doing under the hood.