swells34 | 1 year ago
At the end of the day, this is because it isn't "writing code" in the sense that you or I do. It is a fancy regurgitation engine that outputs bits of stuff it's seen before that seem related to your question. LLMs are incredibly good at this, but that is also why you can never trust their output.
kvgr | 1 year ago