top | item 40544323

he0001 | 1 year ago

When talking to LLMs, their responses are always a bit off. The best way I can describe it: it’s like they are speaking a specific dialect, but you can tell they are using expressions that wouldn’t be used in that dialect. Or it’s like they are in way over their heads on the specific matter and are reiterating things they don’t really understand. Like there’s no substance in what they are saying.

And if they do this for a simple matter, or one that you know well, how can you trust their answers on a matter where you can’t evaluate the answer?

It doesn’t get better when they totally mix things up but state it with certainty, instead of just saying that they don’t know, or admitting that they have gaps in their knowledge. They don’t get things wrong the way humans do; their errors are just weirdly wrong. Humans usually get things wrong in consistent ways. Perhaps it’s about what they are not saying.

Would I always be able to tell it’s not a human I’m talking with? Probably not. But the longer I talk to them, the better the chances I’ll know.
