lightandlight | 1 year ago
Yeah, I'd be more comfortable if it could say "I don't know" instead of generating bullshit.
Simon Willison's article Think of language models like ChatGPT as a “calculator for words”[1] helped me think about this topic better:
> If you ask an LLM a question, it will answer it—no matter what the question! Using them as an alternative to a search engine such as Google is one of the most obvious applications—and for a lot of queries this works just fine. It’s also going to quickly get you into trouble.
All of this is why I only ever use LLMs for things that I can verify.
[1]: https://simonwillison.net/2023/Apr/2/calculator-for-words/