grobgambit | 1 year ago
The problem, to me, is that we're holding LLMs to a standard of usefulness drawn from science fiction rather than reality.

A new, giant set of encyclopedias has enormous utility, but we wouldn't hold it against the encyclopedias that they don't do the thinking for us or aren't omniscient.