
jwitthuhn | 3 months ago

Is it wrong to release something unreliable even while acknowledging that it is unreliable? The product performs as advertised. If people want accurate information, an LLM is the wrong tool for the job.

From the Gemma 3 readme on huggingface: "Models generate responses based on information they learned from their training datasets, but they are not knowledge bases. They may generate incorrect or outdated factual statements."


jqpabc123 | 3 months ago

> If people want accurate information, an LLM is the wrong tool for the job.

So these vendors spent lots of time and money training LLMs to answer questions that people supposedly should not ask, yet are allowed and encouraged to.

Nonsensical and unrealistic. I expect the courts will agree and hold the vendors liable.