top | item 45532287

AlwaysRock | 4 months ago

Asking for a source from LLMs is so eye-opening. I have yet to have them link a source that actually supports what they said.

willsmith72 | 4 months ago

> I am yet to have them link a source that actually supports what they said.

You're not trying very hard then. Here, my first try: https://claude.ai/share/ef7764d3-6c5c-4d1a-ba28-6d5218af16e0

kypro | 4 months ago

But no one uses LLMs like this. This is the type of simple fact you could just Google and check yourself.

LLMs are useful for providing answers to more complex questions where some reasoning or integration of information is needed.

In these cases I mostly agree with the parent commenter. LLMs often come up with plausibly correct answers, but when you ask them to cite sources they seem to just provide articles vaguely related to what they said. If you're lucky, one might directly address what the LLM claimed.

I assume this is because what LLMs say is largely just made up; when you then ask for sources, the model has to retroactively find sources to justify what it said, and it often fails, just linking something which could plausibly back up its plausibly true claims.