csomar | 8 days ago
This doesn't seem to work at all for stats-related apps/sites though, since you can't judge the accuracy of what's being presented. If the site claims it'll "take you to space," you don't take that literally, you just treat it as another AI artifact. But with numbers, you have no way to tell what's accurate and what's just made up.
mmooss | 8 days ago
If you mean an LLM can be a brainstorming and hypothesis machine, and you have prior expertise to evaluate the proposals, then I can see that value. (Maybe that's what you meant, of course.)
But prior expertise is absolutely necessary. Otherwise we make ourselves victims of mis/disinformation. People say the Internet is a cesspool of mis/disinfo, yet nobody thinks it could affect them - we're all too smart, of course (no really, I'm the exception!). [0]
> This doesn't seem to work at all for stats-related apps/sites though, since you can't judge the accuracy of what's being presented.
I don't see the difference. If it's obvious nonsense, in numbers or in text, it's detectable. Everything else, see above.
[0] Research shows that overconfidence in one's own thinking is a big reason people get fooled, and that better-educated people are easier to fool.