Wojtkie | 1 month ago
I see it with coworkers all the time. They'll ask ChatGPT to do an analysis and it'll output results for a t-test. They don't know how to interpret it at all, so it's ultimately meaningless to them. They're just using "stat sig" as a way to make a non-technical VP happy. In situations like this, I don't think a highly intelligent source, model or human, can make the recipient more intelligent than they actually are.
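To make the point concrete, here's a minimal sketch of what interpreting that output actually requires. The data and groups are entirely made up; it hand-rolls Welch's two-sample t statistic with the standard library, since the point is what the numbers mean, not which library prints them:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    va, vb = variance(a), variance(b)          # sample variances (n-1 denominator)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb                    # squared standard error of the difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical measurements for two groups (invented numbers for illustration)
control = [12.1, 11.8, 13.0, 12.5, 11.9, 12.7, 12.3, 12.0]
variant = [11.2, 11.5, 10.9, 11.8, 11.1, 11.4, 11.6, 11.0]

t_stat, df = welch_t(control, variant)
print(f"t = {t_stat:.2f}, df = {df:.1f}")
```

Reading the result is where the actual statistics lives: you compare |t| against a critical value (or get a p-value from the t distribution with df degrees of freedom), and even then "significant" only means the difference is unlikely under the null hypothesis at your chosen alpha. It says nothing about whether the effect size matters in practice, which is exactly the part that gets lost when the output is pasted straight into a deck.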