AIs that were trained on data obtained through naughty channels actively avoid citing sources or quoting full passages of reference text; otherwise they'd give the game away. This also seems to increase the chance of them hallucinating sources entirely.
Unfortunately, the citations are generally quite low quality and, in my experience, often don't actually support the text they're attached to.
In my experience they just add random links at the bottom that are often unrelated to the response they give; there's absolutely no guarantee that they actually read them or that their response is based on them.
Sometimes they hallucinate them; when the sources do exist, they can include blatant nonsense (like state-owned propaganda such as RT) or fail to support the claims made by the output.
You're using the Research model, which isn't available to Free users. As a pupil myself, I can vouch for the fact that nobody here is using the Research models.
Even if a pupil does pay, they will either be too lazy to wait the nearly 10 minutes the AI takes to do its research, or they actually care about getting good grades and therefore won't outsource their research to AI.
pardon_me|24 days ago
GorbachevyChase|24 days ago
i80and|24 days ago
hk__2|24 days ago
sofixa|24 days ago
vonneumannstan|24 days ago
https://chatgpt.com/share/6984c899-6cc4-8013-a8f6-ec204ee631...