Which is funny, because if Wikipedia dies, who will keep providing updated training data to these models? It's a strange self-defeating cycle: the models consolidate Wikipedia's traffic without replacing its funding.
I sometimes use neural nets for obscure compound questions (with so-so results), but I can't imagine using one in place of Wikipedia. I go to Wikipedia for factual information (by factual I don't mean guaranteed, I mean hard data: years, names, models, etc.). How can anyone rely on a random text generator for factual data?
>Genuinely don't know why anyone would use it when you have perplexity, gemini, chatGPT search, etc. at your disposal.
LLMs hallucinate/confabulate. I use Wikipedia to check source info and to find additional information. Of course there are more reliable sources than Wikipedia, but it's still useful.
dheerajvs|1 year ago
And what did they get trained on in the first place?