top | item 47115856

woodruffw | 7 days ago

> LLMs don’t have such instincts and can potentially be instructed to present or evaluate the primary, if opposing, arguments.

It seems essentially wrong to anthropomorphize LLMs as having instincts or not. What they have is training, and there's currently no widely accepted test for determining whether a seemingly "fair" evaluation from an LLM is in fact shaped by biases in that training.

(It should be clear that humans don't need to be unpolitical; what they need to be is accountable. Wikipedia appears to be at least passably competent at making its human editors accountable to each other.)

kelipso|7 days ago

I said LLMs don't have such instincts, but yeah, I agree there should be less anthropomorphizing and more evaluation-based framing when talking about LLMs. It's just not that easy in everyday discussions.

About Wikipedia: there are obvious biases and cliques there, as has been discussed in this thread and on HN for many years, not to mention that its bias is the reason Grokipedia came about in the first place.

croon|6 days ago

> not to mention that its bias is the reason Grokipedia came about in the first place.

claimed bias != bias

It may have bias, or it may not, but the only reason Grokipedia exists is that Musk doesn't like the contents of Wikipedia.

rbanffy|6 days ago

> bias is the reason that Grokipedia came about in the first place.

You are correct, but only in the sense that Musk was unable to impose his own biases upon Wikipedia, so he had to build one where he can tune the bias to whatever is convenient at the moment.