aantthony | 1 year ago
For example:
"A doctor was examining the patient when ___"
What this makes apparent is that increasing the model's temperature causes the less stereotypical option to be selected more often.
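A minimal sketch of the mechanism (the logits and the two-token vocabulary are made up for illustration, not from any real model): sampling applies a softmax over temperature-scaled logits, and a higher temperature flattens the distribution toward uniform, so the lower-probability token is drawn more often.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical next-token logits for the two continuations;
    # the values are invented purely for this example.
    tokens = ["he", "she"]
    logits = np.array([2.0, 1.6])

    def sample(temperature):
        # Softmax over temperature-scaled logits: higher temperature
        # flattens the distribution toward uniform.
        scaled = logits / temperature
        probs = np.exp(scaled - scaled.max())
        probs /= probs.sum()
        return rng.choice(tokens, p=probs)

    for t in (0.2, 1.0, 2.0):
        draws = [sample(t) for _ in range(10_000)]
        print(f"T={t}: P(she) ~ {draws.count('she') / len(draws):.2f}")

With these numbers the minority option comes up roughly 12% of the time at T=0.2, 40% at T=1.0, and 45% at T=2.0: as temperature grows, the gap between the options shrinks toward a coin flip.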
IMO this is getting at a deeper truth: the use of gender in language, and the historical default to "he", was not about creating bias; it was a pattern that maximises information density and minimises useless information. Randomising the gender, as is done today, packs useless information into the pronoun.
EForEndeavour | 1 year ago
Where can I read more about this "truth"? Where is this assertion coming from that gendered pronouns developed to minimize useless information? It seems far more plausible to me that pervasive defaulting to male experiences caused many (certainly not all) human languages to (1) develop gendered pronouns and (2) default to the male pronoun.
aantthony | 1 year ago
Choosing the more stereotypical option (even if it's only 51% likely) is a more efficient encoding for an LLM.
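A back-of-the-envelope version of that claim (the 51/49 split is hypothetical): under an ideal arithmetic coder, the more probable option costs fewer bits, a fixed convention costs ~0 bits for the gender choice, and randomising the choice spends nearly a full bit per pronoun.

    import math

    p = 0.51  # hypothetical probability of the stereotypical pronoun

    # Ideal code length (arithmetic coding) for emitting each option:
    print(f"majority: {-math.log2(p):.3f} bits")      # ~0.972
    print(f"minority: {-math.log2(1 - p):.3f} bits")  # ~1.029

    # Sampling at the model's distribution spends the full entropy,
    # nearly one bit per pronoun, on the gender choice itself; a fixed
    # default (deterministic convention) spends ~0 bits on it.
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    print(f"entropy of a randomised choice: {h:.4f} bits")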