I don't understand how people could even argue that this is in any way acceptable. Fighting "bias" has become some kind of boogeyman, and anything "non-white" is now beyond reproach. Shocking.
You get four images at a time and are lucky to get one white person even when you ask for it; no other model has that issue. Other models have no problem generating black people either, so it isn't that other models only generate white people.
So either it isn't a technical issue or Google failed to solve a problem everyone else easily solved. The chances of this having nothing to do with DEI is basically 0.
Depending on how broadly you define it, something like 10-30% of the world's population is white. Africa is about 20% of the world population; Asia is 60% of it.
It was widely criticized back then: the fact that Google both brought it back and made it more prominent is weird. Notably, OpenAI's implementation is more scoped.
I don't think so. My boss wanted me to generate a birthday image for a co-worker: John Cena flyfishing. ChatGPT refused to do so, so I had to describe the type of person John Cena is instead of using his name. It kept giving me bearded people no matter what. I thought this would be the perfect time to try out Gemini for the first time. Well shit, it won't even give me a white guy. And all the black dudes are beardless.
It feels like the image generation it offers is perfect for a sort of California-Corporate style: ask it for a "photo of people in the board room" or "people at the company cafeteria" and you get the corporate-friendly ratio of colors, ability levels, sizes, etc. See Google's various image assets: https://www.google.com/about/careers/applications/ . It's great for coastal and urban marketing brochures.
But that same California-Corporate style makes no sense for historical images, so perhaps this is where Midjourney comes in.
Depending on what you ask for, it injects the word 'diverse' into the response description, so it's pretty obvious they're brute-forcing diversity into it. E.g. ask for "Generate me an image of a family" and you will get back "Here are some images of a diverse family".
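A crude version of that kind of server-side prompt rewriting is easy to sketch. To be clear, this is a made-up illustration of the suspected technique (the keyword list and function name are mine), not anything from Google's actual pipeline:

```python
import re

# Hypothetical sketch: a rewrite step that injects a diversity qualifier
# in front of people-related nouns before the prompt reaches the image
# model. Purely illustrative; not Google's code.
PEOPLE_TERMS = re.compile(r"\b(person|people|man|woman|family|crowd)\b",
                          re.IGNORECASE)

def rewrite_prompt(prompt: str) -> str:
    """Prepend 'diverse' to any people-related noun in the prompt."""
    return PEOPLE_TERMS.sub(lambda m: "diverse " + m.group(0), prompt)

print(rewrite_prompt("Generate me an image of a family"))
# -> Generate me an image of a diverse family
```

If something like this sits between the user and the model, the tell is exactly what people report: the injected word leaking back out in the response text.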
Yes, there's irrefutable evidence that models are wrangled into abiding by the commissars' vision rather than just doing their job and outputting the product of their training data.
It is possible Google tried to avoid likenesses of well known people by removing any image from the training data that contained a face and then including a controlled set of people images.
If you give a contractor a project where you want 200k images of people who are not famous, they will send teams to regions where you may only have to pay each person a few dollars to be photographed. Likely SE Asia and Africa.
sotasota|2 years ago
https://cdn.sanity.io/images/cjtc1tnd/production/912b6b5aacc...
https://pbs.twimg.com/media/GG1ThfsWUAAp-SO?format=jpg&name=...
https://cdn.sanity.io/images/cjtc1tnd/production/e2810c02ff6...
https://pbs.twimg.com/media/GG1MnepXwAAkPL6?format=jpg&name=...
https://pbs.twimg.com/media/GG0BLVsbMAARZXr?format=jpg&name=...
ceejayoz|2 years ago
One in four sounds about right?
nickthegreek|2 years ago
update: google agrees there is an issue. https://news.ycombinator.com/item?id=39459270
123yawaworht456|2 years ago
https://cdn.openai.com/papers/DALL_E_3_System_Card.pdf