[+] [-] ghusto|1 year ago|reply
> may include erotica, extreme gore, slurs, and unsolicited profanity

Why on Earth is porn grouped in with gore and slurs?!

What is with America's bizarre pretence that sex is something dirty? So dirty, in fact, that it's in a category with swearing at people and slurs? And I say "pretence", because _my god_ you produce a lot of porn for a country that's supposedly against that sort of thing.

[+] [-] bitshiftfaced|1 year ago|reply
Isn't the context of what you're quoting to do with the label "NSFW"? That label certainly has issues, but to be fair it's more of an "I don't want this on my screen at work" thing than a "this is dirty" thing. And from what I understand, that attitude isn't limited to America.

[+] [-] RunningDroid|1 year ago|reply
Because some fetishes involve one or more?

[+] [-] langsoul-com|1 year ago|reply
I'm really surprised they'd even say the 'p' word. It's truly a massive can of worms. It seems smarter to just nuke it completely and hope someone else tackles that mess.

Incorrectly handled, it could really topple the OpenAI empire.

[+] [-] KennyBlanken|1 year ago|reply
When the Christian right kept losing court cases over various laws based on Christian moral values (examples: censorship laws and religious content in public school education), they figured out that rather than fight court battles trying to make everything they don't like illegal, they could simply seek to influence every aspect of US society they could reach. So they took up the "Seven Mountain Mandate," which includes banking and credit card processing.

Credit card companies and banks adopting Christian values into policy are why we see all the moralistic rules in the app stores, and why numerous websites have had to strictly curtail or outright remove 'adult' content.

It's also why porn actors and others in the industry are "unbankable": many banks will close the accounts of anyone they find out is active in the 'adult entertainment industry'.

[+] [-] johnisgood|1 year ago|reply
It reminds me of rumors: unverified information heard or received from someone else. How is this any different? Videos, photos, and audio need to be verified too. I remember when you could look at a photo and tell with 99% certainty that it was legitimate, but regardless, deepfake videos do not have to be believed, any more than hearsay has to be trusted. We might just start defaulting to disbelief.

[+] [-] throwawaysleep|1 year ago|reply
This is the killer-app elephant in the room. Removing the exploitation of people in porn is a net positive. There have been attempts at ethical porn, but it is a garden in a rainforest.

[+] [-] captaincaveman|1 year ago|reply
[+] [-] demondemidi|1 year ago|reply
[+] [-] chaorace|1 year ago|reply
Funny that you mention exploitation. From a high-level perspective, there are a lot of parallels between the discourse surrounding the exploitation of women in pornography and the discourse surrounding the exploitation of artists in generative AI. Mind you, I said that the discourse has parallels -- we talk about these two things in similar ways. I'll refrain from speaking on behalf of the actual actors, whose experiences are distinct from my own (e.g., artists, women, neural networks).

It's a pithy thought, but what exactly do I mean by that? Well... both things fundamentally exist within the interplay between labor, content, and the internet. We're forced to grapple with the yawning gap between abstract and economic value that the fundamental reproducibility of internet content implicitly creates, not to mention the resulting potential for exploitation. This conflict forces us to reflect on unpleasant worldly realities -- how responsible is a consumer with finite resources for the fair treatment of those whose labors produce an infinite commodity?

[+] [-] woodruffw|1 year ago|reply
Given that the AI in question is trained on the products of human exploitation (as you put it), is it accurate to say the exploitation has been removed? To my eyes, it is laundered here, not truly removed.

(Intuitively it's even worse in some ways, since performing in adult media doesn't necessarily imply that the performer wants their physical attributes merged and re-represented in ways they would not have consented to.)

[+] [-] clwg|1 year ago|reply
I agree with you (I also think gaming has a lot of potential with this stuff), but personally I wouldn't want to feed my preferences for this sort of content into a hosted service.

I'd also imagine that some consideration needs to be applied in a legal context. Just yesterday there was a massive child porn bust which also included AI-generated content[0].

[0] https://toronto.ctvnews.ca/ontario-provincial-police-arrest-...

[+] [-] negus|1 year ago|reply
[+] [-] tjpnz|1 year ago|reply
[+] [-] blackeyeblitzar|1 year ago|reply
Why does it need to be "responsible"? I feel these words are intentionally misleading about what they are: morality codes based on a small number of people's opinions.

I can't read the linked article (paywall), but others on the same topic say that OpenAI would like to explore supporting whatever is not illegal, but that deepfakes are out of the question. Are deepfakes illegal, and should they really be? Deepfakes are no different from what goes on in people's heads. Should the law really restrict that expression? I can see restricting passing off deepfakes as real to embarrass someone, but we already have laws for things like defamation. And if it isn't illegal but just expression, should OpenAI support it?

[+] [-] mcphage|1 year ago|reply
In this case specifically, the "small number of people" happen to be "the people who work at OpenAI".

[+] [-] unknown|1 year ago|reply
[deleted]

[+] [-] unknown|1 year ago|reply
[deleted]