Civitai doesn't show the pornographic models/LoRAs unless you're logged in, but they're there.
That being said, the article is an awful sensationalist hit piece. It's as accurate as saying "Andreessen Horowitz Invests in YouTube, Which Profits from Schizophrenics Making Videos of Racist Tirades" or "AH Invests in Github, Which Profits from Software Banned in China and Iran"; true strictly speaking, but completely misrepresenting the purpose of the service.
CivitAI seems to have improved the filtering on the logged out experience, then.
Which, yay, big plus; porn slapping visitors in the face used to be a big deal.
It's still definitely there, and 404 clearly has a paid account, since they posted about deliberately violating the nonconsensual porn policy, for science, using the paid on-site generation services.

Being an investor in some company does not make one responsible for the company breaking the law. Even less so for allegedly breaking the law.

How was this not buried instantly?
I feel like the long-term equilibrium of stuff like this is that, in the not-too-distant future, anyone will be able to easily generate weird deepfake inappropriate porn / content of anyone. It seems unavoidable.
I might argue this could be good in that this sort of stuff will become so commonplace that it won't be a big deal and people will learn (be forced to shrug it off). Up until now-ish, the scarcity of celebrity nude leaks and revenge porn type stuff made it somewhat of a forbidden fruit. I like the idea of destigmatizing this sort of stuff so that it loses power.
That said, I'm sure there will be some casualties along the way, e.g. vulnerable teens who are picked on by mean girls / boys, semi-celebrities / public figures who are targeted by online trolls, etc. I hope that the prevalence of this sort of stuff enlarges the conversation around it, increases the reach of strategies for dealing with it psychologically, and significantly reduces its impact.
I don't know many celebrities or politicians, but I suspect that after they see a few weird / porn-related photos of themselves, they stop caring.

Thoughts?
I disagree with the tone of the headline. CivitAI is a good resource, and the rest of the AI world could learn from it, for example for LLM LoRA/grammar hosting.
But I am concerned, for a different reason.
Civitai's enshittification potential is sky high. They are hosting loads of enormous models and images for free, on a very spiffy, heavy website. There's no way that's sustainable.
> Civitai's enshittification potential is sky high.
I share these same concerns.
What this space desperately needs is for models to be shared via torrent. As it is today, Civit.ai has had problems with creators deciding to take down their own models, and suddenly the people who were using them have no place to find them.
I suspect we'll see more and more lockdowns on what models are deemed "appropriate" and it would be nice to have a decentralized alternative.
Not to mention that torrenting is the only rational way to sustainably host such large files at no cost.
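For what it's worth, the mechanics of that are pretty mundane; publishing a checkpoint or LoRA as a torrent is a few lines with the python-libtorrent bindings. A rough sketch (the file name and tracker URL are placeholders, not anything Civitai-specific):

    import libtorrent as lt

    # describe the file(s) to share; run this from the directory holding the model
    fs = lt.file_storage()
    lt.add_files(fs, "my_lora.safetensors")

    t = lt.create_torrent(fs)
    t.add_tracker("udp://tracker.opentrackr.org:1337/announce")  # placeholder tracker
    lt.set_piece_hashes(t, ".")  # hash pieces from the current directory

    # write out the .torrent; seed it with any client and share the file or magnet link
    with open("my_lora.torrent", "wb") as f:
        f.write(lt.bencode(t.generate()))

The hard part isn't the raw hosting, it's the discovery and metadata layer (previews, trigger words, versioning) that a site like Civitai puts on top.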
This is the next pearl-clutching headline that people will be reading over and over for any model that generates images.
"Oh no, the model can generate porn!"
"Oh no, it can generate realistic photos of people!"
"Oh no, it can generate violent or racist images!"
I'm so sick of people arguing that we should just shut down AI research or commercialization because it could be used maliciously. Think of how many grocery store products could be used maliciously if someone wanted to. Should we not allow people to buy bleach? Should we not allow knives to be sold?
People need to get over their misgivings about this new technology and start taking accountability for the fact that models can be abused like any other tool. That doesn't mean we should just not give anyone access to them.
Is the article suggesting that it should be shut down entirely? I didn't read that anywhere. I would actually think that it's asking for basic accountability and due diligence.
I think there's room for allowing people to control their own likeness while not fully banning AI content generation.

Right. So can paintbrushes. So can guys with deer antlers scratching on cave walls.
I mean, do people have a right to privacy? Is there some line between having privacy and infringing on someone’s freedom of expression through AI content generators that you would accept?
Sorry you are tired of people voicing legitimate concerns; perhaps you can tune them out and use your own likeness in pornography to help these companies speed things along. Ya know, take one for the team, so to speak.
People have been using various chemicals for cleaning and knives for cooking for thousands of years. They are embedded in our culture. LLMs only became popular a year ago, and the farthest back you can go with their origins is the 1950s. It is not at all unreasonable for people to say "hey, should we require that this new tool have safeguards built in so it isn't horribly abused?", especially when you consider that these are not just tools fashioned from the natural world; they are products provided by companies that are, in some cases, gleefully amoral. A space moving quickly is reason for more criticism and caution from the general public, not less.
And if safeguards cannot be built into your tool by design, then that is malicious design.
Unsure why your comment is in the grey. I'm with you. When they first came on the scene a few months back they had some nice tech-meets-the-real-world articles, but the headlines and the articles attached to them seem to have gone downhill rather quickly.
"Nonconsensual AI porn" is a weasel term because it implies that it should be necessary to get someone's consent to create fake porn using their faces.
Well said. If you fancy using my likeness as a dartboard, or in a meme, or as a Photoshop asset, or painted on a canvas, or drawn by AI, or mistakenly randomly generated, etc, great! Have fun. Not my circus, not my monkeys.
I'm not entitled to categorically own/forbid using a look. That's nonsense and leads to self-inflicted quandaries: How do I know a video of unknown provenance contains me, not a dead ringer that gave consent? How different must a depiction be to not require my consent? 9 pixels? 30%, whatever that means? At least an eye color change?
It's impossible to enforce consistently, it's presumptive, and it effectively amounts to thought-policing a concept. In short: it's absurd.

Controlling likenesses in AI was the whole point of the SAG strike.

https://en.wikipedia.org/wiki/Personality_rights#United_Stat...

Who gets to decide what should or should not be necessary? Do you think your opinion about this is the majority view among people?
Have you used Civit.ai? While it does cater to audiences interested in "adult" content, I don't think any of its active users would characterize it as "a deepfake marketplace trading in celebrity and private-person identities".
Civitai is just a popular place to host Stable Diffusion models / LoRAs / textual inversions, not some kind of fake-nudes marketplace. It's the Hugging Face of Stable Diffusion.
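To make that concrete: the typical artifact you grab from the site is just a .safetensors checkpoint or LoRA that you load into your own local pipeline. Roughly, with the diffusers library (the base model id and LoRA file name below are placeholders, not anything tied to Civitai):

    import torch
    from diffusers import StableDiffusionPipeline

    # load a base SD 1.5 checkpoint (placeholder model id)
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # apply a LoRA downloaded as a local .safetensors file
    pipe.load_lora_weights(".", weight_name="some_style_lora.safetensors")

    image = pipe("a watercolor landscape, misty mountains",
                 num_inference_steps=30).images[0]
    image.save("out.png")

The site itself just hosts and catalogs the weights; what anyone generates with them happens on their own machine (or via the optional on-site generator).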
But that wasn't the choice. The choice was to create an open hub for basically everything related to Stable Diffusion, with limits on a broad range of illegal things, but without policies against things that are a step removed from that, because even one step out, the "well, combined with other things it could be used badly" standard bans everything.
The highlighted bounty feature is described in a way that is technically accurate but misleading. Yes, people can and do post requests for models of real celebs there, but they also post requests for fictional-character models, for concept models not tied to particular real or fictional characters, for technical assistance with model training, and for people to generate showcase images for existing models.