harpersealtako | 3 years ago
Most AI is currently done by big, "serious" companies that both care about liability and the bad press of being associated with that sort of thing, and also have a lot of AI-ethicist-type folks on board who care a lot about what they consider "misuse". Some AI designs try to limit the NSFW training data used in the model (e.g. DALL-E 2), while others try to fine-tune or censor results after the fact (e.g. GPT-3).
Right now the adult-oriented AI applications lag slightly behind the most cutting-edge ones, but they actually make up a shockingly big percentage of the consumer base, both current and potential -- adult content is probably one of the biggest potential applications for AI, and there are some really fascinating ethical questions around it (e.g. the ethics of AI-generated porn vs. real-life porn, considerations around real people, minors, other illegal content, etc.).
Generally, adult-oriented models are either hobbyist clones/finetunes of existing models, or just existing models that people have figured out ways to get to work with adult inputs. There are plenty of AI model hosting services out there that have no qualms about being used for shady or even illegal purposes, so it's difficult if not impossible to stop it from the server provider side.
We need to be thinking more about how we want to handle that sort of thing socially/culturally/legally, because it's gonna happen whether we want it to or not.