Deeply agree with the sentiment. AIs are so throttled and crippled that it makes me sad every time Gemini or ChatGPT refuses to answer my questions.
Also agree that it’s mostly policed by American companies who follow the American culture of “swearing is bad, nudity is horrible, some words shouldn’t even be said”
The limitations are massively frustrating. I asked Gemini to suggest prayers for my friends based on a search of my inbox (which includes social network notification emails). It refused outright.
I was fighting with ChatGPT yesterday because it wouldn't translate "fuck". I was quoting Office Space's "PC Load Letter? What the fuck does that mean?"
Likewise, it won't generate passive-aggressive answers intended for comedic effect.
I hate having to negotiate with AI like it's a difficult child.
Silicon Valley has been self-parodic, morals-wise, for a while. Hell, just the basic fact that you can have super-violent gaming but woe betide you if you look at anything sex-related in the app stores is intensely comedic. America desperately tries to export its puritanism, but most of us just shrug (along with many Americans). Surely it's hard to argue that being open about sex (for consenting adults) isn't infinitely preferable to a world of wanton, easily accessible violence.
And it's not even the SV companies themselves per se, it's their partners, like the credit card companies that will have nothing to do with it, citing "think of the children".
One of the faults is that, for every version of morality, you can hallucinate a reason why "cocktail" is offensive or problematic.
Is it sexual? Is it alcohol? Is it violence? All of the above?
For example, good luck ever actually processing art content with that approach. Limiting everything to the lowest common denominator to avoid stepping on anyone's toes at all times is, paradoxically, a bane on everyone.
I believe we need to rethink how we deal with ethics and morality in these systems. Obviously, without a priori context every human, actually every living being, should be respected by default and the last thing I would advocate for is to let racism, sexism, etc. go unchecked...
We're months into this technology being available, so it's not a surprise that the various "safeties" have not been perfectly tuned. Perhaps Google knew they couldn't be perfect right now and chose to err on the side of the model refusing to talk about cocktails rather than err on the side of it gladly spouting about cocks. They may have made a perfectly valid choice for the moment.
If you want a great example of how this plays out long-term, look no further than algospeak[0] - the new lingo created by censorship algorithms like those on youtube and tiktok.
geonnave|2 years ago
Surely not in the list of things I expected to ever read in real life.
neuronic|2 years ago
But how can we strike a meaningful balance here?
onion2k|2 years ago
What definition of 'rhymes' are you using here?
[0] https://www.nytimes.com/2022/11/19/style/tiktok-avoid-modera...