(no title)
moolcool | 26 days ago
The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.
bluescrn | 25 days ago
This isn’t about AI or CSAM (Have we seen any other AI companies raided by governments for enabling creation of deepfakes, dangerous misinformation, illegal images, or for flagrant industrial-scale copyright infringement?)
protocolture|26 days ago
[deleted]
pdpi | 26 days ago
On the one hand, it seems "obvious" that Grok should somehow be legally required to have guardrails stopping it from producing kiddie porn.
On the other hand, it also seems "obvious" that laws forcing 3D printers to detect and block attempts to print firearms are patently bullshit.
The thing is, I'm not sure how I can reconcile those two seemingly-obvious statements in a principled manner.
_trampeltier | 26 days ago
If you use a service like Grok, then you are using somebody else's computer. X is the owner of the computer that produced the CP, so of course X is at least partly liable for producing it.
beAbU | 25 days ago
Grok makes it trivial to create fake CSAM or other explicit images. Before, if someone spent a week in Photoshop to do the same, it wouldn't have been Adobe that got the blame.
Same for 3D printers. Before, anyone could make a gun provided they had the right tools (which are very expensive); now it's being argued that 3D printers make this more accessible. Although I would argue it's always been easy to make a gun: all you need is a piece of pipe. So I don't entirely buy the moral panic against 3D printers.
Where that threshold lies, I don't know. But I think that's the crux of it. Technology is making previously difficult things easier, to the benefit of all humanity. It's just unfortunate that some less-nice things have come along with it.
muyuu | 25 days ago
You cannot deliberately use software to produce an effect that is patently illegal and attributable to your own usage, and then pretend the software is to blame.
gulfofamerica|26 days ago
[deleted]
ChrisGreenHeur|26 days ago
[deleted]
cubefox | 26 days ago
Do you have any evidence for that? As far as I can tell, this is false. The only thing I saw was Grok changing photos of adults into them wearing bikinis, which is far less bad.
klez | 25 days ago
This is how it works, at least in civil law countries. If the prosecutor has reasonable suspicion that a crime is taking place, they send the so-called "judiciary police" to gather evidence. If they find none (or the evidence is inconclusive, etc.), the charges are dropped; otherwise they ask the court to go to trial.
On some occasions I take on judiciary police duties for animal welfare. Just last week I participated in a raid. We were not there to arrest anyone, just to gather evidence so the prosecutor could decide whether to press charges and go to trial.
scott_w | 26 days ago
For obvious reasons, decent people are not about to go out and try to generate child sexual abuse material to prove a point to you, if that's what you're asking for.