Altern4tiveAcc | 26 days ago
This step could come before a police raid.
This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.
bawolff|26 days ago
Seizing records is usually a major step in an investigation. It's how you get evidence.
Sure, it could just be harassment, but this is also what normal police work looks like. France has a reasonable judicial system, so absent other evidence I'm inclined to believe this was legit.
moolcool|26 days ago
The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.
pdpi|26 days ago
On the one hand, it seems "obvious" that Grok should somehow be legally required to have guardrails stopping it from producing kiddie porn.
On the other hand, it also seems "obvious" that laws forcing 3D printers to detect and block attempts to print firearms are patently bullshit.
The thing is, I'm not sure how I can reconcile those two seemingly-obvious statements in a principled manner.
cubefox|26 days ago
Do you have any evidence for that? As far as I can tell, this is false. The only thing I saw was Grok changing photos of adults into them wearing bikinis, which is far less bad.
317070|26 days ago
So the question becomes whether it was done knowingly or recklessly, hence a police raid for evidence.
See also [0] for a legal discussion in the German context.
[0] https://arxiv.org/html/2601.03788v1
skissane|26 days ago
I think one big issue with this statement – "CSAM" lacks a precise legal definition; the precise legal term(s) vary from country to country, with differing definitions. While sexual imagery of real minors is highly illegal everywhere, there's a whole lot of other material – textual stories, drawings, animation, AI-generated images of nonexistent minors – which can be extremely criminal on one side of an international border, de facto legal on the other.
And I'm not actually sure what the legal definition is in France; the relevant article of the French Penal Code, Article 227-23 [0], seems superficially similar to the legal definition of "child pornography" in the United States (post-Ashcroft v. Free Speech Coalition), so some–but (maybe) not all–of the "CSAM" Grok is accused of generating wouldn't actually fall under it. (But of course, I don't know how French courts interpret it, so maybe what it means in practice is broader than my reading of the text suggests.)
And I think this is part of the issue – xAI's executives are likely focused on compliance with US law on these topics, less concerned with complying with non-US law, in spite of the fact that CSAM laws in much of the rest of the world are much broader than in the US. That's less of an issue for Anthropic/Google/OpenAI, since their executives don't have the same "anything that's legal" attitude which xAI often has. And, as I said – while that's undoubtedly true in general, I'm unsure to what extent it is actually true for France in particular.
[0] https://www.legifrance.gouv.fr/codes/section_lc/LEGITEXT0000...
giancarlostoro|26 days ago
I wouldn't even consider this a reason if it weren't for the fact that OpenAI, Google, and hell, literally every image model out there all have the same "this guy edited this underage girl's face into a bikini" problem (that was the most public example I've heard, so I'm going with it). People still jailbreak ChatGPT, and how much money have they poured into that?
bluegatty|26 days ago
They have a court order obviously to collect evidence.
You have offered zero evidence of 'political pressure', and the statement by prosecutors doesn't hint at it.
'No crime was prevented by harassing workers' is essentially a non sequitur in this context.
It could be that this is political nonsense, but there would have to be more details.
These issues are really hard but we have to confront them. X can alter electoral outcomes. That's where we are at.
direwolf20|26 days ago
The dissenting views: naked little kids