top | item 46870826

Altern4tiveAcc | 26 days ago

> Prosecutors say they are now investigating whether X has broken the law across multiple areas.

This step could come before a police raid.

This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.

bawolff|26 days ago

> and no crime was prevented by harassing local workers.

Seizing records is usually a major step in an investigation. It's how you get evidence.

Sure, it could just be harassment, but this is also how normal police work looks. France has a reasonable judicial system, so absent other evidence I'm inclined to believe this was legit.

moolcool|26 days ago

> This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.

The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.

trhway|26 days ago

Internet routers, network cards, computers, operating systems, and various application software have no guardrails and are used for all sorts of nefarious things. Why aren't those companies raided?

pdpi|26 days ago

I'm of two minds about this.

On the one hand, it seems "obvious" that Grok should somehow be legally required to have guardrails stopping it from producing kiddie porn.

On the other hand, it also seems "obvious" that laws forcing 3D printers to detect and block attempts to print firearms are patently bullshit.

The thing is, I'm not sure how I can reconcile those two seemingly-obvious statements in a principled manner.

ljsprague|26 days ago

No other "AI" companies released tools that could do the same?

cubefox|26 days ago

> The company made and released a tool with seemingly no guard-rails, which was used en masse to generate deepfakes and child pornography.

Do you have any evidence for that? As far as I can tell, this is false. The only thing I saw was Grok changing photos of adults into them wearing bikinis, which is far less bad.

317070|26 days ago

Well, there is evidence that this company made and distributed CSAM and pornographic deepfakes for profit. The investigators are not lacking evidence there.

So the question becomes whether it was done knowingly or recklessly, hence a police raid for evidence.

See also [0] for a legal discussion in the German context.

[0] https://arxiv.org/html/2601.03788v1

skissane|26 days ago

> Well, there is evidence that this company made and distributed CSAM

I think one big issue with this statement – "CSAM" lacks a precise legal definition; the precise legal term(s) vary from country to country, with differing definitions. While sexual imagery of real minors is highly illegal everywhere, there's a whole lot of other material – textual stories, drawings, animation, AI-generated images of nonexistent minors – which can be extremely criminal on one side of an international border, de facto legal on the other.

And I'm not actually sure what the legal definition is in France; the relevant article of the French Penal Code, 227-23 [0], seems superficially similar to the legal definition of "child pornography" in the United States (post-Ashcroft v. Free Speech Coalition), and so some–but (maybe) not all–of the "CSAM" Grok is accused of generating wouldn't actually fall under it. (But of course, I don't know how French courts interpret it, so maybe what it means in practice is something broader than my reading of the text suggests.)

And I think this is part of the issue – xAI's executives are likely focused on compliance with US law on these topics, less concerned with complying with non-US law, in spite of the fact that CSAM laws in much of the rest of the world are much broader than in the US. That's less of an issue for Anthropic/Google/OpenAI, since their executives don't have the same "anything that's legal" attitude which xAI often has. And, as I said – while that's undoubtedly true in general, I'm unsure to what extent it is actually true for France in particular.

[0] https://www.legifrance.gouv.fr/codes/section_lc/LEGITEXT0000...

orwin|26 days ago

French prosecutors use police raids far more than their counterparts in other Western countries: banks, political parties, ex-presidents, corporate HQs, worksites... Here, while white-collar crimes are punished about as much as in the US (i.e. very little), we do at least investigate them.

giancarlostoro|26 days ago

> This looks like plain political pressure. No lives were saved, and no crime was prevented by harassing local workers.

I wouldn't even consider this a reason if it weren't for the fact that OpenAI, Google, and hell, literally every image model out there all have the same "this guy edited this underage girl's face into a bikini" problem (this was the most public example I've heard, so I'm going with it). People still jailbreak ChatGPT, and they've poured how much money into that?

emsign|26 days ago

They've already broken the law by creating and hosting CSAM. Now let's see what else prosecutors will find.

bluegatty|26 days ago

No, that's not at all how this works.

They obviously have a court order to collect evidence.

You have offered zero evidence of 'political pressure', and the statement by prosecutors doesn't hint at it.

'No crime was prevented by harassing workers' is essentially a non sequitur in this context.

It could be that this is political nonsense, but there would have to be more details.

These issues are really hard but we have to confront them. X can alter electoral outcomes. That's where we are at.

aaomidi|26 days ago

Lmao, they literally made a broadly accessible CSAM maker.

direwolf20|26 days ago

"The EU doesn't tolerate dissenting views."

The dissenting views: naked little kids
