(no title)
pelorat | 1 month ago
The person(s) ultimately in charge of removing (or preventing the implementation of) Grok guardrails might find themselves being criminally indicted in multiple European countries once investigations have concluded.
roywiggins | 1 month ago

Suppose that, instead of an LLM, Grok were an X employee specifically employed to photoshop and post these photos as a service on request. Section 230 would obviously not immunize X for this!
KaiserPro | 1 month ago
https://www.justice.gov/d9/2023-06/child_sexual_abuse_materi...
It could be argued that generating an image of a non-real child might not count. However, that's not a given.
> The term "child pornography" is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old.

That is broad enough to cover anything obviously young.
But when it comes to "nude-ifying" a real image of a known minor, I strongly doubt you can use the defence that it's not a real child. Therefore you're knowingly generating and distributing CSAM, which is out of scope for Section 230.
paxys | 1 month ago

tzs | 1 month ago
The EU has something like Section 230 in the E-Commerce Directive 2000/31/EC, Articles 12-15, updated by the Digital Services Act. The particular protections for hosts are different, but it's the same general idea.
lovich | 1 month ago

They might just let this slide to avoid rocking the boat, either out of fear, in which case they will do nothing, or to buy time if they are actually divesting from their alliance with, and economic dependence on, the US.
torlok | 1 month ago