
leobg | 1 month ago

So I guess in the 90s they would’ve sued Adobe for not putting spyware into Photoshop?

If you believe in democracy, and the rule of law, and citizenship, then the responsibility obviously lies with people who create and publish pictures, not the makers of tools.

Think of it. You can use a phone camera to produce illegal pictures. What kind of a world would we live in if Apple was required to run an AI filter on your pics to determine whether they comply with the laws?

A different question is whether X actually hosts generated pictures that are illegal in the UK. In that case, X acts as a publisher, and you can sue it, along with the creator, for removal.

Symbiote | 1 month ago

Photoshop does have (since the late 1990s or so) algorithms to detect and prevent editing images of currency.

The power of AI tools is so much greater than that of a non-AI image editor that there's a genuine debate over who -- the user, or the operator of the AI -- is creating the image.

https://en.wikipedia.org/wiki/EURion_constellation
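The EURion constellation is a fixed geometric arrangement of five small circles printed on banknotes; the actual detector shipped in image editors is undisclosed, but the core idea -- matching detected circle centres against a template pattern, invariant to rotation, scale, and translation -- can be sketched. A minimal illustration, using made-up template coordinates (the real geometry and matching algorithm are assumptions here, not the published one):

```python
import itertools
import math

# Hypothetical template: relative centre positions of the five circles.
# The real EURion geometry (and any vendor's detector) is undisclosed;
# these coordinates are illustrative only.
TEMPLATE = [(0.0, 0.0), (1.0, 0.3), (2.0, -0.2), (0.5, 1.2), (1.7, 1.0)]

def signature(points):
    """Scale-invariant signature: sorted pairwise distances, normalised
    by the largest distance. Invariant to translation, rotation, scale."""
    d = sorted(math.dist(a, b) for a, b in itertools.combinations(points, 2))
    return [x / d[-1] for x in d]

def matches_template(points, tol=0.05):
    """True if five detected circle centres match the template geometry
    within a relative tolerance."""
    if len(points) != 5:
        return False
    return all(abs(a - b) <= tol
               for a, b in zip(signature(points), signature(TEMPLATE)))

def transform(p, scale, theta, dx, dy):
    """Apply a similarity transform (scale, rotate, translate) to a point."""
    x, y = p
    c, s = math.cos(theta), math.sin(theta)
    return (scale * (x * c - y * s) + dx, scale * (x * s + y * c) + dy)

# A rotated, scaled, translated copy of the template still matches,
# while an arbitrary five-point set does not:
moved = [transform(p, 3.0, 0.7, 10.0, -4.0) for p in TEMPLATE]
print(matches_template(moved))                     # True
print(matches_template(TEMPLATE[:4] + [(9, 9)]))  # False
```

In a real detector, the candidate points would come from a circle-detection pass over the scanned image rather than being supplied directly.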

chrisjj | 1 month ago

> The power of the AI tools is so great in comparison to a non-AI image editor that there's probably debate on who -- the user, or the operator of the AI -- is creating the image.

Compute power is irrelevant. What's relevant in law is who is causing the generation, and that's obviously the operator.

graemep | 1 month ago

There is a big difference between putting spyware into software that runs locally and monitoring how people use a service running on your own computers. The former means you have to exfiltrate data; the latter is monitoring data you already have.

Photoshop in the 90s was the former, Grok is the latter.

SteveMqz | 1 month ago

Apple does run software for detecting CSAM on pictures users store in the cloud.

chrisjj | 1 month ago

That's to ensure Apple compliance, not user compliance.