top | item 46945807

haritha-j | 20 days ago

> and will see content filters for any content Discord detects as graphic or sensitive.

I didn't even realise Discord scans all the images that I send and receive.

pixl97|20 days ago

Really, I've come to the conclusion that anything I send out of my LAN is probably kept on a server forever, ingested by LLMs, and indexed to be used against me in perpetuity, regardless of what the terms and conditions of the site I'm using actually say.

kmfrk|20 days ago

Speaking of hosting, Discord used to be one of the biggest (inadvertent) image hosts, so they might have set up the system more to reduce legal exposure than to monitor conversations per se.[1]

A lot of the internet broke the day they flipped that switch off.

Weren't external Tumblr hotlinks also a thing back in the day?

[1]: https://www.reddit.com/r/discordapp/comments/16uy0an/not_sur...

palata|20 days ago

To be fair, the terms and conditions probably say that they can do whatever they want with that data :-).

Gud|20 days ago

Don’t forget all the government creeps snooping on the wires.

jsheard|20 days ago

Pretty much every non-E2EE platform is scanning every uploaded image for CSAM at least, that's a baseline ass-covering measure.
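As a rough illustration of that kind of baseline scanning: services typically hash each uploaded file and check it against a database of hashes of known illegal material. The sketch below is a toy, assuming a hypothetical `KNOWN_BAD_HASHES` blocklist and using plain SHA-256 for simplicity; real systems use perceptual hashes (PhotoDNA-style) so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files.
# (Illustrative only; real deployments match perceptual hashes,
# not exact cryptographic hashes.)
KNOWN_BAD_HASHES = {
    # sha256 of the placeholder bytes b"known-bad-image"
    hashlib.sha256(b"known-bad-image").hexdigest(),
}

def is_flagged(upload: bytes) -> bool:
    """Return True if the upload's hash is on the blocklist."""
    digest = hashlib.sha256(upload).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(is_flagged(b"known-bad-image"))  # matches the blocklist
print(is_flagged(b"holiday-photo"))    # does not match
```

The exact-hash version only catches byte-identical copies, which is why production systems trade it for fuzzier perceptual matching.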

mapt|20 days ago

And E2EE platforms like Mega are now being censored on some platforms specifically because they're E2EE, and so the name itself must be treated as CSAM.

As people who want to talk about words like "megabytes" or "megapixels" or "megaphones" or "Megaman" or "Megan" on Facebook are finding out.

lpcvoid|20 days ago

Well, it's not E2EE, so what did you expect? Nothing you do on Discord is private; everything is screened, categorized, and readable by third parties.

RegnisGnaw|20 days ago

They have to, at least for CSAM.

palata|20 days ago

Everything that is not end-to-end encrypted understandably has to do it.