While I agree with your point somewhat, the scale is completely different. It's not a black-and-white thing. I know a lot of people will disagree with this statement, though.
So, what happens when (not if) someone uses https://thispersondoesnotexist.com/ to get pictures of children, then generates them nude and performing sex acts? This is basically the same argument as above: the children aren't real, and the sex acts aren't real. The bigger problem is "how do you tell whether this image is proof of a child rape or just AI-generated?"
I'm moving more and more towards the "CSAM is evidence of a crime, and shouldn't be a crime in and of itself" side of things.
However, even though I'd advocate removing strict liability for possession of CSAM, I'm still for the death penalty for anyone who rapes children (we're talking prepubescent children, not 17-year-olds).
Framed as crime-scene evidence, it is kind of strange that simply possessing it, for any reason, is a serious crime worth decades in prison. These pedophiles voluntarily take pictures and videos of themselves committing crimes and share them on the internet. I am sure that if internet sleuths were allowed to investigate such material, as law enforcement is, many, many more child abusers would ultimately get caught. Distributing it should obviously still be illegal, but banning simple possession doesn't stop traffickers (obviously) and eliminates an enormous free workforce of amateur sleuths who I am sure would seriously help with the problem. They have helped solve many cold-case murders that police never could.
It's kind of creepy in a way when you think about it. No other crime scene evidence is guarded in that way. Is it illegal to possess photos/videos of someone getting stabbed, shot, beheaded, etc? Surely there are people who derive sexual pleasure from that material too, yet it proliferates and no one really cares. It's almost as if the powers that be REALLY don't want the general public to know who's behind some of this stuff. We already know very rich and powerful people are routinely involved in child sex rings...
And the AI image generation stuff makes it even more dangerous and ridiculous. We're approaching a time when someone can generate images locally with Stable Diffusion (or whatever) that look like CSAM but involve no actual children, get them into your possession (either remotely, or by dropping an SD card somewhere in your house where you won't see it), and now you are liable for decades in prison. Hopefully you don't have any enemies.
The case you cited covers only federal law. Many US states banned CGI depictions of CSAM as a result of cases like that, so computer-generated imagery and child sex dolls are not permitted across a large section of the country, for better or worse. I have no opinion on it, as I don't know whether access to items like these, which involve no real children, makes the majority of those interested more or less likely to physically offend. I am sure for some it mitigates their desires, and for others it accentuates them. I am not a psychologist.
Tech workers need to unionize everywhere and create staunch opposition to this kind of garbage. We're the experts, we're the elite, we know what's best, and we have a responsibility to lead here. Doctors won't harvest organs to sell. Pharmacists won't make you poison. Engineers shouldn't make you encryption backdoors.
You're thinking of a self-regulating professional body rather than a trade union. Unions fight employers over pay and working conditions; professional bodies represent the considered opinion of the profession and self-regulate their members, like doctors and barristers.
this is absolutely not something a union should be taking a stance on. fear of a union getting bloated and losing focus on improving working conditions and pay is one of the biggest reasons tech workers i talk to aren't too jazzed about unions in the first place.
we don't even have one and already it's supposed to be opposed to some totally irrelevant legislation. i can see why people are hesitant to unionize
This has to be a joke, like a union would earnestly evangelize privacy and data security practices. It would, no doubt, be captured by activists who care more about DEI and non technical pursuits, and bloat into a bureaucratic monster overnight. A union's job isn't to set industry standards and best practices. And what would stop the companies from using consultants or third parties to implement these backdoors? Nothing. There will always be engineers willing to build things that are 'unethical' ... because money and technical curiosity.
Unions exist to represent and advance their members' rights, which is good. They don't exist to pick up other social or political issues. When they try, it is usually a mess: there is no good chance they will pick the "right" side, it distracts from their actual mission, and it saps their support.
I'd love a union movement for IT workers. We have a small trade union in Switzerland that represents our interests, but as in most other countries, engineers rarely see themselves as blue-collar workers.
Yet another example of an insane, unworkable internet policy from the UK which will never actually be implemented. I don't know why they keep doing this.
Even if they were serious about it, they're a tiny country that lacks the clout to push around American tech giants. They lack the leverage of even Ireland, where most of these companies have their European tax base.
Quick note: the UK's current governing party has been in power for 13 years. They are completely out of ideas. The main opposition is similarly short on anything to actually offer beyond "the same as those guys, but better!". All the major issues facing the UK (debt, the economy, housing, education, Brexit, democratic collapse) are issues where people are not willing to actually support action. So all that's left for parties is either dumb shit like this or fake wedge issues (like pretending trans people are pedophiles).
It seems the UK is good at self-inflicted damage. First Brexit, now this Online Safety Bill. Brexit had half the population in support, while the safety bill hasn't garnered much attention from the public (or I've missed it). The pattern seems to be applying a generic solution to specific problems, where the blast radius of the generic solution is asymmetrically larger than the specific problems it was intended to solve.
It seems like the Online Safety Bill is low-hanging fruit compared to building affordable housing, so they concentrate on "achievable" results, even if those results are not really important at all.
Apple is not to be trusted in terms of privacy. Nobody knows what their software is actually doing. It's all closed source. Their focus on privacy in their advertisements does not mean that their software actually protects users' privacy.
This is the same argument as saying ISPs should be liable for piracy, and they're not, because of common carrier status. It dates from the early 2000s. Tech always finds a way to remain separate from its delivered content.
I bet these encryption-revealing policies will become implemented law five minutes after some research paper shows a better secret-messaging scheme is technically possible.
Big tech already knows more about kids than parents do.(1) If they want to protect kids on their platforms we don't even need to talk about end-to-end encryption yet. What would help is big tech sharing daily summaries of child on-device activity with parents. There are lots of parental control apps that do this, and yet they have had to fight Apple for permission to operate.(2) Parents don't want their kids abused either. We can definitely find a win-win situation here if we work at it.
I resent as immoral the argument that if one is for E2EE message and data encryption then you're in favor of CSAM. Why I want my messages private is my own business, but for GOVERNMENT OFFICIALS to lump me in with child predators who need to be summarily executed when caught is beyond wrong.
This is also a symptom of the laziness of police: rather than getting out and properly investigating crimes they would rather sit on their asses, eat doughnuts, drink coffee, and grep the personal data of all citizens.
tangential, but interesting how they choose to be loud about this, yet don't say or do anything about imessage downgrading to unencrypted sms with anything outside imessage. no warning about a less secure message being sent, no byline that'd be like 'unencrypted text' or like a tiny little open padlock or something, no effort to make messages encrypted where it could be possible, not even a 'snarky ad campaign' or something that'd point out that 'green bubbles? sms? that's insecure'. it's almost like... "encryptedness" and "privacy" isn't really the point. it's more of a 'marketing point' and 'ecosystem dynamics' - lock-in, network effect, and all that. actually implementing outside encrypted messages for their users? mmmeh.
My question is, if we all know now by now how privacy invasive these hardware/software vendors are, why do we still buy and support their products?
Is it really so hard for open source hardware to become a thing?
I'd trade in my iPhone for something built using RISC-V or MIPS ISA with all software written in Rust any day.
Lastly, why aren't open hardware satellites built so that people can pay for internet that doesn't route through U.S. Datacenters?
Very lastly, why not build completely decentralized web apps where all your data is encrypted and distributed so that censorship isn't possible?
"Ohhhh no but then all the bad guys will use it." Yeah well you can't censor and monitor an entire planet just because of a couple bad seeds. The bad guys also use language as a form of communication. Will we also make words illegal in the future?
It's crazy, because Apple already maintains a backdoor in the e2ee crypto of iMessage which permits them to read all iMessages and attachments and scan them serverside for CSAM (if they desire).
iMessage endpoint sync keys (for "Messages in iCloud") are backed up in the non-e2ee iCloud Backup (on by default), allowing Apple to use those sync keys from the backup to decrypt iMessages and attachments in realtime.
Note that enabling e2ee iCloud Backups (a new feature) on your device won't close this hole as the conversation partner for each iMessage is also escrowing keys.
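The escrow hole described above can be sketched as a toy model (plain Python, not Apple's actual implementation; all names and key material are hypothetical): a conversation stays readable by the provider as long as any participant's backup escrows the shared sync key.

```python
# Toy model of the key-escrow hole: even if Alice enables e2ee backups,
# her conversation partner's non-e2ee backup still escrows the shared
# "Messages in iCloud" sync key, so the provider can decrypt the thread.

class Device:
    def __init__(self, owner, e2ee_backup):
        self.owner = owner
        self.e2ee_backup = e2ee_backup
        self.sync_key = None  # shared key for the message sync container

def backup_contents(device):
    """What the cloud provider can read out of this device's backup."""
    if device.e2ee_backup:
        return {}  # keys stay on trusted devices only
    return {"messages_sync_key": device.sync_key}

def provider_can_decrypt(conversation_devices):
    """Provider can decrypt the thread if ANY participant escrows the key."""
    return any("messages_sync_key" in backup_contents(d)
               for d in conversation_devices)

alice = Device("alice", e2ee_backup=True)   # e2ee backup enabled
bob = Device("bob", e2ee_backup=False)      # default, non-e2ee backup
shared_key = "k-conversation-123"           # hypothetical key material
alice.sync_key = bob.sync_key = shared_key

print(provider_can_decrypt([alice, bob]))   # True: Bob's backup leaks the key
```

Only when every participant keeps keys off the non-e2ee backup does the function return False, which is the point of the comment above: your own settings aren't enough.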
"Advanced Data Protection" is an optional feature you can enable for iCloud that makes that not true. See the table at https://support.apple.com/en-us/HT202303 and note that iCloud Backup (including device and Messages backup) stores the keys on your trusted devices only when Advanced Data Protection is enabled:
"Advanced Data Protection: iCloud Backup and everything inside it is end-to-end encrypted, including the Messages in iCloud encryption key."
These sorts of laws make Apple not able to support Advanced Data Protection in jurisdictions that pass them.
This is only the case if you're using iCloud Backup for your phone, although yes, that's kind of annoying; presumably this is a usability issue.
Happily there's now the Advanced Data Protection option, though the docs go into great detail on the "if you forget everything, all your data is gone forever" usability problem.
Apple has done a good job balancing security and usability here.
I can see e2ee being default/recommended for people with iCloud family accounts though as every family member can be part of the account recovery process.
ghughes | 2 years ago:
And houses with walls should be banned unless the builder can guarantee no children will be harmed inside.
trinsic2 | 2 years ago:
It's a stupidly insane idea.
pierat | 2 years ago:
And combine that with this ruling that Japanese hentai depicting minors is legal to possess and sell: https://en.wikipedia.org/wiki/United_States_v._Handley
apexalpha | 2 years ago:
Glad to see they now oppose it, but it's hard to gauge how sincere that is given their history of trying to implement it themselves.
amelius | 2 years ago:
Nice thought, except most of us are morally no better than expensive prostitutes.
> Doctors won't harvest organs to sell.
Many doctors take bribe money from Big Pharma. Also some doctors do crazy stuff like recommend surgeries to fund a new swimming pool.
JKCalhoun | 2 years ago:
I certainly don't want this mantle.
Everyone wins if, in layman's terms, you can instead make clear to non-technical people what the issue is.
iamabanana6 | 2 years ago:
Maybe about the technology and code. But definitely not about effects on society.
sbuk | 2 years ago:
Wow. That is possibly the most "orange site" type post I have ever read here.
The minute you claim to be the elite is the minute the commons lose all trust in you. Appeal to authority is not the way to go here.
hospitalJail | 2 years ago:
Let me know where to sign up.
FpUser | 2 years ago:
I did not know Narcissus was still alive and hanging out here.
nologic01 | 2 years ago:
I guess since Orwell was British they think they are on top of this.
LatteLazy | 2 years ago:
That is why this keeps coming back.
heavenlyblue | 2 years ago:
PS: I am against online safety bill.
okeuro49 | 2 years ago:
I voted remain, however I'm still a Eurosceptic. The two issues aren't comparable.
cwoolfe | 2 years ago:
References: (1) https://www.forbes.com/sites/kashmirhill/2012/02/16/how-targ... (2) https://www.kaspersky.com/blog/apple-fas-complaint/26017/
sneak | 2 years ago:
I assume this is just brand marketing for Apple.
cced | 2 years ago:
Waiting for this to inevitably pass, and for them to then get to play the good-guy card.
lngnmn2 | 2 years ago:
Oh, could it be that they store all the messages as plain text on the server, and "encrypted" is just a meme referring only to the TLS connection?
Rhetorical question, of course.