ncw96's comments

ncw96 | 4 years ago | on: Show HN: Neural-hash-collider – Find target hash collisions for NeuralHash

Currently, most likely.

I don’t believe Apple has said whether or not they send them in their initial referral to NCMEC, but law enforcement could easily get a warrant for them. iCloud Photos are encrypted at rest, but Apple has the keys.

(Many have speculated that this CSAM local scanning feature is a precursor to Apple introducing full end-to-end encryption for all of iCloud. We’ll see.)

ncw96 | 4 years ago | on: Show HN: Neural-hash-collider – Find target hash collisions for NeuralHash

Apple has outlined[1] multiple levels of protection in place for this:

1. You have to reach a threshold of matches before your account is flagged.

2. Once the threshold is reached, the matched images are checked against a different perceptual hash algorithm on Apple servers. This means an adversarial image would have to trigger a collision on two distinct hashing algorithms.

3. If both hash algorithms show a match, then “visual derivatives” (low-resolution versions) of the images are inspected by Apple to confirm they are CSAM.

Only after these three criteria are met is your account disabled and referred to NCMEC. NCMEC will then do their own review of the flagged images and refer to law enforcement if necessary.
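To make the layered design concrete, here is a rough sketch of the control flow in Python. This is not Apple's code; the threshold value, hash functions, and helper names are all invented for illustration, and the real system involves cryptographic blinding (private set intersection) that is omitted here.

```python
# Illustrative sketch of the three layers described above.
# THRESHOLD is invented; Apple has not published the real value.
THRESHOLD = 30

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def is_match(img_hash: int, known_hashes: set, max_dist: int = 0) -> bool:
    """A perceptual hash 'matches' if it is within max_dist bits of a known hash."""
    return any(hamming_distance(img_hash, h) <= max_dist for h in known_hashes)

def account_flagged(photos, known_a, known_b, hash_a, hash_b) -> bool:
    # Layer 1: count matches against the first perceptual hash (on-device).
    candidates = [p for p in photos if is_match(hash_a(p), known_a)]
    if len(candidates) < THRESHOLD:
        return False
    # Layer 2: re-check candidates with an independent server-side hash,
    # so an adversarial image must collide on two distinct algorithms.
    confirmed = [p for p in candidates if is_match(hash_b(p), known_b)]
    if len(confirmed) < THRESHOLD:
        return False
    # Layer 3: human review of visual derivatives would happen here.
    return True
```

The point of the structure is that a single-algorithm collision (like the ones this tool generates) fails at layer 2, and even a dual collision still faces manual review at layer 3.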

[1]: https://www.apple.com/child-safety/pdf/Security_Threat_Model...

ncw96 | 4 years ago | on: Expanded Protections for Children

Sorry for the confusion — I was referring to just the CSAM hash feature that uploads results to iCloud.

There is also scanning for nudity in the Messages app, but those scans happen on-device and the photos stay on-device even if nudity is detected.

ncw96 | 4 years ago | on: Expanded Protections for Children

A summary of the photo scanning system:

- Only applies to photos uploaded to iCloud

- Matching against a known set of CSAM (Child Sexual Abuse Material) hashes occurs on-device (as opposed to the on-server matching done by many other providers)

- Multiple matches (unspecified threshold) are required to trigger a manual review of matched photos and potential account suspension
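The matching is perceptual rather than cryptographic: visually similar images are meant to produce similar hashes, so small edits don't defeat the match. NeuralHash is a neural-network-based hash, but a toy difference hash ("dHash") illustrates the idea; everything below is an analogy, not the actual algorithm.

```python
# Toy difference hash: encodes left-to-right brightness gradients as bits.
# Visually similar images (small brightness changes that preserve gradients)
# produce identical hashes, unlike a cryptographic hash such as SHA-256.
def dhash(pixels) -> int:
    """pixels: 2D list of grayscale values (e.g. an 8x9 downscaled image).
    Returns an integer whose bits record whether each pixel is brighter
    than its right-hand neighbor."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits
```

Because only the *relative* ordering of neighboring pixels matters, brightening or slightly editing an image tends to leave the hash unchanged, which is what makes known-image matching robust to re-encoding and resizing.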

ncw96 | 4 years ago | on: Eddy Cue wanted to bring iMessage to Android in 2013

There are some features for group messaging on iMessage that aren’t available in Group MMS. If you add a non-iMessage user to your group, the group downgrades to using Group MMS, which does still work for basic messaging, but the group loses all of its iMessage-exclusive features.

ncw96 | 4 years ago | on: Ask HN: Why do iPad Pros with 1TB SSD and up have double the RAM

I think the really interesting question here is why an iPad Pro has 16 GB of RAM at all.

The previous generation of iPad Pros maxed out at 6 GB, so this is quite a jump.

Adding Thunderbolt support is also a bit of a mystery.

I suspect this will all become clear at WWDC in June when Apple announces iPadOS 15.

ncw96 | 5 years ago | on: Librem Tunnel Is Leaving iOS

Apple’s in-app purchase/subscription rules only apply to digital goods — not things like banks, Uber, food delivery, etc.

ncw96 | 5 years ago | on: Librem Tunnel Is Leaving iOS

Apple has a well-known exception to the in-app subscription requirement for “reader” apps, a vaguely defined category that includes Netflix, Spotify, Dropbox, and seemingly whatever else is convenient for Apple.

Apple did not lift this requirement for the Hey email app during the controversy over it last year. Instead, they reached a compromise: Hey would offer a free trial of its service inside the app, bringing it into compliance with the rule that an app must have some functionality without an account.
