They made C memory safe? This is a big thing to gloss over in a single paragraph. Does anyone have extra details on this?
> On devices with iOS 14 and iPadOS 14 or later, Apple modified the C compiler toolchain used to build the iBoot bootloader to improve its security. The modified toolchain implements code designed to prevent memory- and type-safety issues that are typically encountered in C programs. For example, it helps prevent most vulnerabilities in the following classes:
> • Buffer overflows, by ensuring that all pointers carry bounds information that’s verified when accessing memory
> • Heap exploitation, by separating heap data from its metadata and accurately detecting error conditions such as double free errors
> • Type confusion, by ensuring that all pointers carry runtime type information that’s verified during pointer cast operations
> • Type confusion caused by use after free errors, by segregating all dynamic memory allocations by static type
Sort of. From my understanding they’ve been heavily using Clang with `-fbounds-safety` checks to insert bounds checks into functions. I think there was also work done to retrofit them into existing code. The memory tagging in newer processors helps avoid overflow exploitation, too. Maybe someone can jump in and add more details.
Apple's commitment to privacy and security is really cool to see. It's also an amazing strategic play that they are uniquely in the position to take advantage of. Google and Meta can't commit to privacy because they need to show you ads, whereas Apple feels more like a hardware company to me.
https://james.darpinian.com/blog/apple-imessage-encryption/

My current understanding of the facts:

1. Google defaults to encrypted backups of messages, as well as e2e encryption of messages.
2. Apple defaults only to e2ee of messages, leaving a massive backdoor.
3. Closing that backdoor is possible for the consumer, by enabling ADP (Advanced Data Protection) on your device. However, this makes no difference, since 99.9% of the people you communicate with will not close the backdoor. Thus, the only way to live is to assume that all the messages you send via iMessage will always be accessible to Apple, no matter what you do.
It's not like overall I think Google is better for privacy than Apple, but this choice by Apple is really at odds with their supposed emphasis on privacy.
I still like to encourage people to watch all of https://www.youtube.com/watch?v=BLGFriOKz6U&t=1993s (from Apple’s head of Security Engineering and Architecture) for the details about how iCloud is protected by HSMs, rate limits, etc., but especially the time-linked section. :)
Can someone explain what the real difference is to a consumer user between an iPhone and a Pixel or a Samsung device? Across all services, push notifications, and device backups.
Both promise security, Apple promises some degree of privacy. Google stores your encryption keys, and so does Apple unless you opt in for ADP.
Is it similar to Facebook Messenger (encrypted in transit and at rest but Meta can read it) and Telegram (keys owned by Telegram unless you start a private chat)?
There are things Pixels do that iPhones don’t, e.g., you get notified when a local cell tower picks up your IMEI. It’s largely meaningless since they all do it, but you can also enable a higher level of security to avoid 2G. Not sure it’s meaningful, but it’s a nice-to-have.
That people fall for this corporate BS while Tim Cook is giving gold bars to Trump, and dining and dancing with him while people are being murdered on the streets by ICE, is just amazing to me.
I still like their hardware. But let’s not pretend that there is any part of Trump’s body that Cook won’t kiss, or that he won’t sell out his customers. If Trump asked Cook to put a backdoor in iPhones, or threatened to impose tariffs on Apple otherwise, Cook would do it in a minute.
I call BS on this whole Apple privacy thing; it's nothing but propaganda.
Two years ago I was locked out of my MacBook Pro.
Then I just booted into some recovery mode and just... reset the password!?
Sure, macOS logged me out of (most) apps and websites, but every single file was there, unencrypted!
I swear, people who keep boasting about the whole Apple privacy thing have absolutely no clue what they're talking about; they're nothing short of tech-illiterate charlatans. But God, the propaganda works.
And don't get me started on iMessage.
Apple has since confirmed in a statement provided to Ars that the US federal government “prohibited” the company “from sharing any information,” but now that Wyden has outed the feds, Apple has updated its transparency reporting and will “detail these kinds of requests” in a separate section on push notifications in its next report.
We know now that it was all marketing talk. Apple didn’t like Meta, so they spun up a bunch of obstacles. Apple has used, and would use, your data for ads, models, and anything that keeps the shareholders happy. And we don’t know the half of the story where, as a US corp, they’re technically obliged to share data from the non-E2EE iCloud syncs of every iPhone.
It sucks that Apple decided to monetize the iPhone the way they have, by controlling the owner's ability to install software of their choosing. Ignoring the arguments one could make about this making it "more secure", it's clearly disrespectful to the power user who doesn't want to beg Apple's permission to use their own computer. I'll grant that their security claims are sound, but it's hard to take them seriously regarding privacy arguments.
Our choices are either (A) an OS monetized by tracking user interaction and activity, or (B) one monetized by owning the basic act of installing software on the device. Both of these options suck, and I struggle to give up the more open option for one that might be more secure.
> Ignoring the arguments one could make about this making it "more secure" it's clearly disrespectful to the power user that doesn't want to beg Apple's permission to use their computer. I'll grant them their security claims are sound,
I wouldn't say they are sound. First, macOS provides the freedom to install your own applications (OK, they need to be signed and notarized if the quarantine attribute is set), and it's not the case that the Mac has mass malware infestations. Second, the App Store is full of scams, so "App Store safe, external unsafe" is a false dichotomy.
Apple uses these arguments, but of course the real reason is that they want to continue to keep 30% of every transaction made on an iPhone or iPad. This is why they have responded to the DMA with a lot of malicious compliance that makes it nearly impossible to run an alt-store financially.
(Despite my qualms about not being able to install apps outside the app store, I do think they are doing a lot of good work of making the platform more secure.)
The OP is about security and you specifically ignore security when bringing up a common flamewar topic for which much discussion has already been had on this site. Perhaps such discussion could at least be limited to articles where it is less tenuously related.
You can request a downloadable copy of any/all of the data that Apple has associated with your account at https://privacy.apple.com.
This apparently includes retrieving all photos from iCloud in chunks of a specified size, which seems an infinitely better option than attempting to download them through the iCloud web interface, which caps downloads at 1,000 photos at a time at less-than-impressive speeds.
Somehow, they conveniently forgot to mention these "security" features:
1. Constant popups about "application requesting access" on macOS. These often happen without any user activity.
2. If you leave the permission popup open for some time (because it's on a different screen), it auto-denies. And then you won't be able to find ANY mention of it in the UI.
3. macOS developers can't be assed to fix mis-features, like the inability to bind low ports on localhost without root access (you can open a listening low port on 0.0.0.0, but you can't open 127.0.0.1:80).
But all the software is closed source, and there is little to no opportunity to verify all these security claims. You don't have the encryption keys, so effectively the data is not under your control.
If you want to see security done well (or at least better), see the GrapheneOS project.
GrapheneOS also doesn't give you the encryption keys. If you run the official version, there is no way for you to extract the data from your device at all beyond what app developers will let you access. This means that you do not own the data on your device. The backups are even less effective than Apple's, although they say they will work on it.
The developers also appear to believe that apps have a right to inspect the trustworthiness of the user's device, by offering guidance so that apps will trust GrapheneOS's keys [1], locking out users who maintain their freedom by building their own forks.
It's disheartening that a lot of security-minded people seem to be fixated on the "AOSP security model", without realizing or ignoring the fact that a lot of that security is aimed at protecting the apps from the users, not the other way around. App sandboxing is great, but I should still be able to see the app data, even if via an inconvenient method such as the adb shell.
Given that the A19 and M5 processors with MIE (EMTE) were only recently introduced, I wonder how extensively macOS/iOS make use of the hardware features. Is it something that will take several years to show its benefit, or does MIE provide thorough protection today?
Apple’s implementation of MTE is relatively limited in scope compared to GrapheneOS (and even stock Android with advanced security enabled) as it’s hardware intensive and degrades performance. I imagine once things get fast enough we could see synchronous MTE enabled everywhere.
It is curious at the moment though that enabling something like Lockdown Mode doesn’t force MTE everywhere, which imo it should. I think the people who are willing to accept the compromises of enabling that would likely also be willing to tolerate the app crashes, worse performance etc that would come with globally enabled MTE.
I think all of the kernel allocators and most (?) system processes in iOS 26 have MIE enabled, as does libpas (the WebKit allocator), so it’s already doing quite a lot.
> Since 2012, Mac computers have implemented numerous technologies to protect DMA, resulting in the best and most comprehensive set of DMA protections on any PC.
Macs are PCs now? This coming directly from Apple is hilarious.
It's not really possible to make a direct comparison, given that a big chunk of the features are baked into the silicon, or are architecture-level choices.
The ones I remember most affecting performance were zeroing allocated memory and the Spectre/Meltdown fix. Also, the first launch of a new app is slow in order to check the signature. Whole disk encryption is pretty fast today, but probably is a bit slower than unencrypted. The original FileVault using disk images was even slower.
What is "Google Messages"? I can't count the number of articles people have written over time about how many first-party messaging apps Google themselves have put out (and then put down), not to mention what messaging apps get shoveled on by third-party android integrators.
> the main reason a message wouldn't be properly end-to-end encrypted in Google's Messages app is when communicating with an iPhone user, because Apple has dragged their feet on implementing RCS features in iMessage
(or with any other android user who isn't using a first-party device / isn't using this one app)
> [...] Android's equivalent cloud backup service has been properly end-to-end encrypted by default for many years. Meaning that you don't need to convince the whole world to turn on an optional feature before your backups can be fully protected.
You make it out to seem that it's impossible for Google to read your cloud backups, but the article you link to [0] earlier in your post says that "this passcode-protected key material is encrypted to a Titan security chip on our datacenter floor" (emphasis added). So they have your encrypted cloud backup, and the only way to get the key material to decrypt it is to get it from an HSM in their datacenter, every part of which and the access to which they control... sounds like it's not really any better than Apple, from what I'm reading here. Granted, that article is from 2018 and I certainly have not been keeping up on android things.
You can enable Advanced Data Protection to address that issue with iMessages.
Giving users an option between both paths is usually best. Most users care a lot more that they can’t restore a usable backup of their messages than they do that their messages are unreadable by the company storing them.
I used to work at a company where our products were built around encryption. Users here on HN are not the norm. You can’t trust that most users will save recovery codes, encryption seed phrases, etc in a manner that will be both available and usable when they need them, and then they tend to care a lot less about the privacy properties that provides and a lot more that they no longer have their messages with {deceased spouse, best friend, business partner, etc}.
This is your blog post, so I'll ask you a question. What are you trying to state in Belief #1? The message is unclear to me with how it's worded:
> In this table, in the "iCloud Backup (including device and Messages backup)" row, under "Standard data protection", the "Encryption" column reads "In transit & on server". Yes, this means that Apple can read all of your messages out of your iCloud backups.
In addition to the things you mentioned, there's certainly a possibility of Apple attaching a virtual "shadow" device to someone's Apple ID with something like a hide_from_customer type flag, so it would be invisible to the customer.
This shadow device would have its own keys to read messages sent to your iCloud account. To my knowledge, there's nothing in the security model to prevent this.
Apple's head of SEAR (Security Engineering & Architecture) just gave the keynote at HEXACON, a conference attended by the likes of NSO Group, the company behind Pegasus.
That doesn't seem like avoiding the elephant in the room to me. It seems like very much acknowledging the issue and speaking on it head-on.
Pegasus isn't magic. It exploits security vulnerabilities just like everything else. Mitigating and fixing those vulnerabilities is a major part of this document.
Why? The obvious conclusion is that Apple is doing everything in its power to make the answer “no.”
You might as well enumerate all the viruses ever made on Windows, point to them, and then ask why Microsoft isn’t proving they’ve shut them all down yet in their documents.
They made a dialect of C with bounds safety; see https://clang.llvm.org/docs/BoundsSafety.html#overview and the Firebloom write-up at https://saaramar.github.io/iBoot_firebloom/ for an analysis of its use in iBoot.
As was demonstrated in LA, it's starting to have significant civil rights consequences.
[1] https://grapheneos.org/articles/attestation-compatibility-gu...
I would really like to see a benchmark with and without security measures.
[0] https://security.googleblog.com/2018/10/google-and-android-h...
There is no point creating such a document if the elephant in the room is not addressed.
https://www.youtube.com/watch?v=Du8BbJg2Pj4