Still waiting for Apple to provide end-to-end encryption on iCloud Backup for devices. Their documentation on this has always seemed intentionally vague.
End-to-end encrypted data (per https://support.apple.com/en-us/HT202303):
- Apple Card transactions (requires iOS 12.4 or later)
- Home data
- Health data (requires iOS 12 or later)
- iCloud Keychain (includes all of your saved accounts and passwords)
- Maps Favorites, Collections and search history (requires iOS 13 or later)
- Memoji (requires iOS 12.1 or later)
- Payment information
- QuickType Keyboard learned vocabulary (requires iOS 11 or later)
- Safari History and iCloud Tabs (requires iOS 13 or later)
- Screen Time
- Siri information
- Wi-Fi passwords
- W1 and H1 Bluetooth keys (requires iOS 13 or later)
They won't do this. It's their workaround for giving law enforcement access to the devices' data.
They can claim that the device is secure and always encrypted, that all the messaging is encrypted, and that they don't collect user data. This is all true (I assume, but did not audit).
If you care about security, all you have to do is turn off iCloud backup, and everything is secure. If you don't care, well then you have a great feature.
They upload messages, etc., to iCloud in a form Apple can read, so if law enforcement asks, they can still hand over the juicy data. They don't need to backdoor the iPhone for the government, which was a major PR issue a few years ago.
It's intentionally vague because they want people to read that page and think "oh, it's all encrypted, it's safe", and not realize that they intentionally preserve this backdoor so that they can provide data to the FBI at any time, with or without a warrant, at the FBI's explicit request:
https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...
Apple provided user data on over 30,000 users in 2019 to the US federal government without a warrant or probable cause, per Apple's own transparency report (see FISA orders). All the feds have to do is order the data from Apple, and they get all of it, on anyone they like.
You're going to be waiting a long time; it's a design goal for Apple (and by extension the feds) to be able to read your every stored text, iMessage, and iMessage attachment out of your device backup without your consent/knowledge.
It's not really that different from the situation in China, where Apple provides the same sort of backdoors to the CCP to be able to sell devices there. (There, as I understand it, the CCP requires that iCloud data be physically stored on state-owned and state-operated hardware.)
You can use clouds like these with your own cryptography software. It's a matter of using something standard while never giving the cloud provider the keys. As long as they allow you to specify the backup location (which I don't know if they do), this should be doable. If they don't allow this, that is a more severe issue.
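As a sketch of that idea, assuming the provider will store opaque blobs: encrypt locally and upload only ciphertext. The keystream construction below (HMAC-SHA256 in counter mode) is used purely to keep the example standard-library-only; a real setup should use a vetted AEAD cipher from a library such as `cryptography`.

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """PRF-based keystream: HMAC-SHA256 over (nonce || counter)."""
    blocks, counter = [], 0
    while sum(len(b) for b in blocks) < length:
        blocks.append(hmac.new(key, nonce + counter.to_bytes(8, "big"),
                               hashlib.sha256).digest())
        counter += 1
    return b"".join(blocks)[:length]

def encrypt_backup(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt locally; only this blob ever reaches the cloud provider."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt_backup(key: bytes, blob: bytes) -> bytes:
    """Reverse of encrypt_backup; runs only on your own machine."""
    nonce, ct = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))
```

Because the key never leaves your machine, the provider (or anyone who subpoenas them) holds only ciphertext.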
"For certain sensitive information, Apple uses end-to-end encryption" - there's a lot of important user-generated data from Apple apps that is not end-to-end encrypted.
Frankly, I'd like to see them go even further and put in place a policy that all user-created-and-consumable content can only leave the device in end-to-end encrypted form, with the keys managed by my Apple ID so that not even Apple can decrypt it.
They could introduce this at the API level without having to dictate storage providers. If a web version of an app needs to show my photos, it can let the end user's browser decrypt them. This works for private data as well as 1:1 and 1:many shared data.
I should have a choice about who hosts my encrypted data, who manages my keys/identity, and who provides a service that uses that data. Let's get back to providing value through services and away from leeching value through hoarding data and controlling protocols.
Yes, this will force companies to change their business models if they rely on access to my data. Will it make for better software? Yes, hands down. More companies will be able to compete, and we'll start to see more creative solutions.
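The private/1:1/1:many sharing model described above is typically built as envelope encryption: encrypt the content once under a random content key, then wrap that key separately for each recipient. A toy sketch follows; the XOR "wrap" is a stand-in for real public-key wrapping (e.g. X25519 plus an AEAD), and all names are made up.

```python
import secrets

def wrap_key(content_key: bytes, recipient_secret: bytes) -> bytes:
    # Stand-in for real key wrapping (XOR is NOT real cryptography).
    return bytes(a ^ b for a, b in zip(content_key, recipient_secret))

unwrap_key = wrap_key  # XOR is its own inverse

def share_content(recipient_secrets: dict) -> tuple:
    """One ciphertext under one content key, plus one wrapped key per recipient."""
    content_key = secrets.token_bytes(32)
    wrapped = {name: wrap_key(content_key, secret)
               for name, secret in recipient_secrets.items()}
    return content_key, wrapped
```

The point of the structure: the host stores the ciphertext and the wrapped keys, but can recover the content key only with a recipient's secret, so hosting, key management, and the service itself can be provided by different parties.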
I don't see anything about the "Unlock your iPhone with your Watch" feature that iOS 14.5 is going to have [0]. I'd be interested in reading the in-depth security considerations they had. It's also currently a mystery whether this feature does a partial Face ID scan in addition to requiring an unlocked Watch.
0: https://www.macrumors.com/2021/02/01/iphone-apple-watch-unlo...
To enable the Unlock with Apple Watch feature, open the Settings app on your iPhone, go to “Face ID & Passcode”, and flip the “Unlock with Apple Watch” toggle. Once enabled, your Apple Watch will be able to authenticate your iPhone as long as the following conditions are met:
- Face ID detects a mask
- Your Apple Watch is nearby
- Your Apple Watch is on your wrist
- Your Apple Watch is unlocked
- Your Apple Watch has a passcode enabled
https://9to5mac.com/2021/02/04/iphone-face-id-unlock-apple-w...
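In effect the feature is a simple conjunction of checks: every condition in the list above must hold at once. A tiny sketch (the names are illustrative, not Apple's API):

```python
# All five conditions from the list above must hold simultaneously.
REQUIRED_CONDITIONS = (
    "face_id_detects_mask",
    "watch_nearby",
    "watch_on_wrist",
    "watch_unlocked",
    "watch_has_passcode",
)

def watch_can_unlock(state: dict) -> bool:
    """True only if every required condition is satisfied."""
    return all(state.get(cond, False) for cond in REQUIRED_CONDITIONS)
```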
It does not do a partial Face ID scan - I had a friend unlock it for me, and she's a different gender, 15 years younger, and Asian. If it does, it's completely ineffectual.
It's nice to see that the Apple Security Research Device (i.e. the iPhone with root access) hasn't been forgotten [0]. They even describe the additional security protections they added to ensure an attacker couldn't pass the device off to someone who thought it was a regular iPhone (for example, the phone won't cold boot without being plugged into a charger, and when you do plug it in, it shows the words "Security Research Device" before booting XNU in verbose mode).
0: https://support.apple.com/guide/security/apple-security-rese...
As an admin, I’m bummed that the new M1s remove a remote-management function I always loved.
From what a sales/dev person for a SaaS MDM app for macOS told me, the M1s do not have a lock-device feature. You can only wipe the device.
If an employee was terminated, we could remotely send a lock command with a numeric code. The only way to remove the lock is to get the code from us or have Apple reset it in person. For the in-person visit, you have to prove you’re the owner or have authorization from the company before Apple will unlock it.
My only option now is to wipe it. So now I have to find a cloud backup provider to back these devices up in case I need an important file from an employee who decides to go rogue.
I’d like to know how I’m still logged in to Twitch even after deleting and reinstalling the app. Or how Spotify offered to link to an Alexa device I was setting up after I installed the Alexa app.
Twitch must have saved your login details/tokens in the Keychain. Unfortunately, unless the app deletes these entries itself, iOS does not automatically remove this information when the app is uninstalled. That is one way for apps to check whether a user is installing the app for the first time.
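The first-install trick described above can be sketched with two dicts standing in for the real stores: one wiped on uninstall (like UserDefaults) and one that survives (like the Keychain). All names are illustrative, not Apple's APIs.

```python
import secrets

keychain = {}  # stand-in for the iOS Keychain: survives app uninstall

def app_first_launch(user_defaults: dict) -> str:
    """Classify a launch; `user_defaults` is wiped when the app is removed."""
    if "install_token" in user_defaults:
        return "normal launch"
    if "install_token" in keychain:
        # UserDefaults is empty but the Keychain entry survived,
        # so the app was uninstalled and reinstalled.
        user_defaults["install_token"] = keychain["install_token"]
        return "reinstall detected"
    token = secrets.token_hex(16)
    user_defaults["install_token"] = token
    keychain["install_token"] = token
    return "first ever install"
```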
As for Alexa, it might be a totally different approach: the ability to find devices on your network, possibly combined with Bluetooth beacons.
Fortunately, an app needs to be fully installed to read this information, unlike a Facebook, Twitter, or Google Analytics library (framework), which can track you across every app that embeds the same framework.
For the second one: with iOS 14, Apple shows a privacy prompt before an app can connect to other devices on your network, and you can simply deny it.
Detecting the Alexa app on the device used to be possible, but these days it would not go unnoticed by Apple without some coordination between Amazon and Spotify.
For the Twitch issue, it's likely that Twitch stored a secret in your Keychain that persists. If you have a Mac, you can enable iCloud Keychain on your devices to sync and explore the contents. Search for Twitch and delete the entry(ies).
The big one for me is ZFS. Mac has a fantastic ZFS port, and you're never going to run that in user space outside of some terribly crippled implementation.
What is really egregious is that Apple still touts the T2's security benefits on their site and completely ignores the fact that it can be compromised. This makes it harder to take Apple's hardware security claims at face value, given what they know about the T2 versus what they put out in their resources.
- Some sort of “checked C” in iBoot: https://support.apple.com/guide/security/memory-safe-iboot-i...
- Data is encrypted with your security policy, so if that changes (e.g. you disable SIP) it doesn’t expose it: https://support.apple.com/guide/security/sealed-key-protecti...
- Details on what the SRD is and how it works: https://support.apple.com/guide/security/apple-security-rese...
I don't really know why anyone would take Apple's hardware security claims at face value after the unfixable T2 flaw.
edit: more links, though they're all pretty similar.
https://www.wired.com/story/apple-t2-chip-unfixable-flaw-jai...
https://appleinsider.com/articles/20/10/05/apples-mac-t2-chi...
https://www.zdnet.com/article/hackers-claim-they-can-now-jai...
https://www.theregister.com/2020/10/08/apple_t2_security_chi...
edit 2:
If this is wrong, I'd like to know the truth! Really! Was it a hoax? Is there a patch? What happened?