item 25095438

Does Apple really log every app you run? A technical look

621 points | jacopoj | 5 years ago | blog.jacopo.io

344 comments

[+] the_duke|5 years ago|reply
While other posts on this topic are too alarmist, this one is way too Apple apologetic for my taste.

* There is no information on how often the validation happens. All this investigation concludes is that it doesn't happen when closing and immediately re-opening an app. Is it every week? Every reboot? Every hour? If it's less, that's essentially the same as doing it on every launch.

* There is no justification for sending this information in cleartext. I don't follow the "browsers and loops" argument. This is a system service that only has to trust a special Apple certificate, which can be distributed via other side channels.

* Many developers only publish a single app or a certain type of app. So it still is a significant information leak. It's really not much different from sending an app-specific hash. Think: remote therapy/healthcare apps, pornographic games, or Tor - which alone could get you into big trouble or on a watchlist in certain regions.

I assume they will push a fix with better timeouts and availability detection.

But Apple simply has to find a more privacy-aware system design for this problem, one that does not leak this kind of data without an opt-in and also does not impact application startup times. (Revocation lists?)

I imagine this data might just be too attractive not to have. Such a "lazy" design is hard to imagine coming out of Apple otherwise.

[+] WesolyKubeczek|5 years ago|reply
Most "alarmist" articles have two points you cannot really ignore, not if you don't want to end up living in interesting times one day.

1) Even plain access logs — basically what an HTTP request or a TCP connection can tell you — are a lot. Gather those for a couple of days, and you have a good map of the user. More so if you have an ID of the machine and the actual executable hash.

2) "But we are the good guys" is a non-defense. Good guys can turn bad, they can be coerced by the bad guys, and

3) since the requests fly out in plain text, there is an unknown number of questionably-aligned guys in between capable of sniffing your data. You only need one bad enough guy to get into serious trouble, if that's what they want.

This is not alarmist. It's just common sense. The same common sense that you use to avoid certain neighborhoods at certain times of night.

[+] _qulr|5 years ago|reply
> There is no information on how often the validation happens.

I wrote a blog post about this. My analysis indicates that Developer ID OCSP responses were previously cached for 5 minutes, but Apple changed it to half a day after Thursday's outage, probably to reduce traffic:

https://lapcatsoftware.com/articles/ocsp.html

[+] zamalek|5 years ago|reply
> [article] editing your /etc/hosts file. Personally, I wouldn’t suggest doing that as it prevents an important security feature from working.

Exactly the apologetics that you are talking about. Everyone has a different security-update cadence (e.g. Patch Tuesday for Microsoft), but every application launch is not a reasonable one. Given Apple's recent propensity for banning developers who stand against them (whether you agree with those developers or not), this is aimed squarely at dissent.

[+] lowendbeholder|5 years ago|reply
The loop argument makes no sense at all. HTTP is being used as a transport for a base64-encoded payload; the actual process of verifying the validity of the developer certificate is done by the service behind that Apple URL, not by the HTTP stack.

There is no justification not to switch to HTTPS here.
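For reference, the GET form of an OCSP request really is just the DER-encoded request, base64-encoded and percent-escaped onto the responder's URL (RFC 6960, Appendix A) — HTTP is only the envelope. A minimal Python sketch, with a made-up request body standing in for real DER:

```python
import base64
import urllib.parse

def ocsp_get_url(responder_url: str, der_request: bytes) -> str:
    """Build an OCSP GET URL per RFC 6960 Appendix A: the DER-encoded
    request is base64-encoded, percent-escaped, and appended to the
    responder's URL path. The transport is plain HTTP, but the payload
    identifies the certificate being checked, which is exactly why a
    passive observer can read it."""
    b64 = base64.b64encode(der_request).decode("ascii")
    return responder_url.rstrip("/") + "/" + urllib.parse.quote(b64, safe="")

# A made-up request body for illustration; a real one is DER-encoded ASN.1.
print(ocsp_get_url("http://ocsp.apple.com", b"\x30\x03\x02\x01\x00"))
# → http://ocsp.apple.com/MAMCAQA%3D
```

Nothing in this encoding depends on the transport being unencrypted; the same URL works over HTTPS.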

[+] fiddlerwoaroof|5 years ago|reply
Isn’t OCSP an open standard for handling certificate revocations? The standard specifies plaintext, because the standard can’t assume that the client has a way to form an encrypted connection to the revocation list.
[+] dwild|5 years ago|reply
> Such a "lazy" design is hard to imagine coming out of Apple otherwise.

That's my biggest issue personally. There's a bit of an information leak, but most companies wouldn't care and would just do the standard thing and be done with it. Firefox still uses OCSP in some cases...

My issue is that a company like Apple, which currently markets itself as a company that cares about the privacy of its users, let this come out of the very process that's supposed to care... and still hasn't said that it was a mistake in their process and that they are correcting it.

They could easily use k-anonymity like HaveIBeenPwned, or even push revocations out, which would mean no cache at all, which is even better for their security argument.

There's nothing alarmist here; it would just mean that this is the same false advertising that so many companies do. Still, it's important to be aware of.
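The k-anonymity range query mentioned above could be sketched like this; the hash choice mirrors HaveIBeenPwned's range API, and the developer IDs are purely illustrative, not Apple's actual scheme:

```python
import hashlib

def hash_hex(developer_id: str) -> str:
    # SHA-1 is what HaveIBeenPwned's range API uses; the choice of
    # hash here is illustrative, not Apple's actual scheme.
    return hashlib.sha1(developer_id.encode()).hexdigest().upper()

def client_query(developer_id: str, prefix_len: int = 5) -> str:
    """The client reveals only a short hash prefix to the server."""
    return hash_hex(developer_id)[:prefix_len]

def server_respond(prefix: str, revoked_ids: set) -> list:
    """The server returns the suffix of every revoked entry in the
    requested bucket; it learns only that the client's developer ID
    hashes into this bucket, alongside many others (k-anonymity)."""
    return [h[len(prefix):] for h in map(hash_hex, revoked_ids)
            if h.startswith(prefix)]

def is_revoked(developer_id: str, revoked_ids: set) -> bool:
    prefix = client_query(developer_id)
    suffixes = server_respond(prefix, revoked_ids)
    return hash_hex(developer_id)[len(prefix):] in suffixes

# Hypothetical developer IDs, purely for illustration.
revoked = {"EVIL_CORP_123"}
print(is_revoked("EVIL_CORP_123", revoked))  # True
print(is_revoked("MOZILLA_XYZ", revoked))    # False
```

The server never sees which developer's certificate the client was actually checking, only a bucket shared by many IDs.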

[+] _57jb|5 years ago|reply
Agreed.

Call home features can be spoofed by a poisoning type of attack upstream in various forms.

This is not bulletproof; it's a cop-out with a poor solution for security.

You know who has effective call home features? Vendors that sell to major enterprises. It is a natural progression and a particularly nasty environment to live within.

If they are legitimately trying to protect the brand through force or merely forcefully controlling the app ecosystem... it's an abusive relationship to be in.

The fact that this is not configurable without dead-lettering the route shows that tethering the OS to their servers is something they consider a viable security measure.

I'll pass.

[+] olliej|5 years ago|reply
I feel Apple has done privacy well in so many cases, that the way this works is really disappointing :-/
[+] feralimal|5 years ago|reply
> But Apple simply has to find a more privacy-aware system designs for this problem which does not leak this kind of data without an opt-in and also does not impact application startup times. (revocation lists?)

The idea that you need Apple to certify the developer of the software you run on your phone is nonsense though. You don't do that on your computer, so why do you need to be nannied on your phone?

[+] topranks|5 years ago|reply
Clear text is part of the OCSP mechanics. This has nothing to do with Apple or macOS.

Potentially it could now be tackled with DNSSEC + DoH similar to the records ESNI/ECH puts in the DNS to encrypt initial HTTPS client hellos.

But the loop issue is quite real. How can you validate that the certificate the OCSP server gives you has not been revoked, using OCSP?

[+] mrcybermac|5 years ago|reply
THANK YOU. I also see no reason that OCSP checks cannot support both HTTP and HTTPS. If there is some reason, then the protocol should be split into two: one for unencrypted checks for things like SSL certs, and another for all other/dev cert checks over HTTPS.
[+] znpy|5 years ago|reply
> this one is way too Apple apologetic for my taste.

I'm not surprised. Apple fanatics routinely deny evidence to support their sorta-religion.

[+] chrisoverzero|5 years ago|reply
> I don't follow the "browsers and loops" argument.

To log in to my banking account, I need the correct password. No problem, I keep it in a password manager. To open the password manager, I need the correct password. No problem, I keep it in a password manager. To open the password manager, I need the correct password. No problem, I keep it in a password manager. To open the password manager, I need the correct password. No problem, I keep it in a password manager. And so on.

Imagine that, but for “verifying the HTTPS connection”.

[+] throwawayg123|5 years ago|reply
Wait. Is it not common knowledge that Android and iOS log every application you open down to the exact millisecond you open and close them?

Is it not common knowledge how telemetry works for the operating systems? They generally batch up a bunch of logs like this, encrypt them, compress them, and then send them to the mothership (hopefully when you're on WiFi).

[+] ravenstine|5 years ago|reply
> macOS does actually send out some opaque information about the developer certificate of those apps, and that’s quite an important difference on a privacy perspective.

Yes, and no. If you're using software that the state deems to be subversive or "dangerous", a developer certificate would make the nature of the software you are running pretty clear. They don't have to know exactly which program you're running, but just enough information to put you on a list.

> You shouldn’t probably block ocsp.apple.com with Little Snitch or in your hosts file.

I never asked them to do that in the first place, so I'll be blocking it from now on.

[+] josephcsible|5 years ago|reply
> I never asked them to do that in the first place, so I'll be blocking it from now on.

Apple's working on making sure you can't block it. They already keep you from blocking their own traffic with Little Snitch and similar tools: https://news.ycombinator.com/item?id=24838816

[+] mkskm|5 years ago|reply
Privacy concerns aren’t the only reason to block it. It also makes software way more responsive. On my 2020 MacBook Air I was experiencing daily freezes that would disconnect my keyboard and mouse (particularly when waking the computer or connecting to an external display), before adding the entry to my hosts file, which fixed the issue entirely. It was so pronounced, and irreparable by Apple support technicians, that I nearly ended up getting rid of the computer.
[+] Bondi_Blue|5 years ago|reply
Besides blocking from the hosts file, you can try:

    sudo defaults write /Library/Preferences/com.apple.security.revocation.plist OCSPStyle None
    
    sudo defaults write com.apple.security.revocation.plist OCSPStyle None
[+] jgilias|5 years ago|reply
So the takeaways are:

* Your Mac periodically sends plain text information about the developer of all apps you open, which in most cases makes it trivial for anyone able to listen to your traffic to figure out what apps you open. Better not use a Mac if you're a journalist working out of an oppressive country.

* Because of this Macs can be sluggish opening random applications.

* A Mac is not a general purpose computing device anymore. It's a device meant for running Apple sanctioned applications, much like a smartphone. Which may be fine, depends on the use case.

Yeah... No Mac for me anytime soon then.

[+] musicale|5 years ago|reply
> You should be aware that macOS might transmit some opaque information about the developer certificate of the apps you run. This information is sent out in clear text on your network.

Wow, that is bad from a privacy perspective!

Since certificate revocation is rare, it makes more sense to simply periodically update a list of revoked certificates instead of repeatedly checking each certificate. That would solve the privacy issue while still allowing certificates to be revoked.

OCSP seems like a bad idea for web browsing for similar reasons.

[+] dabeeeenster|5 years ago|reply
I don't quite understand why anyone would send data in clear text anymore, let alone Apple.
[+] t0astbread|5 years ago|reply
I was initially shocked by this as well so I did some more reading on OCSP and it seems this is being addressed through OCSP stapling.

According to Wikipedia "[OCSP stapling] allows the presenter of a certificate to bear the resource cost involved in providing Online Certificate Status Protocol (OCSP) responses by appending ("stapling") a time-stamped OCSP response signed by the CA to the initial TLS handshake, eliminating the need for clients to contact the CA, with the aim of improving both security and performance."

I'm not aware of how widely deployed OCSP stapling is in reality. I looked at my Firefox settings, which seemed to be the defaults for OCSP, and they looked like this:

  security.OCSP.enabled                     1
  security.OCSP.require                     false
  security.OCSP.timeoutMilliseconds.hard    10000
  security.OCSP.timeoutMilliseconds.soft    2000
  security.ssl.enable_ocsp_must_staple      true
  security.ssl.enable_ocsp_stapling         true
So reading these defaults, OCSP stapling is enabled and direct OCSP queries are allowed (security.OCSP.enabled is 1), but a positive OCSP response is not required in general. I tried to check what was really happening with Wireshark, but regardless of the configuration and the sites I visited, I couldn't get Firefox to emit an OCSP query.

I also don't know what other TLS implementations (like OpenSSL) do and how users of such libraries usually configure them.

Addendum: Oh and of course, OCSP stapling is useless when you weren't about to open a TLS connection (like in this case when checking software signing certificates). I'm also curious if and how this works for other applications of X.509 certificates such as mutual TLS authentication.

[+] catlifeonmars|5 years ago|reply
The SLA for being made aware of revocations should be configurable from the client side. OCSP here would be fine if (a) it was sent over an encrypted connection using a preinstalled Apple root CA, and (b) the user could set the the TTL for caching the response. Larger developers (with more resources) could also feasibly implement something similar to OCSP stapling which has several desirable properties.
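The client-configurable TTL idea above can be sketched as a small cache wrapper; the fetch function here is a stand-in for a real (ideally encrypted) OCSP query, and the names are hypothetical:

```python
import time

class CachedRevocationChecker:
    """Caches revocation answers for a user-configurable TTL, so the
    trade-off between 'fresh revocations' and 'fewer phone-home
    requests' is in the user's hands. `fetch` stands in for a real
    network query: cert_id -> bool (revoked?)."""

    def __init__(self, fetch, ttl_seconds: float, clock=time.monotonic):
        self._fetch = fetch
        self._ttl = ttl_seconds
        self._clock = clock
        self._cache = {}  # cert_id -> (expiry, revoked)

    def is_revoked(self, cert_id: str) -> bool:
        now = self._clock()
        hit = self._cache.get(cert_id)
        if hit and hit[0] > now:
            return hit[1]            # fresh cached answer: no network
        revoked = self._fetch(cert_id)
        self._cache[cert_id] = (now + self._ttl, revoked)
        return revoked

# Count network calls with a fake fetcher and a fake clock.
calls = []
fake_now = [0.0]
checker = CachedRevocationChecker(
    fetch=lambda cid: calls.append(cid) or False,
    ttl_seconds=300,                 # the reported pre-outage TTL
    clock=lambda: fake_now[0],
)
checker.is_revoked("dev-cert-1")     # network hit
checker.is_revoked("dev-cert-1")     # served from cache
fake_now[0] = 301                    # TTL expired
checker.is_revoked("dev-cert-1")     # network hit again
print(len(calls))  # 2
```

Raising the TTL from 5 minutes to half a day, as Apple reportedly did, is just a change to `ttl_seconds` in this model; letting the user set it is the part Apple doesn't offer.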
[+] izacus|5 years ago|reply
When it comes to these articles, you should really apply the following "smell" test:

Replace "Apple" with "Google", "Facebook", "Verizon". Re-read the article. If it sounds horrifying, then it's also horrifying if Apple does it. There's no such thing as "trust" in a single corporation - especially one which just argued that you not paying them 30% is "theft".

Applying this test helps weed out the marketing bias these corpos constantly try to push at you.

[+] jrockway|5 years ago|reply
OCSP doesn't seem like the right protocol for this. Apple should probably just ship you a list of hashes of revoked certificates once a day, and should do the check locally. (Obviously, the global certificate database is too big to send to every user, but Apple should be able to determine the subset of certificates they trust, and the even smaller subset of those that are revoked or compromised.)

To me, it sounds like they decided to take the quick-and-easy path of reusing an existing protocol for the use case of stopping malware, but it doesn't really fit. The latency, privacy, and availability guarantees of OCSP just don't match with the requirements for "run a local application".
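The locally checked revocation list described above might look roughly like this; the fingerprinting scheme and the update mechanism are assumptions for illustration, not Apple's design:

```python
import hashlib

def cert_fingerprint(cert_der: bytes) -> str:
    """SHA-256 of the DER certificate: a stand-in for whatever
    identifier the vendor would key the list on."""
    return hashlib.sha256(cert_der).hexdigest()

class LocalRevocationList:
    """A locally stored set of revoked-certificate fingerprints,
    refreshed on whatever schedule the vendor ships updates (e.g.
    daily). Lookups never touch the network, so there is nothing
    to observe and nothing to be sluggish about at app launch."""

    def __init__(self):
        self._revoked = set()

    def update(self, shipped_fingerprints):
        # Called when a new list arrives with the periodic update.
        self._revoked = set(shipped_fingerprints)

    def is_revoked(self, cert_der: bytes) -> bool:
        return cert_fingerprint(cert_der) in self._revoked

rl = LocalRevocationList()
rl.update([cert_fingerprint(b"bad-cert-der")])  # hypothetical entry
print(rl.is_revoked(b"bad-cert-der"))   # True
print(rl.is_revoked(b"good-cert-der"))  # False
```

The trade-off is revocation latency: a revoked certificate keeps working until the next list update, which is the window the periodic OCSP check is meant to shrink.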

[+] sz4kerto|5 years ago|reply
Can someone explain to me why this is significantly less problematic than sending out app hashes? If we accept that most developers don't have many similarly popular apps, then isn't this enough to infer which apps users are running?

In the example from the article: if Mozilla's certificate is sent, then it's very likely that the app that has been opened is Firefox, as the a priori likelihood of using Firefox is way higher than that of using, e.g., Thunderbird.

If the developer is Telegram LLC, then ... and so on.
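The inference described above is just a maximum-a-priori guess over a developer's app catalog. A toy sketch, where the popularity numbers are invented purely for illustration, not real market-share data:

```python
# A toy prior over apps per developer; the probabilities are made up
# to illustrate the inference, not Apple's data or real usage shares.
APP_PRIOR = {
    "Mozilla Corporation": {"Firefox": 0.95, "Thunderbird": 0.05},
    "Telegram LLC": {"Telegram": 1.0},
}

def most_likely_app(observed_developer: str):
    """Given only the developer name seen in a cleartext OCSP check,
    return the maximum-a-priori guess for which app was launched."""
    apps = APP_PRIOR[observed_developer]
    best = max(apps, key=apps.get)
    return best, apps[best]

print(most_likely_app("Mozilla Corporation"))  # ('Firefox', 0.95)
print(most_likely_app("Telegram LLC"))         # ('Telegram', 1.0)
```

For single-app developers the "guess" is a certainty, which is the point of the comment: the developer ID is close to an app ID in practice.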

[+] readams|5 years ago|reply
It is only very very slightly less concerning than sending the app hashes. Coming to the conclusion that this is all great and fine is really absurd.
[+] banachtarski|5 years ago|reply
There will be a day when all apps on a mac will only be installable from the app store. Developers will be forced to buy macs and subscribe to Apple’s developer program to support it. Customers will be trained to not care. And HN Apple fanboys and fangirls will try to justify why this is a Good Thing(TM).
[+] cute_boi|5 years ago|reply
Clearly this article doesn't reveal the whole truth. Certificate authorities should be decentralized, but is that happening?

And just by looking at the IP address, app usage, and the other data they receive, they can connect the dots and identify that it's me. And what security has Apple provided till now?

"You shouldn’t probably block ocsp.apple.com with Little Snitch or in your hosts file."

That's far better than a frozen computer that doesn't work and doesn't run any apps. If I don't need Apple's mercy and protection, please don't force it on me.

Already installed Linux, and it's a start.

[+] bitmunk|5 years ago|reply
Yeesh, "It's not THAT bad, it ONLY leaks the developer of every app you open, via cleartext. Oh, and it cripples your offline software when someone spills coffee over Apple's servers"

This is the reason people laugh at this website.

[+] vmateixeira|5 years ago|reply
So not only Apple, but pretty much everyone, can eavesdrop on the HTTP request and find out which developers' apps I'm running?
[+] lifeisgood99|5 years ago|reply
Being able to identify the developer of any app I run on my own machine is already going too far. You have to assume all these requests are logged and available to state actors on legal demand.

I wonder how big a local revocation list would be. I would support an on-by-default local check.

[+] neolog|5 years ago|reply
So Apple sends an app-developer identifier in clear text each time you open an app? That sounds really bad.
[+] paultopia|5 years ago|reply
Has anyone used a pi-hole to block apple privileged servers, like the OCSP one, while running Big Sur? I'm thinking of setting one up---not necessarily to block OCSP, because the points in this post about actually wanting to know when a certificate has been revoked are sensible---but to at least have the option in case of another disaster...

Relatedly, does anyone know if Big Sur allows one to use a custom DNS server on the device level with those privileged destinations? (He says, mulling the complexities of getting a pi-hole working with his mesh system.)

[+] ThePhysicist|5 years ago|reply
Not sure whether the non-privacy-related aspect of OCSP is less worrying. Officially Apple does this to protect innocent users from malware, but as we've seen, it also allows them to remotely disable any developer's software. Not really something that I'd want on my machine.
[+] RonanTheGrey|5 years ago|reply
I guess a super obvious question is, why do they do this instead of having a robust antivirus ecosystem?

I mean I guess I already know the answer, "marketing". "Look, macOS doesn't require antivirus!"

Personally I don't want Apple verifying or revoking anything. I bought the computer, it's mine. You don't get to tell me what I can run, period. Inform me, sure, give me links to go learn why you don't want me to run something, sure. Don't prevent me from choosing to do with my machine what I want.

[+] judge2020|5 years ago|reply
OCSP also allows CAs to revoke random websites’ certificates, yet nobody is making a big fuss about that (presumably because no OCSP server has encountered what Apple’s did and prevented websites from opening).
[+] cute_boi|5 years ago|reply
Are there any statistics on how many innocent users have actually become victims? Clearly Apple just wants control. As the old saying goes: more truth and less trust is needed.
[+] arexxbifs|5 years ago|reply
ITT: Arguing what semantics to use when whitewashing a massive breach of trust, privacy and security with no officially solicited opt out.
[+] withinboredom|5 years ago|reply
Now just waiting for the trolls to write some software that makes the response always come back invalid. With a wee bit of ARP magic, you could make a bunch of Mac users very unhappy at the café.
[+] sneak|5 years ago|reply
“It doesn’t send a hash of the app, it sends a thing that is an encoded hash that uniquely identifies the app! Totally different!”

It wasn’t a misunderstanding, it was a simplification so that people could understand the issue without me explaining OCSP and app signing and x509 and the PKI. Dozens of people wrote me to thank me for explaining it in a way that they could understand.

It is indeed a hash, and it does indeed uniquely identify most apps, and it is indeed sent in plaintext, when you launch the app (and is cached for a half day IIRC). I very deliberately didn’t claim it is a hash of the content of the app file.

It also doesn’t send a unique identifier, but I would be willing to wager that the set of apps that you launch in 48h is probably enough to uniquely identify your machine in the vast majority of cases.
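The wager above — that the set of apps you launch in a window of time identifies your machine — can be sketched by hashing the observed set itself; the developer names are illustrative:

```python
import hashlib

def machine_fingerprint(launched_developer_ids: set) -> str:
    """Hash the canonicalized set of developers whose apps were
    launched. No explicit machine ID is needed: two observation
    windows with the same launch set produce the same fingerprint,
    and unusual app combinations are close to unique."""
    canonical = "\n".join(sorted(launched_developer_ids))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

a = machine_fingerprint({"Mozilla", "Telegram LLC", "Valve"})
b = machine_fingerprint({"Mozilla", "Telegram LLC", "Valve"})
c = machine_fingerprint({"Mozilla", "Adobe"})
print(a == b, a == c)  # True False
```

This is the same reason browser fingerprinting works without cookies: enough weakly identifying signals combine into a strongly identifying one.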

[+] kzrdude|5 years ago|reply
Your text was understood that way because of something in the words you chose, maybe "hash of the application" for example.
[+] theodric|5 years ago|reply
By default, Android logs every app you use. You have to disable - bafflingly - features including saving locations in Google Maps and fully-functional voice recognition to (supposedly) disable that behavior. What I'm saying is: don't look so surprised.
[+] GekkePrutser|5 years ago|reply
True but Apple markets itself as a privacy-first company. Google doesn't.
[+] AntiImperialist|5 years ago|reply
"By default" is key here. Apple doesn't let you change this at all, unless you do a hosts-file hack.
[+] dpacmittal|5 years ago|reply
Why compare a phone OS which is much more tightly controlled to a desktop OS?
[+] lern_too_spel|5 years ago|reply
You seem to be confusing sharing "usage and diagnostics" with enabling location history.