Except it's also trivial to buy or produce tables of pre-hashed emails, so this cloak of "oh we don't know who you are, it's a hash!" is usually just lip service.
They're not literally passing around the hash. Holders of hash(email) <=> browser cookie associations are heavily incentivized, for both regulatory and competitive reasons, not to blast that information around the internet -- or even to let direct partners A & B identify overlaps without the holder being in the middle.
When passing identifiers, there's generally some combination of lookup tables, per-distribution salted hashes, or encryption happening to make reverse mapping as difficult as possible.
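A minimal sketch of the per-distribution salting idea described above. The function name, the salt handling, and the choice of SHA-256 are illustrative assumptions, not a description of any particular vendor's scheme:

```python
import hashlib
import secrets

def salted_id(identifier: str, salt: bytes) -> str:
    """Re-hash an identifier under a distribution-specific salt."""
    return hashlib.sha256(salt + identifier.encode()).hexdigest()

# The raw email hash that both partners would otherwise receive.
raw = hashlib.sha256(b"user@example.com").hexdigest()

salt_a = secrets.token_bytes(16)  # salt used only for partner A's feed
salt_b = secrets.token_bytes(16)  # salt used only for partner B's feed

id_for_a = salted_id(raw, salt_a)
id_for_b = salted_id(raw, salt_b)

# Same underlying user, but A and B hold different opaque identifiers,
# so they cannot compute their audience overlap without the middleman.
print(id_for_a != id_for_b)
```

Because each recipient gets identifiers derived under a different salt, joining two feeds on the identifier alone fails; only the party holding the salts can map them back together.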
This is one of the things that drives me nuts when hardcore privacy advocates start wading into browser feature discussions and complaining about things being used to fingerprint users.
I mean, can eye-tracking in a WebXR session be used to identify users? Yes, clearly that is a possibility. But will the addition of eye-tracking increase the identifiability of users? No, not in the least, because users are already identifiable by means that involve core browser features.
But frequently, the "privacy advocates" win and we're left with a web platform that has a lot of weird, missing functionality in comparison to native apps, pushing developers to either compromise on functionality or develop a native app. Compromising is bad for users. And developing a native app can be bad for the developer, if one considers their existing investment in web technologies. Or bad for both the developer and users, when one considers the vig that app stores charge, or the editorial control that app stores enforce over socially-controversial-yet-not-actually-illegal topics. Or just for users, when one considers that the app stores hand app developers a user identity without even making them work for it with fingerprinting.
And often, the voices that are loudest in defence of "privacy" are browser developers that also just so happen to be employed by said app store vendors.
I think the idea is that you can trivially generate the MD5 hash of all, say, 8-letter @gmail.com addresses, and since the email hashes used for targeting don’t have a salt, it’s a one-time “expense” to build the reverse lookup table.
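A toy version of that one-time precomputation. The address space here is deliberately tiny (2-letter local parts) so it runs instantly; the candidate set and address format are assumptions for illustration:

```python
import hashlib
import itertools
import string

def build_reverse_lookup(candidates):
    """Precompute MD5(email) -> email for every candidate address."""
    return {hashlib.md5(e.encode()).hexdigest(): e for e in candidates}

# Toy space: all 2-letter local parts (26^2 = 676 addresses). The same
# one-time precomputation scales to larger spaces or leaked address lists.
candidates = (
    "".join(cs) + "@gmail.com"
    for cs in itertools.product(string.ascii_lowercase, repeat=2)
)
table = build_reverse_lookup(candidates)

# An unsalted hash seen in a targeting feed now reverses with a dict lookup.
target = hashlib.md5(b"hn@gmail.com").hexdigest()
print(table[target])  # -> hn@gmail.com
```

Since there's no salt, the table is built once and reverses every hash from that candidate space forever; salting per recipient is precisely what breaks this amortization.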
jgraettinger1|2 years ago
(I was in this space up until a few years ago).
ethbr1|2 years ago
CaveTech|2 years ago
oooyay|2 years ago
moron4hire|2 years ago
hiatus|2 years ago
Wouldn't this require knowledge of the email beforehand?
mgillett54|2 years ago