
Downgrade User Agent Client Hints to 'harmful'

150 points| ronancremin | 4 years ago |github.com

111 comments


Ajedi32|4 years ago

> Moving stuff around (from User-Agent to Sec-CH-UA-*) doesn't really solve much. That is, having to request this information before getting it doesn't help if sites routinely request all of it.

I think this is sort of ignoring the whole point of the proposal. By making sites request this information rather than simply always sending it like the User-Agent header currently does, browsers gain the ability to deny excessively intrusive requests when they occur.

That is to say, "sites routinely request all of it" is precisely the problem this proposal is intended to solve.

There are some good points in this post about things which could be improved in specific Sec-CH-UA headers, but the overall position seems to be based on a misunderstanding of the purpose of client hints.
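For context, the request-based flow the proposal describes looks roughly like this (header values are illustrative; in Chromium only the low-entropy hints are sent by default, and higher-entropy ones must be opted into via Accept-CH):

```http
GET / HTTP/1.1
Host: example.com
Sec-CH-UA: "Chromium";v="91"
Sec-CH-UA-Mobile: ?0

HTTP/1.1 200 OK
Accept-CH: Sec-CH-UA-Model, Sec-CH-UA-Platform-Version

GET /styles.css HTTP/1.1
Host: example.com
Sec-CH-UA: "Chromium";v="91"
Sec-CH-UA-Mobile: ?0
Sec-CH-UA-Model: "Pixel 5"
Sec-CH-UA-Platform-Version: "11"
```

A browser that considers Sec-CH-UA-Model excessively intrusive can simply decline to send it on the follow-up request.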

Svip|4 years ago

> browsers gain the ability to deny excessively intrusive requests when they occur

But Set-Cookie kind of proves what happens to that kind of feature. If sites first get used to being able to request it and get it, then browsers that deny anything will simply be ignored. And then those browsers will start providing everything, because they don't want to be left out in the cold.

That's what happened to User-Agent, that's what happened to Set-Cookie, and I can't see why it won't happen to Sec-CH-UA-*, which the post hints at several times. Set-Cookie was supposed to have the browser ask the user to confirm whether they wanted to set a cookie. Not many clients do that today.

To be honest, I feel the proposal is a bit naïve if it thinks that websites and all browsers will suddenly be on their best behaviour.

jefftk|4 years ago

Yes, I wish they would engage with how this fits into the rest of the Privacy Sandbox proposal (https://www.chromium.org/Home/chromium-privacy/privacy-sandb...). My understanding is it's:

1. Move entropy from "you get it by default" to "you have to ask for it".

2. Add new APIs that allow you to do things that previously exposed a lot of entropy in a more private way.

3. Add a budget for the total amount of entropy a site is allowed to get for a user, preventing identifying users across sites through fingerprinting.

Client hints are part of step #1. Not especially useful on its own, but once combined with #3, sites have a strong incentive to reduce what they ask for to just what they need.

(Disclosure: I work on ads at Google, speaking only for myself)
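A hypothetical sketch of how steps 1 and 3 could combine (the per-hint bit costs and the budget are invented for illustration; no such API has shipped):

```javascript
// Invented per-hint "identifying bits" costs -- not real figures.
const HINT_COST_BITS = {
  'sec-ch-ua': 2,
  'sec-ch-ua-mobile': 1,
  'sec-ch-ua-model': 10,
  'sec-ch-ua-platform-version': 5,
};

// Grant requested hints in order until the site's entropy budget would
// be exceeded; deny (skip) anything that doesn't fit.
function grantHints(requested, budgetBits) {
  const granted = [];
  let spent = 0;
  for (const hint of requested.map((h) => h.toLowerCase())) {
    const cost = HINT_COST_BITS[hint] ?? Infinity; // unknown hints: never granted
    if (spent + cost > budgetBits) continue; // would blow the budget -> deny
    spent += cost;
    granted.push(hint);
  }
  return { granted, spent };
}
```

Under a model like this, a site that asks for Sec-CH-UA-Model "just in case" burns budget it might want for something it actually needs.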

marcosdumay|4 years ago

Well, if browsers can just deny those requests, then they can just drop the information entirely. (And they are dropping it from the UA string.)

Of the two non-harmful pieces, one is of interest to all sites, and the other has a broken implementation in Chrome, so sites will have to use an alternative mechanism anyway. If there's any value in the idea, Google can propose it with a set of information that brings value, instead of just fingerprinting people.

jsbdk|4 years ago

>By making sites request this information rather than simply always sending it like the User-Agent header currently does, browsers gain the ability to deny excessively intrusive requests when they occur.

Browsers can just not send a UA header

grishka|4 years ago

Having to request it is a terrible idea to begin with. If I want to use different templates for mobile vs desktop, I need to know, on the backend, whether the device is a mobile device, and I need it on the very first request. Having to request these headers explicitly is an unnecessary complication that would slow down the first load.

However it is nice that there's now a separate header that gives a yes or no answer on whether it's a mobile device.
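That yes/no header is Sec-CH-UA-Mobile, a structured-field boolean that Chromium sends by default on the first request, so no extra round trip is needed for it. A minimal server-side sketch (Node-style lower-cased header keys assumed; the regex fallback is a common heuristic, not part of the spec):

```javascript
// Pick a template from the Sec-CH-UA-Mobile hint: "?1" = mobile, "?0" = not.
function pickTemplate(headers) {
  const hint = headers['sec-ch-ua-mobile'];
  if (hint === '?1') return 'mobile';
  if (hint === '?0') return 'desktop';
  // Hint absent (browser doesn't support client hints): fall back to
  // old-fashioned UA sniffing.
  const ua = headers['user-agent'] ?? '';
  return /Mobi/i.test(ua) ? 'mobile' : 'desktop';
}
```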

1vuio0pswjnm7|4 years ago

"By making sites request this information rather than simply sending it like the User-Agent header currently does..."

This is also true with respect to SNI which leaks the domain name in clear text on the wire. The popular browsers send it even when it is not required.

The forward proxy configuration I wrote distinguishes the sites (CDNs) that actually need SNI, and only sends it when required. The majority of websites submitted to HN do not need it. I also require TLSv1.3 and strip out unnecessary headers. It all works flawlessly, with very few exceptions.

We could argue that sending so much unnecessary information, as popular browsers do when it is not technically necessary, is user hostile. It is one-sided: "tech" companies and others interested in online advertising have been using this data to their advantage for decades.

csmpltn|4 years ago

> "User Agents MUST return the empty string for model if mobileness is false. User Agents MUST return the empty string for model even if mobileness is true, except on platforms where the model is typically exposed." (quoted from https://wicg.github.io/ua-client-hints/#user-agent-model)

Honestly now - who drafts and approves these specs? Not only does it make no sense whatsoever to encode such information this way - it also results in unimaginable amounts of bandwidth going to complete waste, on a planetary scale.

This is just plain incompetence. How did we let the technology powering the web devolve into this burning pile of nonsense?

dmitriid|4 years ago

Drafts: Google

Approves: no one.

Chrome just releases them in stable versions with little to no discussion, and the actual specs remain in draft stages.

Edit: grammar

joshuamorton|4 years ago

Why/how does this waste bandwidth? These are opt-in, so they are only sent if requested.

I mean sure http being plaintext is silly but that's not down to the authors of this particular rfc.

bzbarsky|4 years ago

The bar for creating a wicg draft is _very_ low. Things in that space are not "specs" that are "approved" in any way.

theandrewbailey|4 years ago

I would rather have all this information (along with whatever is being inferred from them) be exposed through a Javascript API instead of having browsers indiscriminately flood global networks with potential PII.

Chrome came up with this? Figures. Stay evil, Google.
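For reference, a rough sketch of what that JS-API route looks like via the proposed NavigatorUAData interface (the browser's `navigator` is passed in as a parameter so the shape is visible outside a browser):

```javascript
// Read the low-entropy UA data exposed on navigator.userAgentData.
function readUAData(nav) {
  if (!nav.userAgentData) return null; // e.g. Firefox/Safari at the time
  const { brands, mobile, platform } = nav.userAgentData;
  // High-entropy fields (model, platformVersion, ...) are async and must be
  // requested explicitly via nav.userAgentData.getHighEntropyValues([...]),
  // which is what makes the calls auditable.
  return { brands, mobile, platform };
}
```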

esprehn|4 years ago

Can you explain the attack vector where encrypted HTTPS network traffic is vulnerable but a JS API isn't?

daveoc64|4 years ago

A JavaScript API has been considered as a replacement for the user agent string, but it has two big downsides:

1) JavaScript must be enabled. If it's not, then the server can't get any of the user agent data - at all.

2) The server won't get the user agent data until after it has already responded to the first request it receives from a client. That makes it a lot less useful overall. Having to load a page, then perhaps redirect the user using JS based on what the JS API says is a bit untidy.
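For what it's worth, downside 2 is also what the companion "Client Hint Reliability" proposal tries to address for the header variant: a Critical-CH response header tells the browser that if a listed hint was missing from the request, it should retry with the hint included, at the cost of an extra round trip on first contact (illustrative exchange):

```http
GET / HTTP/1.1
Host: example.com

HTTP/1.1 200 OK
Accept-CH: Sec-CH-UA-Mobile
Critical-CH: Sec-CH-UA-Mobile

GET / HTTP/1.1
Host: example.com
Sec-CH-UA-Mobile: ?1
```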

admax88q|4 years ago

Serving different content for the same URI based upon various metadata fields in the request goes completely against the spirit of a URI.

hypertele-Xii|4 years ago

No it doesn't? Ever heard of the Accept or Accept-Language headers? Or cookies, for that matter? Dynamic content?

ocdtrekkie|4 years ago

This is unfortunately the world of web apps, where a URI just gets you to the app, and the content within is dynamic.

justshowpost|4 years ago

> UA Client Hints proposes that information derived from the User Agent header field could only be sent to servers that specifically request that information, specifically to reduce the number of parties that can passively fingerprint users using that information. We find the addition of new information about the UA, OS, and device to be harmful, as it increases the information provided to sites for fingerprinting, without a commensurate improvement in functionality or accountability to justify it. In addition to not including this information, we would prefer freezing the User Agent string and only providing limited information via the proposed NavigatorUAData interface JS APIs. This would also allow us to audit the callers. At this time, freezing the User Agent string without any client hints (which is not this proposal) seems worth prototyping. We look forward to learning from other vendors who implement the "GREASE-like UA Strings" proposal and its effects on site compatibility.

https://mozilla.github.io/standards-positions/#ua-client-hin...

jrochkind1|4 years ago

I'm late to the ballgame, but what does "Sec-" mean as an HTTP header prefix anyway? I am failing at googling it.

banana_giraffe|4 years ago

It means the browser is in control of the header, and not some script. From https://datatracker.ietf.org/doc/html/rfc8942 :

   Authors of new Client Hints are advised to carefully consider whether
   they need to be able to be added by client-side content (e.g.,
   scripts) or whether the Client Hints need to be exclusively set by
   the user agent.  In the latter case, the Sec- prefix on the header
   field name has the effect of preventing scripts and other application
   content from setting them in user agents.  Using the "Sec-" prefix
   signals to servers that the user agent -- and not application content
   -- generated the values.  See [FETCH] for more information.
As near as I can tell, the bit they're talking about in the Fetch standard is just this:

    These are forbidden so the user agent remains in full control over them. 
    Names starting with `Sec-` are reserved to allow new headers to be minted 
    that are safe from APIs using fetch that allow control over headers by 
    developers, such as XMLHttpRequest.
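A simplified sketch of the rule that quote describes (the real Fetch forbidden-header-name list contains many more entries than shown here):

```javascript
// Header names script is never allowed to set via fetch()/XMLHttpRequest.
// "Sec-" and "Proxy-" are prefix rules; the Fetch standard also forbids a
// long list of specific names (Host, Cookie, ...) omitted in this sketch.
function isForbiddenHeaderName(name) {
  const n = name.toLowerCase();
  return n.startsWith('sec-') || n.startsWith('proxy-');
}
```

Because script-set values for these names are silently dropped, a server that receives Sec-CH-UA-Model knows the browser itself produced the value.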

daveoc64|4 years ago

I hope they avoid situations like the SameSite=None debacle[0] if they are going to freeze the User Agent header and not provide an alternative.

The assertion of Mozilla seems to be:

>At the time sites deploy a workaround, they can’t necessarily know what future browser version won’t have the need for the workaround. Can we guarantee only retrospective use? Do Web developers care enough about retrospective workarounds for evergreen browsers?

When there are significant numbers of users on devices like iPads that don't get updated any more, you can't rely on "evergreen browsers".

[0] - https://www.chromium.org/updates/same-site/incompatible-clie...

fnord77|4 years ago

> Sec-CH-UA-Model provides a lot of identifying bits on Android and leads...

intentional?

mort96|4 years ago

Is there a typo or a pun or something I'm not seeing?

Knowing the exact make and model of an Android device is a lot higher entropy than knowing the exact make and model of an iPhone.
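A back-of-envelope way to see it: entropy in bits is log2 of the number of possible values. The device counts below are rough assumptions for illustration, not measured figures:

```javascript
// Identifying bits contributed by a model string with `modelCount`
// equally likely possible values (a simplifying assumption).
const bits = (modelCount) => Math.log2(modelCount);

// Assuming ~20,000 distinct Android models vs ~40 iPhone models:
// bits(20000) ≈ 14.3, bits(40) ≈ 5.3 -- roughly 9 extra identifying bits.
```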

dmitriid|4 years ago

> I'm not sure why you used such an old Chrome version to test this.

That quote from the first comment on the issue is just a cherry on top.

Chrome 88 was released in December 2020. 7 months ago.

ThePadawan|4 years ago

I'm going to cut them some slack since December 2020 feels both 2 weeks and 4 years ago.

oefrha|4 years ago

Because when you’re implementing a new spec that is still in “draft” status and constantly being updated, things could have changed drastically in 7 months and 4 major versions?