top | item 39632668


r2b2 | 2 years ago

To create private shareable links, store the private part in the hash of the URL. The hash is not transmitted in DNS queries or HTTP requests.

Ex. When links.com?token=<secret> is visited, that link will be transmitted and potentially saved (query parameters included) by intermediaries like Cloudflare.

Ex. When links.com#<secret> is visited, the hash portion will not leave the browser.

Note: It's often nice to work with data in the hash portion by encoding it as a URL-safe Base64 string (i.e. JS Object ↔ JSON String ↔ URL-Safe Base64 String).
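A minimal sketch of that encoding pipeline, assuming the browser globals btoa/atob (the escape/unescape dance handles non-Latin-1 characters, since btoa only accepts Latin-1):

```javascript
// Round-trip a JS object through the URL fragment as URL-safe Base64
// (base64url: "+" → "-", "/" → "_", padding stripped).

function encodeFragment(obj) {
  const json = JSON.stringify(obj);
  // encodeURIComponent + unescape converts the string to Latin-1 for btoa
  const b64 = btoa(unescape(encodeURIComponent(json)));
  return b64.replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

function decodeFragment(str) {
  let b64 = str.replace(/-/g, "+").replace(/_/g, "/");
  while (b64.length % 4) b64 += "="; // restore padding
  const json = decodeURIComponent(escape(atob(b64)));
  return JSON.parse(json);
}

// In the browser: location.hash = "#" + encodeFragment({ token: "..." });
```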


jmholla|2 years ago

> Ex. When links.com?token=<secret> is visited, that link will be transmitted and potentially saved (search parameters included) by intermediaries like Cloud Flare.

Note: When over HTTPS, the parameter string (and path) is encrypted so the intermediaries in question need to be able to decrypt your traffic to read that secret.

Everything else is right. Just wanted to provide some nuance.

r2b2|2 years ago

Good to point out. This distinction is especially important to keep in mind when thinking about when and where TLS/SSL is terminated for your service, and any relevant threat models the service might have for the portion of the HTTP request after termination.

mschuster91|2 years ago

Cloudflare, Akamai, AWS Cloudfront are all legitimate intermediaries.

phyzome|2 years ago

Huge qualifier: Even otherwise benign Javascript running on that page can pass the fragment anywhere on the internet. Putting stuff in the fragment helps, but it's not perfect. And I don't just mean this in an ideal sense -- I've actually seen private tokens leak from the fragment this way multiple times.
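One mitigation for the leak described above is to consume the fragment once and then scrub it from the URL, so scripts loaded later (and the history entry) never see it. A sketch of the scrubbing helper, using the standard URL API:

```javascript
// Extract the fragment secret and produce a scrubbed URL to restore
// via history.replaceState, so the secret doesn't linger in location.href.
function consumeFragmentSecret(url) {
  const u = new URL(url);
  const secret = u.hash.slice(1); // drop the leading "#"
  u.hash = "";                    // setting to "" removes the fragment entirely
  return { secret, scrubbedUrl: u.href };
}

// In the browser:
//   const { secret, scrubbedUrl } = consumeFragmentSecret(location.href);
//   history.replaceState(null, "", scrubbedUrl);
```

This doesn't help against scripts that run before the scrub, which is part of phyzome's point: the fragment narrows the exposure window, it doesn't close it.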

eadmund|2 years ago

Which is yet another reason to disable Javascript by default: it can see everything on the page, and do anything with it, including sending everything to some random server somewhere.

I am not completely opposed to scripting web pages (it’s a useful capability), but the vast majority of web pages are just styled text and images: Javascript adds nothing but vulnerability.

It would be awesome if something like HTMX were baked into browsers, and if enabling Javascript were something a user would have to do manually when visiting a page — just like Flash and Java applets back in the day.

andix|2 years ago

Is there a feature of DNS I'm unaware of, that queries more than just the domain part? https://example.com?token=<secret> should only lead to a DNS query with "example.com".

erikerikson|2 years ago

The problem isn't DNS in GP. DNS will happily supply the IP address for a CDN. The HTTP[S] request will thereafter be sent by the caller to the CDN (in the case of CloudFlare, Akamai, etc.) where it will be handled and potentially logged before the result is retrieved from the cache or the configured origin (i.e. backing server).

r2b2|2 years ago

Correct, DNS only queries the hostname portion of the URL.

Maybe my attempt to be thorough – by making note of DNS along side HTTP since it's part of the browser ↔ network ↔ server request diagram – was too thorough.

klabb3|2 years ago

Thanks, finally some thoughts about how to solve the issue. In particular, email based login/account reset is the main important use case I can think of.

Do bots that follow links in emails (for whatever reason) execute JS? Is there a risk they activate the thing with a JS induced POST?

r2b2|2 years ago

To somewhat mitigate the link-loading bot issue, the link can land on a "confirm sign in" page with a button the user must click to trigger the POST request that completes authentication.

Another way to mitigate this issue is to store a secret in the browser that initiated the link request (Ex. local storage). However, this can easily break in situations like private mode, where a new tab/window is opened without access to the same stored state.
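A sketch of that in-browser-secret scheme, with a plain Map standing in for window.localStorage (function names are illustrative, not a real API):

```javascript
// When the user requests a sign-in link, stash a nonce locally and send it
// to the server to store alongside the link token.
function requestLink(storage) {
  const nonce = Math.random().toString(36).slice(2); // use crypto.randomUUID() in practice
  storage.set("signin-nonce", nonce); // localStorage.setItem in a browser
  return nonce;
}

// When the link is opened, only complete sign-in if this browser holds the
// nonce the server expects. A bot (or a different browser, or private mode)
// won't have it — which is both the mitigation and the pitfall noted above.
function canCompleteSignin(storage, expectedNonce) {
  return storage.get("signin-nonce") === expectedNonce;
}
```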

An alternative to the in-browser-secret, is doing a browser fingerprint match. If the browser that opens the link doesn't match the fingerprint of the browser that requested the link, then fail authentication. This also has pitfalls.

Unfortunately, if your threat model requires blocking bots that click too, you're likely stuck adding some semblance of a second factor (PIN/password, biometric, hardware key, etc.).

In any case, when using link-only authentication, it's best to at least put sensitive user operations (payments, PII, etc.) behind a second factor at the time of operation.

369548684892826|2 years ago

Yes, I've seen this bot JS problem, it does happen.

nightpool|2 years ago

The secret is still stored in the browser's history DB in this case, which may be unencrypted (I believe it is for Chrome on Windows, last I checked). The cookie DB, on the other hand, is I think always encrypted using the OS's TPM, so it's harder for malicious programs to crack.

r2b2|2 years ago

Yes, adding max-use counts and expiration dates to links can mitigate against some browser-history snooping. However, if your browser history is compromised you probably have an even bigger problem...
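A sketch of the max-use/expiry idea, with an in-memory store for illustration (a real service would back this with a database and constant-time token comparison):

```javascript
// Single-use, expiring link tokens. issue() records a token; redeem()
// consumes one use and refuses expired or exhausted tokens.
class LinkTokenStore {
  constructor() {
    this.tokens = new Map();
  }
  issue(token, { maxUses = 1, ttlMs = 15 * 60 * 1000 } = {}) {
    this.tokens.set(token, { usesLeft: maxUses, expiresAt: Date.now() + ttlMs });
  }
  redeem(token, now = Date.now()) {
    const rec = this.tokens.get(token);
    if (!rec || rec.usesLeft <= 0 || now > rec.expiresAt) return false;
    rec.usesLeft -= 1;
    if (rec.usesLeft === 0) this.tokens.delete(token); // gone after last use
    return true;
  }
}
```

With maxUses = 1, a token lifted from browser history later is already dead the moment the legitimate sign-in consumed it.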

eterm|2 years ago

If it doesn't leave the browser, how would the server know to serve the private content?

jadengeller|2 years ago

Client web app makes a POST request. The secret leaves the browser, just not in the URL.
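A sketch of that flow, assuming a hypothetical /api/auth endpoint: the app reads the secret from the fragment and submits it in the POST body, so the server learns it without it ever appearing in a loggable URL.

```javascript
// Build the authentication request: secret comes from the URL fragment,
// travels in the JSON body. (Endpoint name is made up for illustration.)
function buildAuthRequest(pageUrl, endpoint = "https://links.example/api/auth") {
  const secret = new URL(pageUrl).hash.slice(1);
  return new Request(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ secret }), // in the body, not the URL
  });
}

// In the browser: fetch(buildAuthRequest(location.href))
```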